Thursday, November 15, 2012

Stone-Tipped Weapons: Older than We Thought

Stone-tipped spears have been around for at least 500,000 years, according to new research. That is about 200,000 years earlier than previously thought.

Why is that important? In part because it suggests that modern humans did not invent this technology. They did not get it from the Neandertals, nor did Neandertals get it from modern humans. Instead, it now seems that Neandertals and modern humans both used stone-tipped spears because both inherited this technology from an earlier form of human life.

It is generally believed that Neandertals and modern humans diverged about 500,000 years ago. The current view is that both came from earlier humans known as Homo heidelbergensis.

"Rather than being invented twice, or by one group learning from the other, stone-tipped spear technology was in place much earlier," according to Benjamin Schoville, who coauthored the study and is affiliated with the Institute of Human Origins at Arizona State University. "Although both Neandertals and humans used stone-tipped spears, this is the first evidence that this technology originated prior to or near the divergence of these two species," Schoville said according to a press release from his university.

Caption: A ~500,000-year-old point from Kathu Pan 1. Multiple lines of evidence indicate that points from Kathu Pan 1 were used as hafted spear tips. Scale bar = 1 cm. Credit: Jayne Wilkins. Usage Restrictions: Image may be used to illustrate coverage of this research only.

"This changes the way we think about early human adaptations and capacities before the origin of our own species," said Jayne Wilkins, a lead author from the University of Toronto. Technological advance—in this case stone-tipped spears—is now seen as more widely shared among the various forms of humanity and not so confined to anatomically modern humans like us. Creating stone-tipped spears requires more forethought and care than simpler stone tools, especially in preparing the tips for mounting to the wooden shaft of the spear. This process is called “hafting,” and the result is that a more efficient hunting weapon is created.

In this study, researchers re-examined stone points discovered more than thirty years ago. By comparing the damage to the spear tips with simulated damage re-created under laboratory conditions, researchers found evidence that strongly supports the view that the original tips were used for spears.

"When points are used as spear tips, there is a lot of damage that forms at the tip of the point, and large distinctive fractures form. The damage on these ancient stone spear points is remarkably similar to those produced with our calibrated crossbow experiment, and we demonstrate they are not easily created from other processes," said coauthor Kyle Brown, a skilled stone tool replicator from the University of Cape Town.

Brown, along with others who worked on the current paper, also collaborated on a study just released describing further refinements in stone weaponry that occurred about 70,000 years ago and probably gave modern humans an advantage over Neandertals. For more on that, see Better Technology, Better Weapons.

The most recent findings that push the date of stone-tipped spears back to 500,000 years ago are published as "Evidence for Early Hafted Hunting Technology" in the November 16, 2012 issue of Science.

Wednesday, November 7, 2012

A Living, Breathing Lung-on-a-Chip

Human cells can be grown outside the human body. In a petri dish, they may develop in ways that resemble the cells inside the body. But their function and activity are limited. For example, in a dish, lung cells are just lung cells. They don’t breathe.

Using new technology, however, researchers have put lung cells on a chip. The cells on a chip have suddenly become a lung-on-a-chip, active, moving, and breathing.

In a paper published in the November 7 issue of Science Translational Medicine, researchers report on their use of recently-developed organ-on-a-chip technology. They describe how they built and used "a biomimetic microdevice that reconstitutes organ-level lung functions to create a human disease model-on-a-chip."

Caption: Wyss Institute's human breathing lung-on-a-chip. Credit: Wyss Institute, Harvard University. Usage Restrictions: None.

Already the device has led to two discoveries directly applicable to pulmonary edema, a lung disease that is a major concern for some cancer patients. First, development of the disease is accelerated by the physical movement of the lungs. This is "something that clinicians and scientists never suspected before," according to Donald Ingber, senior author of the study.

Second, researchers identified one drug, currently under development, that might help prevent the problem. For Ingber, this is the main attraction of organ-on-a-chip technology. "This on-chip model of human pulmonary edema can be used to identify new potential therapeutic agents in vitro," Ingber says.

This could accelerate drug development and testing while reducing the cost. The main advantage is that an organ-on-a-chip gives researchers the opportunity to test a wide array of potential drug compounds. Tests can be run not just on nonhuman animals or on cultured human cells but on functioning small-scale models of human organs.

Beyond its value in pharmaceutical research, it is not clear where this research may lead, but it is one more way in which the boundary we once drew between the living and the nonliving is being erased, along with the line between the natural and the artificial.

The work was funded by the National Institutes of Health (NIH), the Food and Drug Administration (FDA), the Defense Advanced Research Projects Agency (DARPA), and the Wyss Institute for Biologically Inspired Engineering at Harvard University. The paper is entitled "A Human Disease Model of Drug Toxicity–Induced Pulmonary Edema in a Lung-on-a-Chip Microdevice" and appears in the November 7, 2012 issue of Science Translational Medicine.

Better Technology, Better Weapons

Ongoing archeological discoveries from coastal South Africa point consistently to a technological and cultural explosion occurring there more than 70,000 years ago. The latest paper, appearing in the November 7 issue of the journal Nature, fills in more detail about remarkable advances in stone tool technology that only appear in Europe some 50,000 years later.

The new findings, reported by an international team of researchers led by Curtis Marean, help fill in a more comprehensive picture of the culture that flourished on the coast of South Africa for thousands of years. In 2009, Marean's team published a report showing how the controlled use of fire played a key role in the engineering of stone tools. The 2012 paper provides evidence that this technology was used for at least 11,000 years by the inhabitants of the coast.

"Eleven thousand years of continuity is, in reality, an almost unimaginable time span for people to consistently make tools the same way," said Marean. "This is certainly not a flickering pattern."

Caption: These microlith blades show a flat edge with a rounded "cutting" edge. Credit: Simen Oestmo. Used by permission of the ASU Institute of Human Origins for the purposes of illustrating coverage of the accompanying article.

One possibility suggested by this research is that the 70,000-year-old technology found in South Africa was brought out of Africa by modern humans. If so, it may help explain why Neandertals disappeared as modern humans entered Europe and Asia. Advances in technology made it possible to create lightweight points for spears or arrows, most likely used for small spears launched by spear-throwing devices known as atlatls, which effectively extend the length of the throwing arm.

"When Africans left Africa and entered Neanderthal territory they had projectiles with greater killing reach, and these early moderns probably also had higher levels of pro-social (hyper-cooperative) behavior. These two traits were a knockout punch. Combine them, as modern humans did and still do, and no prey or competitor is safe," said Marean. "This probably laid the foundation for the expansion out of Africa of modern humans and the extinction of many prey as well as our sister species such as Neanderthals."

If there is any truth to this conjecture, it is a sobering truth. This technological advance makes it easier to kill.

The new paper reports on findings at the Pinnacle Point excavation site, a mere 50 miles from Blombos Cave, home to similar findings and to the first "chemical engineering laboratory" for the production of the pigment ochre. Whoever lived there was technologically and culturally advanced, with all the ambiguities that implies.

The paper, "An Early and Enduring Advanced Technology Originating 71,000 Years Ago in South Africa," appears in the November 7 issue of the journal Nature.

Enhancement at Work: A New Report

A new report on human enhancement and its growing impact on the workplace has just been released by top-level British science and policy groups. The Royal Society, the Academy of Medical Sciences, the British Academy, and the Royal Academy of Engineering collaborated on a research project throughout 2012, resulting in the November 2012 report.

Image from the cover of Human Enhancement and the Future of Work.

Among the conclusions: "Advances in a range of areas in science and engineering such as neuroscience, regenerative medicine and bionics are already enhancing, or could in the next decade enhance, the physical and cognitive capacity of individuals in the workplace." The report is entitled Human Enhancement and the Future of Work.

Even the advocates of human enhancement find something uniquely troubling about the prospect of enhancement technologies in the workplace. Will employers coerce their workers? Will use of enhancement technology be a non-negotiable prerequisite for success in an increasingly competitive work environment? Will employees have full access to information about potential side-effects?

The report notes the following: "Cognitive-enhancing drugs present the greatest immediate challenge...They are already available without prescription through internet purchasing, are relatively cheap and are increasingly being used by healthy individuals."

In response to these challenges, the report does not recommend sanctions or bans, but it does press the case urgently for widening the dialogue and reforming policies and regulations.

Thursday, November 1, 2012

Human Germline Modification: A Step Closer?

Human germline modification—often described as "designer babies"—has come a step closer. It has been shown that in nonhuman primates, it is possible to transplant specialized cells that produce sperm. When combined with other steps, this may make germline modification feasible and safe for human use.

The new research involves nonhuman primates. Its purpose is to set the stage for clinical trials in human beings. The goal for using this technique in human beings is to overcome infertility, especially for cancer survivors who were treated with radiation or chemotherapy. In men, that treatment may destroy the ability to produce sperm. If the cancer treatment occurs after puberty, sperm can be stored in advance. But if the treatment occurs before a young boy's body produces sperm, permanent infertility may result.

"Men can bank sperm before they have cancer treatment if they hope to have biological children later in their lives," according to University of Pittsburgh researcher Kyle Orwig, lead researcher. "But that is not an option for young boys who haven't gone through puberty, can't provide a sperm sample, and are many years away from thinking about having babies," Orwig said according to a press release from the university.

Photo by Bertrand Devouard, 2006, available at Wikimedia

No medical solution is now available, but the report published today opens the possibility that in the future, young male cancer survivors will be transplanted with cells that can restore their ability to produce sperm and to become fathers. To be clear: Orwig's group did not work with human subjects. But by showing that the technique works in rhesus monkeys, they help make the case that it could work in humans and should be tried.

"This is the first study to demonstrate that transplanted spermatogonial stem cells can produce functional sperm in higher primates," Orwig said. "This is an important step toward human translation." The study is published in the November 2012 issue of the journal, Cell Stem Cell.

The cells that were transplanted into the rhesus monkeys are called "spermatogonial stem cells" or SSCs. Researchers used frozen or cryopreserved SSCs.

In the future, one possibility is that SSCs might be produced from stem cells, such as induced pluripotent stem cells. In addition, the SSCs might be genetically modified before they are transplanted. In nonhuman animals, this would provide a new way to create transgenic animals for research.

Another possibility is that this technique, if used to restore fertility to men who cannot produce sperm, might also be used for human germline modification. In a 2006 article, Hiroshi Kubota and Ralph L. Brinster (a pioneer in developing this technique) suggested that SSC transplantation may be used for precisely this purpose. "Another potential clinical application using human SSCs is GERMLINE GENE THERAPY" (Capital letters in original). They suggest that "germline gene therapy using SSCs will become a promising and feasible approach, although considerable ethical concerns exist."

What makes all this especially interesting is that by transplanting SSCs, researchers may make it possible for fertility to be restored without the use of in vitro fertilization. The Orwig paper suggests this quite clearly: SSC transplantation may be capable of "enabling the recipient male to father his own genetic children, possibly through normal coitus." If the SSCs are genetically modified first, we would have germline modification without IVF.

When human germline modification is suggested, many find the idea frightening. It is generally assumed that religious people will be universally opposed. That is not true, not even among Catholics.

What the official Catholic position opposes is the destruction of human embryos or even their creation outside the human body, which IVF requires. The Vatican is not opposed to using high tech medicine to create healthy babies.

In 2004, this is what a Vatican commission had to say: “Germ line genetic engineering with a therapeutic goal in man would in itself be acceptable were it not for the fact that it is hard to imagine how this could be achieved without disproportionate risks especially in the first experimental stage, such as the huge loss of embryos and the incidence of mishaps, and without the use of reproductive techniques. A possible alternative would be the use of gene therapy in the stem cells that produce a man’s sperm, whereby he can beget healthy offspring with his own seed by means of the conjugal act.”

It almost sounds here like the Vatican was suggesting the technique that is being developed. It should be noted that this statement was released while John Paul II was pope. It was drafted by a commission headed by Cardinal Ratzinger, who is now Benedict XVI.

One should not expect Catholics or any other religious community to lead a chorus of praise for human germline modification. At most, one might expect guarded comments from religious leaders, coupled with the demand that this technology be limited to therapy and not used for enhancement. But the key point is this: if human germline modification technology is developed, religious leaders may actually be open to its use.

But if it is developed for therapy, who really thinks it will be limited in that way? If it works to create a healthy baby, why not use it to create a better baby?

The article, entitled "Spermatogonial stem cell transplantation into Rhesus testes regenerates spermatogenesis producing functional sperm," appears in the November 2012 issue of the journal, Cell Stem Cell.

Thursday, October 25, 2012

Resveratrol and Enhancement? Not So Fast

Does resveratrol help healthy people become even healthier? Does it improve metabolic health and possibly even help us live longer?

A new study casts doubts on these hopes. In the October 25 issue of Cell Metabolism, researchers at Washington University School of Medicine publish the results of their study involving 29 healthy middle-aged women. They asked whether resveratrol boosts metabolic health. When they ran the tests and collected the evidence, the answer was simple: No.

The study divided the women into two groups. Fifteen were given 75 milligrams of resveratrol each day, the same as they would get in 8 liters (more than 10 bottles) of red wine. The other fourteen received a sugar-pill placebo.
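
That wine equivalence is easy to sanity-check. Below is a back-of-envelope sketch in Python; the resveratrol concentration and bottle size are assumptions chosen for illustration (actual resveratrol content varies widely from wine to wine), not figures from the study:

```python
# Back-of-envelope check of the trial's dose/wine equivalence.
# Assumed, illustrative values: red wine is often cited as containing
# somewhere around 1-13 mg of resveratrol per liter; ~9.4 mg/L is the
# value that makes the study's comparison work out.
DOSE_MG_PER_DAY = 75.0       # daily resveratrol dose given in the trial
ASSUMED_MG_PER_LITER = 9.4   # hypothetical average concentration in red wine
BOTTLE_LITERS = 0.75         # standard wine bottle

liters = DOSE_MG_PER_DAY / ASSUMED_MG_PER_LITER
bottles = liters / BOTTLE_LITERS
print(f"{liters:.1f} liters of wine per day, about {bottles:.0f} bottles")
# -> 8.0 liters of wine per day, about 11 bottles
```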

Researchers measured the women's sensitivity to insulin and the rate of glucose uptake. The result, according to Samuel Klein, senior investigator, is that "we were unable to detect any effect of resveratrol. In addition, we took small samples of muscle and fat tissue from these women to look for possible effects of resveratrol in the body's cells, and again, we could not find any changes in the signaling pathways involved in metabolism," Klein said in a press release issued by Washington University School of Medicine.

Photo Credit: Robert Boston. No usage restrictions.

This study is small, but what makes it interesting is that it involves healthy human beings. In nonhuman trials, resveratrol seems to enhance the health of healthy animals. And in human trials involving people with metabolic problems, resveratrol seems beneficial.

"Few studies have evaluated the effects of resveratrol in people," Klein explains. "Those studies were conducted in people with diabetes, older adults with impaired glucose tolerance or obese people who had more metabolic problems than the women we studied. So it is possible that resveratrol could have beneficial effects in people who are more metabolically abnormal than the subjects who participated in the study."

That point goes right to the heart of the human enhancement debate. Often, "enhancement" is distinguished from therapy. While therapy improves the health of the sick, enhancement improves the health of the healthy. This study seems to suggest that resveratrol may be therapeutic, but it is not an enhancement.

Right now, however, the picture is not completely clear. Those who drink red wine in moderation are less likely than others to develop heart disease and diabetes. Is it the resveratrol, the wine, or the interactions between them?

According to Klein, "We were unable to detect a metabolic benefit of resveratrol supplementation in our study population, but this does not preclude the possibility that resveratrol could have a synergistic effect when combined with other compounds in red wine."

The article, entitled "Resveratrol Supplementation Does Not Improve Metabolic Function in Nonobese Women with Normal Glucose Tolerance," appears in the October 25 issue of Cell Metabolism.

Thursday, October 4, 2012

Human-Neandertal Interbreeding: When and Where?

Comparison between Neandertal and anatomically modern human genomes shows a history of interbreeding. Some living human beings—those with ancestry in Europe and Asia—carry the results of that interbreeding in their DNA. Those with ancestry in sub-Saharan Africa typically do not.

We also know that Neandertals lived in Eurasia from 230,000 until about 30,000 years ago. Where they came from or why they disappeared remains an open question. And we know that anatomically modern humans first appear in Africa at least 200,000 years ago. Some of them made their way to Asia and Europe sometime in the last 100,000 years.

So when did modern human/Neandertal interbreeding last occur? Did it occur deep in our past, before modern humans and Neandertal ancestors left Africa? Or did it occur after both left Africa, sometime—in other words—within the past 100,000 years?

A new study claims to find evidence that the interbreeding occurred after the migration out of Africa. On the basis of careful analysis of the shared DNA, researchers argue that the most recent interbreeding occurred sometime between 37,000 and 86,000 years ago.
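
The general logic behind dating interbreeding from shared DNA can be shown with a toy calculation. Recombination breaks inherited Neandertal segments into shorter pieces every generation, so the average length of the surviving segments indexes the time since mixing. The sketch below is only that general logic, not the paper's actual statistical method (which models the decay of linkage disequilibrium), and the tract length and generation times are hypothetical inputs:

```python
# Toy admixture-dating logic: after t generations of recombination, the
# expected genetic length of an introgressed tract is roughly 1/t Morgans
# (about one crossover per Morgan per generation). Inverting gives t.
# All numbers below are hypothetical, chosen for illustration only.
mean_tract_morgans = 0.0004             # assumed mean Neandertal tract length

generations = 1.0 / mean_tract_morgans  # ~2,500 generations since admixture

for gen_years in (25, 29):              # plausible human generation times
    years = generations * gen_years
    print(f"generation time {gen_years} yr -> ~{years:,.0f} years ago")
# -> ~62,500 to ~72,500 years ago, inside the paper's 37,000-86,000 window
```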

Caption: Reconstruction of a Neandertal, 2006, by Stefan Scheer, from Stefanie Krull, Neanderthal Museum Picture Library, Mettmann, Germany

If so, it is pretty strong evidence that the interbreeding occurred after anatomically modern humans left Africa. This may have occurred in the Middle East, researchers point out, but probably not just at the beginning of the modern human migration out of Africa. The most recent interbreeding, they conclude, occurred well after that migration began, suggesting "a more recent period, possibly when modern humans carrying Upper Paleolithic technologies expanded out of Africa."

In that case, the conceptual challenge posed by the modern human/Neandertal interbreeding remains clearly in front of us. What is the human species? Were Neandertals human? And what are we to make of our new insight into modern human diversity? All puzzling questions, to put it mildly.

The article, "The Date of Interbreeding between Neandertals and Modern Humans," is published in the current issue of PLOS Genetics, where it is available free to the public.

Engineered Eggs

Researchers in Japan have reported success in generating mouse eggs or oocytes from pluripotent stem cells. When fertilized, these induced eggs grew into live, healthy pups capable of producing their own offspring. The work is reported in the October 5 issue of the journal Science.

The research team used two different types of pluripotent cells, embryonic and induced. In both cases, they were able to produce primordial germ cell-like cells, the precursors of egg cells. Once they produced these cells and grew them in clusters, they implanted them into the bodies of female mice, where they developed into cell structures that functioned like ovaries. From these reconstituted ovaries, researchers harvested mature oocytes, much as they would for in vitro fertilization (IVF).

The next step, predictably, was to fertilize these eggs and implant them in surrogate mother mice. Once born, the pups developed and were allowed to breed, producing viable offspring.

Pups from ES-oocyte. Female offspring from primordial germ cell-like cell-derived oocytes were fully fertile. Courtesy of Katsuhiko Hayashi.

The most immediate impact of this research will be to advance our understanding of the fundamentals of reproductive biology, especially the development of egg cells. If similar strategies will work with human pluripotent stem cells—especially induced cells—this research may open new approaches for reproductive medicine in the years ahead.

What other possibilities might there be? Again, if the work can be replicated in human beings, two things might happen. Somewhat more remote is the possibility that this strategy will be used for the purposes of human germline modification or so-called “designer babies.” For example, pluripotent stem cells might be genetically modified before they are induced to become the source of oocytes. The modification could be to avoid a disease or for the purposes of enhancement.

More likely, of course, is that this strategy will be used to create human oocytes for research purposes. For example, human induced ovary-like cells could be implanted into a mouse or other nonhuman animal, grown to the right stage of development, then “harvested” in order to collect a significant number of oocytes.

Today, research in certain areas is hampered because of limited supplies of human oocytes. One area that comes to mind is nuclear transfer or cloning. While “Dolly” the sheep is now only a distant memory, this advance brings closer the possibility that with an ample supply of human oocytes for experimentation, researchers will learn how to create human clones reliably.

So the big question is whether this research can be replicated in humans. On that point, here's how the article concludes: "our system serves as a robust foundation to investigate and further reconstitute female germline development in vitro, not only in mice, but also in other mammals, including humans."

The article, entitled "Offspring from Oocytes Derived from in vitro Primordial Germ Cell-like Cells in Mice," appears in the 5 October 2012 issue of the journal, Science.

A New Source for New Neurons

The day when stem cell research will give us treatments for common brain disorders such as Parkinson’s or Alzheimer’s just got a little closer. So, by the way, did the day when this research will be used to enhance the capacities of the normal or healthy human brain. The latest advance comes from an international team based mostly in Germany, which has figured out a way to generate new neurons from cells that already exist in the human brain.

The human brain naturally contains specialized cells called pericytes. Usually they are located at the edge of the capillaries that carry blood to the brain. They play a vital role in maintaining the blood-brain barrier.

Neurons. Photo from National Institutes of Health.

Now, thanks to the discovery reported in the October 5 issue of Cell Stem Cell, pericytes might be about to learn a new trick: forming new neurons. Using stem cell reprogramming techniques, researchers learned that two factors—Sox2 and Mash1—would induce pericytes to change their developmental state and begin to function as newly-formed neurons.

According to the article, “these induced neuronal cells acquire the ability of repetitive action potential firing and serve as synaptic targets for other neurons, indicating their capability of integrating into neuronal networks.” In other words, they do what neurons normally do. They process signals from one end of the cell to another. They form synaptic connections with other neurons. And they integrate into larger networks.

Will this become a new strategy for treating diseases or injuries to brain cells? That is the hope, but difficult challenges remain. How can living pericytes in a functioning human brain be targeted and induced to become neurons? If they generate new neurons, will they function properly? Will they integrate themselves into a functioning brain, preferably taking up the cognitive processes that are lost because of disease or injury?

The authors conclude that “much needs to be learned” but that “our data provide strong support for the notion that neuronal reprogramming of cells of pericytic origin within the damaged brain may become a viable approach to replace degenerated neurons.”

According to Benedikt Berninger of the Johannes Gutenberg University in Mainz, a leader in the research team, “The ultimate goal we have in mind is that this may one day enable us to induce such conversion within the brain itself and thus provide a novel strategy for repairing the injured or diseased brain."

That may be the goal, but it's hard to imagine this research will be limited to therapy. In fact it may turn out to be easier to use it to enhance the cognitive capacity of normal or healthy aging brains than it is to treat disease. Anything that stimulates the growth of new neurons is likely to be very appealing to aging adults.

If human stem cell research is to reach its full promise, many more advances like this will have to occur. With each advance, however, comes growing confidence that the promise of the field, however challenging, is not hype.

The article entitled “Reprogramming of pericyte-derived cells of the adult human brain into induced neuronal cells” is published in the October 4, 2012 issue of Cell Stem Cell.

Thursday, August 30, 2012

Denisovan DNA in Focus

Using new techniques to study ancient DNA, scientists have unraveled the genetic details of a young girl who lived in central Asia around 50,000 years ago. She belongs to a distinct branch of the human family called the Denisovans, named for the cave where her remains were found in 2008.

What makes the research all the more startling is that only two teeth and one pea-size bone fragment have been found. But from those tiny fragments of humanity, the story of the Denisovans is being pieced together.

The new techniques were developed by Matthias Meyer, working at the Department of Evolutionary Genetics, Max Planck Institute for Evolutionary Anthropology in Leipzig, a research program led by Svante PƤƤbo. DNA extracted from the bone fragment was separated into two strands that were amplified and analyzed separately, many times over, until a highly reliable sequence was determined.
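
Why does reading each strand many times over yield a reliable sequence? The intuition can be captured in a few lines of code: each position is covered by many independent, error-prone reads, and a simple majority vote across them recovers the true base with high confidence. This is only a schematic sketch of that idea, not the Leipzig group's actual pipeline, which also models ancient-DNA damage patterns:

```python
from collections import Counter

def consensus(reads):
    """Majority-vote consensus across aligned reads of one DNA fragment.

    Schematic only: real ancient-DNA pipelines also account for
    strand-specific damage (such as C-to-T changes at fragment ends),
    base quality, and mapping confidence.
    """
    result = []
    for column in zip(*reads):  # walk the alignment one position at a time
        base, count = Counter(column).most_common(1)[0]
        result.append(base if count > len(column) / 2 else "N")  # N = unsure
    return "".join(result)

# Five noisy reads of the same short fragment, each with a different error.
reads = ["ACGTTAGC", "ACGTTAGC", "ACCTTAGC", "ACGTTAGC", "ACGTAAGC"]
print(consensus(reads))  # -> ACGTTAGC: the isolated errors are outvoted
```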

Laboratory for the extraction of ancient DNA. [Image courtesy of Max Planck Institute for Evolutionary Anthropology].

Researchers claim that the result is as complete and accurate as the genome sequences of living human beings. Already, the new technique is being used to study other ancient remains, including samples of Neandertal DNA. Denisovans and Neandertals, distinct but closely related forms of humanity, overlapped with anatomically modern humans (AMH) and interbred with them.

New methods in genetics, including the technical breakthrough described in this paper, are opening new windows on the human family tree, which resembles an inter-grown vine more than a straight line of branches.

So accurate is the genetic analysis that researchers can reach some conclusions about other Denisovans, even though no samples exist for them. For one thing, despite their wide geographic spread, they apparently never reached high numbers. Their DNA lives on today in the faint echo of ancient interbreeding found in the uniquely-Denisovan sequences carried by those who live in the islands of southeast Asia.

No one knows what Denisovans looked like, but they probably resembled us in many ways. The Denisovan girl whose DNA was studied carried genes that are associated today with brown hair, brown eyes, and dark skin. Like us, they had 23 pairs of chromosomes (compared to chimps with 24), making interbreeding more readily possible.

Denisova molar, distal. [Image courtesy of Max Planck Institute for Evolutionary Anthropology].

One of the more tantalizing aspects of the report is the light it sheds not on the Denisovans or the Neandertals but on us anatomically modern human beings who live today. Why did we survive and flourish culturally when they did not?

One explanation may lie in the genetic differences between us and them, which can be studied for the first time in detail. In this paper, researchers identify specific changes in genes that are associated with brain complexity, synaptic connections, and speech development. According to the paper, “it is thus tempting to speculate that crucial aspects of synaptic transmission may have changed in modern humans.” In other words, tiny differences in DNA led to still relatively small differences in brain function that led to huge differences in culture.

Future technical advances will continue to shed new light on the complex story of recent human ancestry. By gaining ever-higher clarity on the genetic differences between Neandertals, Denisovans, and modern humans, we will come to know the story of our humanity in greater detail.

The paper ends with this reflection: “This [work] should ultimately aid in determining how it was that modern humans came to expand dramatically in population size as well as cultural complexity while archaic humans eventually dwindled in numbers and became physically extinct.” The paper, “A High-Coverage Genome Sequence from an Archaic Denisovan Individual,” is published in the 30 August 2012 issue of Science, published by the American Association for the Advancement of Science.

Wednesday, July 18, 2012

Neandertal Medicine

Neandertals not only ate their vegetables. They used specific plants—even ones that tasted bitter—to treat their ailments. That’s the latest finding from the international team of researchers studying Neandertal remains at the El Sidrón archeological site in northern Spain. Discovered in 1994, El Sidrón has yielded thousands of samples from at least 13 Neandertal individuals.

Using newer techniques of microanalysis, the team studied the dental plaque recovered from teeth of five individuals dating about 50,000 years ago. Lodged in the plaque were tiny microfossil remains of various plants, providing evidence that Neandertals supplemented their diet of meat with a wide range of grain, herbs, and vegetables. The study is published this week in Naturwissenschaften (The Science of Nature).

CAPTION: Researchers working in El SidrĆ³n Cave. Credit: CSIC ComunicaciĆ³n.

"The varied use of plants we identified suggests that the Neanderthal occupants of El SidrĆ³n had a sophisticated knowledge of their natural surroundings which included the ability to select and use certain plants for their nutritional value and for self-medication. While meat was clearly important, our research points to an even more complex diet than has previously been supposed," according to Karen Hardy, a leader in the research team, according to a press release from the University of York.

Neandertals disappeared from Europe and Asia somewhere around 30,000 years ago, after sharing regions with modern humans for thousands of years. Only recently has it become clear that they depended heavily on plants as well as meat for their food.

"The evidence indicating this individual was eating bitter-tasting plants such as yarrow and camomile with little nutritional value is surprising. We know that Neanderthals would find these plants bitter, so it is likely these plants must have been selected for reasons other than taste," said Dr Stephen Buckley, a member of the research team.

The clear implication of the study—that Neandertals recognized the medicinal value of certain plants—provides further evidence of the sophistication of Neanderthal culture and technology. The full scope of Neandertal cultural interaction with modern humans remains an open question.

"El SidrĆ³n has allowed us to banish many of the preconceptions we had of Neanderthals. Thanks to previous studies, we know that they looked after the sick, buried their dead and decorated their bodies. Now another dimension has been added relating to their diet and self-medication," according to Antonio Rosas, also on the research team.

CAPTION: Microscopically visible material entrapped in dental calculus samples – filamentous and cocci bacteria. Credit: Karen Hardy/Naturwissenschaften.

The article, "Neanderthal medics? Evidence for food, cooking and medicinal plants entrapped in dental calculus," is published in the current issue of Naturwissenschafen.

Thursday, June 14, 2012

Art in an Age of Neandertals

The oldest confirmed date for cave art has just been pushed back again. New research re-dates paintings in caves in northern Spain to a staggering 40,800 years ago, right at the time when anatomically modern humans (AMHs) like us were just arriving in the region and encountering Neandertals.

In fact, this art is so ancient that it raises the haunting possibility that Neandertals were the painters. If so, then modern humans are not the only form of humanity to create cave art. For now, however, the question of who painted this art is a matter of speculation, something that might be settled by further research.

In work published in the June 15, 2012 issue of Science, researchers studied 50 paintings in eleven caves in the northernmost part of Spain, including the UNESCO World Heritage sites of Altamira, El Castillo and Tito Bustillo. The work was conducted by an international team led by Alistair Pike of the University of Bristol.

The Corredor de los Puntos, El Castillo Cave, Spain. Red disks here have been dated to 34,000-36,000 years ago, and elsewhere in the cave to 40,600 years, making them examples of Europe's earliest cave art. Image courtesy of Pedro Saura.

This research comes on the heels of a re-dating of cave painting in France, recently pushed back to 37,000 years. The latest study adds almost another 4,000 years to the confirmed date of the oldest art. What’s the combined effect of the two studies? In just the past month, our view of the antiquity of art has jumped by nearly 10,000 years, prompting us to wonder how much further back it might go. After all, it is known that AMHs mixed pigments as far back as 100,000 years ago.

Using a new method called uranium-thorium dating, Pike’s research team took a closer look at an old find. They extracted tiny samples of naturally forming deposits that covered the paintings. By dating the deposits, scientists are able to discover the date before which the paint was applied. The date of more than forty thousand years ago, therefore, is a minimum date, suggesting that some of the paintings—here or elsewhere—may be even older.
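
For the curious, the arithmetic behind uranium-thorium dating can be sketched in a few lines. As uranium trapped in the calcite decays, radioactive thorium accumulates, and the measured activity ratio fixes the age. The code below solves the standard age equation numerically; it assumes a clean sample with no initial thorium, and the input ratio is illustrative, not a measurement from this study:

```python
import math

# Approximate decay constants (per year) from published half-lives:
# 230Th ~ 75,690 yr, 234U ~ 245,250 yr.
L230 = math.log(2) / 75_690
L234 = math.log(2) / 245_250

def predicted_ratio(t, d234=0.0):
    """(230Th/238U) activity ratio after t years, assuming no initial 230Th.

    d234 is the measured deviation of 234U/238U from equilibrium (per mil).
    """
    return (1 - math.exp(-L230 * t)
            + (d234 / 1000.0) * (L230 / (L230 - L234))
            * (1 - math.exp(-(L230 - L234) * t)))

def u_th_age(measured, d234=0.0, lo=0.0, hi=600_000.0):
    """Solve predicted_ratio(t) = measured for t by bisection."""
    for _ in range(80):
        mid = (lo + hi) / 2
        if predicted_ratio(mid, d234) < measured:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# Illustrative input only: a measured activity ratio of 0.312.
print(f"minimum age: {u_th_age(0.312):,.0f} years")  # -> roughly 40,800 years
```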

The specific painting that exceeds 40,800 years is a simple red disk, seemingly primitive when compared to paintings made later in the same caves. More striking are the handprint paintings on the wall of El Castillo cave, made by blowing paint and common in early cave art but now dated to 37,300 years ago.

Commenting on the age of the oldest painting, Pike pointed out the tight fit between the painting and the arrival of AMHs in northern Spain: “Evidence for modern humans in Northern Spain dates back to 41,500 years ago, and before them were Neanderthals. Our results show that either modern humans arrived with painting already part of their cultural activity or it developed very shortly after, perhaps in response to competition with Neanderthals – or perhaps the art is Neanderthal art,” Pike said in a press release issued by the University of Bristol.

The Panel of Hands, El Castillo Cave, Spain. A hand stencil has been dated to earlier than 37,300 years ago and a red disk to earlier than 40,600 years ago, making them the oldest cave paintings in Europe. Image courtesy of Pedro Saura.

Pike also speculated further on the possibility that researchers may someday identify some European cave art as Neandertal. He suggested that perhaps, “cave painting started before the arrival of modern humans, and was done by Neanderthals. That would be a fantastic find as it would mean the hand stencils on the walls of the caves are outlines of Neanderthals' hands, but we will need to date more examples to see if this is the case."

The article is entitled “U-series dating of Palaeolithic Art in 11 Caves in Spain” and appears in the June 15, 2012 issue of the journal Science.

Wednesday, May 16, 2012

Merging Humans and Robots: More Coffee, Please

With the help of a tiny chip implanted in the brain, human beings who cannot move their own limbs are able to move a robotic arm. In one case, a woman took a drink of coffee on her own for the first time in fifteen years.

"The smile on her face was a remarkable thing to see. For all of us involved, we were encouraged that the research is making the kind of progress that we had all hoped," said the trial's lead investigator, Leigh Hochberg, M.D., Ph.D., in a press release issued by the National Institutes of Health, which provided some of the funding. Hochberg is an associate professor of engineering at Brown University and a critical care neurologist at Massachusetts General Hospital (MGH)/Harvard Medical School.

The field of brain-computer interface research is not new, but this is the first peer-reviewed report of people using brain signals to control a robotic arm, making it perform in three-dimensional space much as their natural arms once did. By imagining they were controlling their paralyzed limb, they were able to move the robotic arm. Brain activity is detected as electrical activity by the BrainGate chip, processed by an external computer, and fed into a robot that translates the signals into movement.
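
To make the decoding step concrete, here is a toy sketch: treat each channel's firing rate as a feature and fit a linear map from rates to a 3-D velocity command using calibration data. Real BrainGate decoding is considerably more sophisticated (Kalman-filter-based and continuously recalibrated), and every number below is invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Calibration: record firing rates while the user imagines known movements.
n_channels, n_samples = 96, 500      # ~100 electrodes on the implanted chip
true_map = rng.normal(size=(n_channels, 3))   # unknown neural-to-velocity map
rates = rng.poisson(10, size=(n_samples, n_channels)).astype(float)
velocities = rates @ true_map + rng.normal(scale=5.0, size=(n_samples, 3))

# Fit the decoder by least squares: velocity ~= rates @ W.
W, *_ = np.linalg.lstsq(rates, velocities, rcond=None)

# --- Online use: each new bin of firing rates becomes a 3-D arm velocity.
new_rates = rng.poisson(10, size=(1, n_channels)).astype(float)
vx, vy, vz = (new_rates @ W)[0]
print(f"commanded velocity: ({vx:.2f}, {vy:.2f}, {vz:.2f})")
```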

More research is underway, and in fact this clinical trial is recruiting more volunteers.

Caption: The BrainGate array, which is implanted on the motor cortex, comprises nearly 100 electrodes on a chip the size of a baby aspirin. Credit: www.braingate2.org Usage Restrictions: With Credit.

With future advances, researchers hope to be able to improve the quality of movement in prosthetic limbs or to restore in part the function of paralyzed limbs, perhaps by creating an electronic by-pass to normal nerves.

"This is another big jump forward to control the movements of a robotic arm in three-dimensional space. We're getting closer to restoring some level of everyday function to people with limb paralysis," said John Donoghue, Ph.D., who leads the development of BrainGate technology and is the director of the Institute for Brain Science at Brown University.

Beyond therapy, it is possible to imagine other uses as we humans and our machines co-evolve and increasingly converge, probably to do more than drink coffee.

This report is published in the May 17, 2012 issue of Nature.

Monday, May 14, 2012

How Old Is Art?

Confirmed dates for the world’s oldest art just got older, according to the report of an international research team published in the May 14 issue of the Proceedings of the National Academy of Sciences.

Dating back about 37,000 years, the art consists of engravings made in stone that has since fallen from the ceiling of a cave at Abri Castanet in southwestern France. While not as visually arresting as the more famous cave art found at Chauvet, the Castanet engravings are both older and represent what is very likely an earlier stage in the history of the Aurignacian culture, which spanned 40,000 to about 28,000 years ago. Some of the Chauvet paintings are now confirmed at between 30,000 and 32,000 years ago.

Credit: HTO. A replica of a painting, now in the public domain.

The Castanet engravings are artistically simpler and were located in the general living area of the cave. The Aurignacian culture that created both the paintings and the engravings is known for its many forms of art. According to New York University anthropology professor Randall White, one of the study's co-authors, the Aurignacians "had relatively complex social identities communicated through personal ornamentation, and they practiced sculpture and graphic arts."

"But unlike the Chauvet paintings and engravings, which are deep underground and away from living areas, the engravings and paintings at Castanet are directly associated with everyday life, given their proximity to tools, fireplaces, bone and antler tool production, and ornament workshops," White said in press release issued by NYU.

With more refined archeological techniques, the story of the rise of human symbolic culture is likely to become more complex and more ancient. While there may well have been bursts of cultural creativity in which symbolic advance occurred rapidly, additional findings may also suggest a more steady rise in the story of human art. The study, entitled “Context and dating of Aurignacian vulvar representations from Abri Castanet, France,” appears in the May 14, 2012 edition of PNAS.

Thursday, May 3, 2012

Human Intelligence: Does It Depend on a Genetic Error?

What makes humans different from the great apes? What makes our brains larger and more complex? We know that our DNA is remarkably similar to that of other mammals. What subtle genetic changes can explain such huge behavioral differences? One surprising possibility is that our brains are bigger and more complex not so much because of new genes but because of gene duplication.

One gene in particular—SRGAP2—plays a role in how brain cells migrate. It is found widely in mammals of all sorts, from mice to humans. In the great apes, the more archaic form of SRGAP2 results in a relatively slow spread of neurons throughout the brain. Twice in the ancient past, however, SRGAP2 was duplicated, first about 3.4 million years ago and then again around 2.4 million years ago. The second duplication occurred right around the time when the genus Homo separated from Australopithecus. It appears that as a result of these duplications, brains in the Homo lineage—including our own as Homo sapiens—are both large and complex in their number of neuronal connections and in their ability to process information.

A key piece of supporting evidence comes from recent discoveries of the role of SRGAP2 in the development of the human neocortex. When the distinctly human SRGAP2 variants are missing, normal human brain development is impaired. This research appears in two papers appearing May 3, 2012 in the journal Cell. According to one of the papers, “It is intriguing that the general timing of the potentially functional copies…corresponds to the emergence of the genus Homo from Australopithecus (2-3 mya). This period of human evolution has been associated with the expansion of the neocortex and the use of stone tools, as well as dramatic changes in behavior and culture.”

Caption: A team led by Scripps Research Institute scientists has found evidence that, as humans evolved, an extra copy of a brain-development gene allowed neurons to migrate farther and develop more connections. Credit: Photo courtesy of The Scripps Research Institute. Usage Restrictions: None

The uniquely human duplications work in surprising ways, especially the second duplication. The original SRGAP2 remains present in humans today, along with the duplicated versions. The second duplication—SRGAP2C—has the effect of interfering with the original SRGAP2. SRGAP2C interferes with SRGAP2 rather than boosting it because the duplicated version is incomplete—in other words, an advantageous copying error.

According to one of the studies, once SRGAP2C appeared about 2.4 million years ago, it created a “dominant negative interaction equivalent to a knockdown of the ancestral copy…The incomplete nature of the segmental duplication was, therefore, ideal to establish the new function by virtue of its structure,” acting in a way that was “instantaneous” in terms of evolution.

"This innovation couldn't have happened without that incomplete duplication," according to Evan Eichler, another leader in the research team. "Our data suggest a mechanism where incomplete duplication of this gene created a novel function 'at birth'."

Even though SRGAP2 duplications seem to play a significant role in distinguishing human beings from the apes, other duplications and mutations are very likely to be involved in the story of human evolution. "There are approximately 30 genes that were selectively duplicated in humans," said Franck Polleux, one of the lead researchers involved in the study, in a press release from the journal. "These are some of our most recent genomic innovations."

Rather than standard mutations, "episodic and large duplication events could have allowed for radical – potentially earth-shattering – changes in brain development and brain function," according to Eichler. For these reasons, this is one of the most intriguing areas for research into the origins of human intelligence.

Whether other duplications—including incomplete duplications or erroneous copies—also explain our complex brains is something that will be discovered in the next few years.

But what is surprising and somewhat sobering, just based on this SRGAP2 discovery, is how our much-vaunted human uniqueness seems to hang on such a fine thread. If the SRGAP2 duplication is even partly responsible for our complex brains, should we think that our intelligence arose because of a copying error or an incomplete duplication? Is the rise of intelligence and consciousness—truly one of the great events in the story of cosmic evolution—really just based in part on a fluke of nature? Religious or not, hardly anyone is likely to think that thinking is sheer accident.

The papers, Charrier et al.: "Inhibition of SRGAP2 function by its human-specific paralogs induces neoteny during spine maturation" and Dennis et al.: "Human-specific evolution of novel SRGAP2 genes by incomplete segmental duplication," appear in the journal Cell.

Tuesday, May 1, 2012

Extending Healthy Lifespans? A Pill on the Horizon?

Resveratrol, the much-hyped ingredient found in red wine and sold widely as a nutritional supplement, is known to improve the health and extend the lifespan of mice. Can it do the same for humans? Without nasty side effects? And at what dose?

A study published today in Cell Metabolism helps unravel a few more of resveratrol’s mysteries. In particular, researchers have shed new light on how resveratrol works. Key to its effectiveness is a gene known as SIRT1, found in slightly different forms in species as different as yeast and humans. SIRT1 plays many roles, some tied to core metabolic processes. The new study shows that in mice, even a low dose of resveratrol interacts with SIRT1 to improve metabolism.

What makes this study especially interesting is that researchers had to create a special strain of mice in order to test whether SIRT1 is necessary for resveratrol to work. If mice have no SIRT1, they do not develop properly. So two graduate students, Nathan Price and Ana Gomes, developed a novel strain of mice with an unusual copy of the SIRT1 gene, one that could be switched off at adulthood.

By administering a drug (tamoxifen), researchers can “induce” or switch the SIRT1 gene on and off, a strategy that will likely be used in other studies. "This is a drug inducible, whole body deletion of a gene," David Sinclair, the study's senior author, said in a press release from Harvard Medical School. "This is something that's rarely been done so efficiently. Moving forward, this mouse model will be valuable to many different labs for other areas of research."

Photo by R. Cole-Turner

In this case, the switchable SIRT1 mouse provided proof that SIRT1 is key to resveratrol’s effectiveness. Why is that important? Because resveratrol is a complex molecule that interacts with the body in many unknown ways. While it may be beneficial, it may have unwanted side effects. So researchers are trying to design a simpler molecule that provides the benefits of resveratrol without all the risks. One strategy is to boost SIRT1 activity. By proving that SIRT1 is involved, this study provides support for that strategy, which is already being pursued by pharmaceutical firms.

"The results were surprisingly clear," said. "Without the mitochondria-boosting gene SIRT1, resveratrol does not work."

Are we any nearer a magic pill that slows aging or promotes longevity? Perhaps. The headline of the press release from the publisher, Cell Press, claims that this work “restores hope for anti-aging pill.” Remember, of course, that the work reported here is entirely with mice.

Even so, the paper itself concludes with this statement: “This model supports the enticing possibility of designing and developing potent small molecules that provide the health benefits of resveratrol by activating SIRT1 and downstream pathways to treat metabolic and other age-related diseases.”

The treatment of age-related diseases, including diabetes, is a huge target for pharmaceutical firms. But beyond that lies the even bigger market for human enhancement, specifically for extending the span of healthy decades.

The study, "SIRT1 Is Required for AMPK Activation and the Beneficial Effects of Resveratrol on Mitochondrial Function," appears in the May 1, 2012 issue of Cell Metabolism.

Thursday, April 26, 2012

Spreading Farming, Spreading Genes

Agriculture probably originated in the Middle East about 11,000 years ago. Over the next six thousand years, it spread to other parts of the globe, including northern Europe, gradually replacing hunting and gathering as the primary means of human survival.

How did it spread? Were hunter-gatherers converted to the efficiencies of agriculture? Or did farmers from the south spread north, bringing their agriculture with them?

A new study suggests that farming spread because farmers moved. The movement was slow, taking five to six thousand years to reach Scandinavia. Early in the process, farmers of southern ancestry lived side by side with their more northerly human cousins, who still lived by hunting and gathering. Eventually, after a thousand years or so, farmers interbred with hunter-gatherers and farming became the dominant way of life.

The new study, which appears in the April 27 issue of Science, is based on an analysis of four skeletons, all found in Sweden and dating from about 5,000 years ago. Three were hunter-gatherers and one was a farmer. All of them lived their entire lives close to where they were buried, the hunter-gatherers in flat graves and the farmer under a stone megalith like the one pictured below.

Caption: Several hundred megalith tombs are known from the Falbygden area, including Gökhem and Valle parishes in Östergötland, Sweden. Credit: Göran Burenhult

"We know that the hunter-gatherer remains were buried in flat-bed grave sites, in stark contrast to the megalithic sites that the farmers built," said Mattias Jakobsson, a senior author from Uppsala University. "The farmer we analyzed was buried under such a megalith, and that's just one difference that helps distinguish the two cultures," Jakobsson said in a press release issued by the journal.

What is most significant in this study comes from an analysis of the human DNA extracted from the four skeletons. By studying their DNA, researchers found that the farmer belonged to a community with ancestral roots in the eastern Mediterranean, most closely resembling today's Greeks and Cypriots. The hunter-gatherers, on the other hand, were more like today’s northern Europeans, most closely resembling today's Finns. "What is interesting and surprising is that Stone Age farmers and hunter-gatherers from the same time had entirely different genetic backgrounds and lived side by side for more than a thousand years, to finally interbreed," Jakobsson said.

Caption: The skeleton belongs to a young female in her 20s, and can be dated to around 4,700 years ago. Credit: Gƶran Burenhult

"The results suggest that agriculture spread across Europe in concert with a migration of people," added Pontus Skoglund, also of Uppsala University. "If farming had spread solely as a cultural process, we would not expect to see a farmer in the north with such genetic affinity to southern populations."

The article, entitled "Origins and Genetic Legacy of Neolithic Farmers and Hunter-Gatherers in Europe," appears in the April 27 issue of Science, published by the American Association for the Advancement of Science.

Thursday, April 19, 2012

Synthetic Biology: Is There Life beyond DNA?

Life as we know it is based on DNA and RNA. Could it have been otherwise? Might other worlds have life based on a different “genetic” system? We may never know for sure.

But we do know that synthetic biology is moving briskly toward the goal of engineered life beyond DNA and RNA.

Recall that in “DNA” and “RNA,” the “NA” part stands for “nucleic acids.” It’s the four bases of the nucleic acids that carry the genetic information in a chemical code. The “D” and the “R,” however, stand for the sugars (deoxyribose and ribose) that hold the bases in place, allowing them to form pairs and to copy themselves. Can other sugars work as well?

Recent work in synthetic biology has led beyond DNA and RNA to xeno-nucleic acids or “XNAs.” Now, using six different forms of XNAs, an international team of researchers led by Vitor Pinheiro reports success in getting XNAs to store and propagate information. One of their XNAs actually “evolved” by responding to imposed selective constraints. Their work is published in the April 20, 2012 issue of the journal Science.

Caption: Courtesy--National Human Genome Research Institute

In a commentary on the research, Gerald F. Joyce writes in Science that this work has implications for the “understanding of life itself.” In addition, it opens new insight into the possible origins of life on our planet or elsewhere in the cosmos.

At the same time, far more work lies ahead before synthetic biologists create XNA-based life. Pinheiro’s team was able to get their synthetic XNA “genes” to interact with DNA, but “they have not yet realized a synthetic genetic system.” One big challenge is in getting XNA sequences to copy themselves the way DNA does. Some XNAs can copy themselves to DNA and back again to XNA, but XNA-to-XNA copying is not reliable.

According to Joyce, however, “future studies are likely to yield improvements of the various XNA-to-XNA copying reactions.” If that happens, synthetic biology will take yet another step toward “synthetic genetics.”

All this prompts a warning from Joyce: “Synthetic biologists are beginning to frolic on the worlds of alternative genetics but must not tread into areas that have the potential to harm our biology.” As ever, greater knowledge brings greater risks. More than ever, public awareness and careful thought are needed.

The research article, "Synthetic Genetic Polymers Capable of Heredity and Evolution," and the commentary, "Toward an Alternative Biology," are both published in the April 20, 2012 issue of Science, the journal of the American Association for the Advancement of Science.

Monday, April 2, 2012

A Million Years of Fire

One of our newest technologies has just shed new light on one of our oldest.

When did our human ancestors learn to control and use fire? Armed with the latest high-tech tools, an international team of researchers has pushed the date back to 1 million years ago. That’s 300,000 years earlier than previous unambiguous dates.

The massive Wonderwerk Cave is in northern South Africa on the edge of the Kalahari. Previous excavations have shown extensive human occupation. Using the new techniques of micromorphological analysis and Fourier transform infrared microspectroscopy (mFTIR), researchers analyzed cave sediments at a far more detailed level than was previously possible.

Caption: This is a panoramic view of the entrance to Wonderwerk Cave, South Africa. Credit: H. Ruther. Usage Restrictions: None

In the cave sediments researchers found bits of ash from plants along with fragments of burned bone. Did the wind blow burning debris into the cave? Apparently not: the evidence—collected about 100 feet from the current opening of the cave—supports the conclusion that the fire burned inside the cave. Also part of the proof: the surrounding surfaces are discolored.

"The analysis pushes the timing for the human use of fire back by 300,000 years, suggesting that human ancestors as early as Homo erectus may have begun using fire as part of their way of life," anthropologist Michael Chazan said in a press release from the University of Toronto.

According to the paper, "Through the application of micromorphological analysis and Fourier transform infrared microspectroscopy (mFTIR) of intact sediments and examination of associated archaeological finds—fauna, lithics, and macrobotanical remains—we provide unambiguous evidence in the form of burned bone and ashed plant remains that burning events took place in Wonderwerk Cave during the early Acheulean occupation, approximately 1.0 Ma. To date, to the best of our knowledge, this is the earliest secure evidence for burning in an archaeological context."

Caption: Interior of Wonderwerk Cave. Images courtesy of M. Chazan.

"The control of fire would have been a major turning point in human evolution," says Chazan. "The impact of cooking food is well documented, but the impact of control over fire would have touched all elements of human society. Socializing around a camp fire might actually be an essential aspect of what makes us human."

How important are fire and cooking for human evolution? A recent book, Catching Fire: How Cooking Made Us Human by Richard Wrangham, argues that cooking is essential to our humanity. Now, in the paper published on April 2, the team concludes that its study “is the most compelling evidence to date offering some support for the cooking hypothesis of Wrangham.”

Their work is published as “Microstratigraphic evidence of in situ fire in the Acheulean strata of Wonderwerk Cave, Northern Cape Province, South Africa,” in the April 2, 2012 issue of the Proceedings of the National Academy of Sciences.

Wednesday, March 28, 2012

Planets by the Billions

New calculations suggest that there may be as many as 60 billion habitable planets in our own Milky Way galaxy alone, maybe more. At least 100 of these planets may be as close as 30 light-years away.

Researchers at the European Southern Observatory (ESO) carefully sampled 102 red dwarf stars in the southern skies. Red dwarfs are very common in our galaxy. Based on observations and calculations, the ESO team estimates that approximately 40% of red dwarf stars are orbited by a super-Earth in what astronomers call the “habitable zone,” meaning the planet is neither too close to nor too far from its sun. In particular, it means that the temperature may be right for liquid water to exist on the planet’s surface.

The ESO project is the work of an international team using observations with the HARPS spectrograph on the 3.6-metre telescope at ESO's La Silla Observatory in Chile. It is published in the March 28 issue of Astronomy & Astrophysics.

Caption: This artist's impression shows a sunset seen from the super-Earth Gliese 667 Cc. The brightest star in the sky is the red dwarf Gliese 667 C, which is part of a triple star system. The other two more distant stars, Gliese 667 A and B, also appear in the sky to the right. Astronomers have estimated that there are tens of billions of such rocky worlds orbiting faint red dwarf stars in the Milky Way alone. Credit: ESO/L. Calçada. Usage Restrictions: None

"Our new observations with HARPS mean that about 40% of all red dwarf stars have a super-Earth orbiting in the habitable zone where liquid water can exist on the surface of the planet," says Xavier Bonfils in a press release issued by ESO.

"Because red dwarfs are so common — there are about 160 billion of them in the Milky Way — this leads us to the astonishing result that there are tens of billions of these planets in our galaxy alone," according to Bonfils.

"The habitable zone around a red dwarf, where the temperature is suitable for liquid water to exist on the surface, is much closer to the star than the Earth is to the Sun," says Stephane Udry, another member of the ESO team.

Red dwarfs, however, may pose a special challenge to life. According to Udry, "Red dwarfs are known to be subject to stellar eruptions or flares, which may bathe the planet in X-rays or ultraviolet radiation, and which may make life there less likely." In the more technical language of the scientific publication, “The main differences from Earth are a significantly higher mass and a different stellar environment, which potentially can have caused divergent evolutions.”

All the more tantalizing, of course. As one of the ESO scientists puts it, "Now that we know that there are many super-Earths around nearby red dwarfs we need to identify more of them using both HARPS and future instruments. Some of these planets are expected to pass in front of their parent star as they orbit — this will open up the exciting possibility of studying the planet's atmosphere and searching for signs of life," concludes Xavier Delfosse.

The article, "The HARPS search for southern extra-solar planets XXXI. The M-dwarf sample", by Bonfils et al. appears in Astronomy & Astrophysics on March 28.

Thursday, March 22, 2012

Stem Cell Update: Important Research Breakthrough

Another important step has just been taken toward achieving the medical promise of stem cell research. For the first time, researchers at the Max Planck Institute for Molecular Biomedicine in Münster, Germany, have reprogrammed skin cells directly into multipotent stem cells.

Over the past five years, stem cell researchers have learned how to induce or reprogram skin cells to become pluripotent stem cells—cells capable of becoming any type of cell in the body. The result: induced pluripotent stem cells or iPSCs. Scientists have also discovered how to reprogram cells to become precursor or progenitor cells. Precursor cells have a much narrower range of potential for development. They are able to become one very specific type of cell in the body.

Expanding on previous work, the latest breakthrough achieves a kind of “goldilocks” or just-right level. Working with mice, the team led by Hans Schöler discovered how to reprogram fully differentiated skin cells into neural stem cells (NSCs). NSCs are far more suitable for clinical use than pluripotent cells. And now, with this breakthrough, the Max Planck Institute team has learned how to reprogram or induce NSCs, creating iNSCs.

And unlike precursor or progenitor cells, iNSCs are capable of multiplying and differentiating once they are implanted. When researchers implanted their iNSCs into mouse brains, the cells generated new cells that began to take on some of the characteristics of ordinary developing brain cells.

Caption: This is an immunofluorescence microscopy image of the induced neural stem cells (iNSCs) using antibodies against two neural stem cell markers SSEA1 (red color) and Olig2 (green color). Credit: MPI for Molecular Biomedicine

The field of stem cell research has faced many obstacles, some moral and some medical. The main moral objection is that the prime source of human pluripotent cells is the human embryo, and many object to the destruction of the embryo for medical purposes. One of the medical challenges is that implanted cells are likely to be rejected by the immune system, much as transplanted organs are rejected unless immunosuppressant drugs are given.

Unless, of course, the source of the cells is from the patient’s own body. That’s why this achievement is important. If this technique can be applied to human cells—and there’s no reason to think it can’t—then someday it may be possible to take a small sample of cells from a patient’s skin, convert them to iNSCs, and then implant them in the patient’s brain to repair damage from disease or injury.

Not only does the iNSC discovery use the patient’s own cells as the source. It also bypasses the pluripotent stage. That fact should help researchers avoid tumors and other problems.

According to Schöler, "pluripotent stem cells exhibit such a high degree of plasticity that under the wrong circumstances they may form tumours instead of regenerating a tissue or an organ."

"Our research shows that reprogramming somatic cells does not require passing through a pluripotent stage," Schƶler said in a press release issued by the Max Plank Institute. "Thanks to this new approach, tissue regeneration is becoming a more streamlined—and safer—process."

The article, "Direct Reprogramming of Fibroblasts into Neural Stem Cells by Defined Factors," appears in the March 22 issue of Cell Stem Cell.

Monday, March 19, 2012

Space Age View of Stone Age Settlements

Thousands of early human settlements have been located using computers and satellite images, according to a paper published on March 19 in the Proceedings of the National Academy of Sciences.

Researchers used computers to analyze satellite images of a 23,000 square kilometer region in the Upper Khabur Basin of northeastern Syria. The finding? They believe they can identify 14,312 possible sites of human settlements dating back eight thousand years. The region, a relatively small square roughly 100 miles on a side, was nearly 3% occupied at various points over the intervening millennia.
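A quick unit check (illustrative arithmetic, nothing more) confirms the scale of the survey:

    # Quick unit check on the survey area (illustration only).
    area_km2 = 23_000
    area_mi2 = area_km2 * 0.3861   # 1 km^2 is about 0.3861 mi^2
    side_mi = area_mi2 ** 0.5
    print(f"~{area_mi2:.0f} sq mi, ~{side_mi:.0f} miles on a side")  # ~8880, ~94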

Harvard archeologist Jason Ur collaborated with MIT researcher Bjoern Menze to develop a system that identified settlements based on several factors. Old sites tend to leave mounds that show distinctive shapes and colors from the collapse of building materials such as mud bricks.
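As a rough illustration of the general idea (a toy sketch with invented features and thresholds, not the authors' actual method), a detector of this kind scores image patches for mound-like properties and flags the high scorers:

    import numpy as np

    # Toy sketch, not the authors' method: score each satellite-image patch
    # on mound-like properties, then flag high scorers as candidate sites.
    def mound_score(patch: np.ndarray) -> float:
        """Score a grayscale patch (values 0..1) for mound-like features."""
        brightness = patch.mean()                    # settlement soils often image brighter
        center = patch[patch.shape[0] // 2, patch.shape[1] // 2]
        relief = center - float(patch[0, :].mean())  # mounds rise toward the center
        return 0.5 * brightness + 0.5 * max(relief, 0.0)

    def find_candidates(patches, threshold=0.6):
        """Return indices of patches that look like possible settlement mounds."""
        return [i for i, p in enumerate(patches) if mound_score(p) > threshold]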

“With these computer science techniques, however, we can immediately come up with an enormous map which is methodologically very interesting, but which also shows the staggering amount of human occupation over the last 7,000 or 8,000 years,” Ur said in a press release issued by Harvard.

"What's more, anyone who comes back to this area for any future survey would already know where to go," he continued. "There's no need to do this sort of initial reconnaissance to find sites. This allows you to do targeted work, so it maximizes the time we have on the ground."

The article, “Mapping patterns of long-term settlement in Northern Mesopotamia at a large scale,” appears in the March 19 issue of PNAS.

Wednesday, March 14, 2012

Red Deer People? Really?

Did our human family tree just grow a new branch? That seems to be the tentative conclusion reported today in the online journal PLoS ONE.

A first analysis of human remains from two caves in southwest China has prompted researchers to make some astounding claims: These "Red Deer People" are not anatomically modern humans (AMH). Their remains date from 14,500 to 11,500 years ago, far more recent than anything similar ever found on the mainland of Asia. They shared their territory with modern humans just at the time when early agriculture was being developed. And—even more puzzling—they shared anatomical features with modern and archaic humans.

Caption: An artist's reconstruction. Fossils from two caves in southwest China have revealed a previously unknown Stone Age people and give a rare glimpse of a recent stage of human evolution with startling implications for the early peopling of Asia. The fossils are of a people with a highly unusual mix of archaic and modern anatomical features and are the youngest of their kind ever found in mainland East Asia, dated to just 14,500 to 11,500 years old.

Credit: Art copyright by Peter Schouten Usage Restrictions: Image may be used in association with initial news media reports - otherwise seek permission from Peter Schouten: info@studioschouten.com.au

Who were they? The international team of researchers speaks of these early humans as the “Red-Deer People,” named for an extinct species of deer they hunted and for Maludong, or “Red Deer Cave,” where some of the remains were discovered. The team was led by Professor Darren Curnoe of the University of New South Wales and Professor Ji Xueping of the Yunnan Institute of Cultural Relics and Archeology.

But researchers hesitate to draw any conclusions about species. "These new fossils might be of a previously unknown species, one that survived until the very end of the Ice Age around 11,000 years ago," says Professor Curnoe in a press release issued by UNSW. "Alternatively, they might represent a very early and previously unknown migration of modern humans out of Africa, a population who may not have contributed genetically to living people."

Although the remains were first discovered in 1979, they remained encased in rock until 2009. While the researchers have been able to compare anatomical features with modern and archaic human remains, they have not been able to extract DNA from the samples. According to the paper, “our ongoing attempts to extract DNA from a specimen from Maludong have so far proven unsuccessful owing to a lack of recoverable genetic material.”

"The discovery of the red-deer people opens the next chapter in the human evolutionary story – the Asian chapter – and it's a story that's just beginning to be told," says Professor Curnoe.

The paper is entitled "Human Remains from the Pleistocene-Holocene Transition of Southwest China Suggest a Complex Evolutionary History for East Asians" and appears in the March 14, 2012 issue of PLoS ONE.

Wednesday, February 22, 2012

Oldest Art from the New World?

In a cave in Brazil, researchers have found what they claim may be the oldest known art from the Western Hemisphere. Dating from between 9,000 and 12,000 years ago, the simple petroglyph pictures a human-like figure roughly 30 cm tall.

The report of the discovery is published in the February 22, 2012 issue of PLoS ONE, an open access journal. The team was led by Walter Neves of the University of Sao Paulo.

Caption: This is the oldest reliably dated petroglyph ever found in the New World.

Credit: Citation: Neves WA, Araujo AGM, Bernardo DV, Kipnis R, Feathers JK (2012) Rock Art at the Pleistocene/Holocene Boundary in Eastern South America. PLoS ONE 7(2): e32228. doi:10.1371/journal.pone.0032228

Early New World art is rare, and the oldest examples are not nearly as old as art discovered in Europe and Africa, which ranges back 30,000 years or more. But this finding is interesting nonetheless. Odd features of the drawing—is it human, a bird, or a reptile, and does it include an oversize phallus?—are sure to fuel speculation about the religious or shamanistic origins of ancient art. The earliest expressions of symbolic culture in any setting or context provide additional insight into the cultural origins of modern humanity.

"Rock Art at the Pleistocene/Holocene Boundary in Eastern South America" is published in the February 22 issue of PLoS ONE and is available free to the public.

Sunday, February 19, 2012

Single-Atom Transistor: Why Small Is a Big Deal

A tiny achievement with huge significance was reported today by physicists at the University of New South Wales (UNSW). They have created a transistor that uses a single atom. Their work is described in a paper and an editorial published in the February 19 issue of Nature Nanotechnology.

Using a scanning tunneling microscope—the essential tool in nanotechnology that allows researchers to visualize and manipulate single atoms—the UNSW group positioned a phosphorus atom between nano-scale electrodes. A video explaining the feat is available.

Caption: This is a single-atom transistor: 3D perspective scanning tunnelling microscope image of a hydrogenated silicon surface. Phosphorus will incorporate in the red shaded regions selectively desorbed with an STM tip to form electrical leads for a single phosphorus atom patterned precisely in the center. Credit: ARC Centre for Quantum Computation and Communication, at UNSW.

What seems to be most important about this achievement is the accuracy of the placement of the phosphorus atom. This opens the possibility that precisely placed atoms may be used to create a whole new generation of computer chips that are both reliable and smaller than anything used today.

"Our group has proved that it is really possible to position one phosphorus atom in a silicon environment—exactly as we need it –with near-atomic precision, and at the same time register gates," said lead author Dr Martin Fuechsle in a press release from UNSW.

The leader of the research group, Professor Michelle Simmons, claims that "This is the first time anyone has shown control of a single atom in a substrate with this level of precise accuracy." Simmons is director of the ARC Centre for Quantum Computation and Communication at UNSW.

According to the famous “Moore’s Law,” which extrapolates from past achievements in chip design and predicts a doubling in chip power every 18 months, single-atom or quantum computing should be achieved by the year 2020. Fuechsle and Simmons speculate that because of this breakthrough, technology is ahead of schedule.
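The arithmetic behind that extrapolation is simple. A small illustration (assuming the 18-month doubling period and taking 2012 as the baseline; both are simplifications):

    # Illustrative Moore's-law arithmetic (assumes an 18-month doubling
    # period and 2012 as the baseline; both are simplifications).
    months_per_doubling = 18
    start_year, target_year = 2012, 2020

    doublings = (target_year - start_year) * 12 / months_per_doubling
    density_factor = 2 ** doublings
    print(f"{doublings:.1f} doublings, ~{density_factor:.0f}x density by {target_year}")
    # ~5.3 doublings, ~40x density -- the curve that points toward atomic-scale devices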

If so, then arguments advanced by futurists such as Ray Kurzweil take on added significance. As chips grow in power and shrink in size, more and more powerful computing becomes possible. Smaller chips are more implantable, bringing us closer to the day when they are implanted not just for medical but for other purposes (see previous post).

Even more significant is that smaller and more powerful processing paves the way for more highly intelligent machines. Kurzweil predicts that within a few decades, machines with greater than human intelligence will be produced. What then? Will our inventions become the inventors of the future, and will they still need us?

The report, "A Single-Atom Transistor," is published in the February 19 issue of Nature Nanotechnology.

Thursday, February 16, 2012

Human Beings, DNA Nano-Robots, and Implantable Chips

Technological devices inside the human body are fast becoming more fact than fiction, and two reports released on February 16 are significant milestones along that path.

In one study, appearing in Science Translational Medicine, microchips were implanted in women suffering from osteoporosis. Researchers at Harvard Medical School and Case Western Reserve University worked with MicroCHIPS, the manufacturer of the device.

Patients with advanced osteoporosis, whose bones have weakened and lost density, are currently able to give themselves a daily injection of a drug that requires refrigeration. By implanting a device, researchers want to make the process easier and compliance more consistent.

The microchips implanted in the study contain tiny reservoirs of the drug. The device releases a daily dose when it receives a wireless signal. It also monitors the release of the drug and reports back to the physician, who is able to modify the prescription by sending new instructions to the device from another wireless device, such as a smart phone. This is believed to be the first wirelessly controlled implanted drug-delivery device.
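To picture the control loop, here is a hypothetical Python sketch; the class, commands, and telemetry format are all invented for illustration, since MicroCHIPS has not published its protocol in this form.

    # Hypothetical sketch of the control loop described above; the names,
    # commands, and telemetry format are invented for illustration.
    from dataclasses import dataclass, field

    @dataclass
    class ImplantedChip:
        doses_remaining: int = 20
        log: list = field(default_factory=list)

        def receive(self, command: dict) -> None:
            """React to a wireless command, e.g. from a physician's smartphone."""
            if command.get("action") == "release" and self.doses_remaining > 0:
                self.doses_remaining -= 1
                self.log.append({"event": "dose_released",
                                 "remaining": self.doses_remaining})

        def report(self) -> list:
            """Telemetry sent back so the physician can adjust the regimen."""
            return list(self.log)

    chip = ImplantedChip()
    chip.receive({"action": "release"})  # the daily wireless trigger
    print(chip.report())                 # [{'event': 'dose_released', 'remaining': 19}]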

"This trial demonstrates how drug can be delivered through an implantable device that can be monitored and controlled remotely, providing new opportunities to improve treatment for patients and to realize the potential of telemedicine," according to Robert Langer of MIT and the cofounder of MicroCHIPS, Inc. "The convergence of drug delivery and electronic technologies gives physicians a real-time connection to their patient's health, and patients are freed from the daily reminder, or burden, of disease by eliminating the need for regular injections," Langer said in a release issued by the MicroCHIPS.

Caption: The drug-delivery device (right) next to an everyday computer memory stick. Credit: Courtesy of MicroCHIPS, Inc., Massachusetts.

The company also reported that it is currently developing new designs of its microchip-based implant that include as many as 400 doses per device, providing daily dosing for one year, or multi-year therapy for less frequent dosing regimens.

In another study reported today, a team of researchers at the Wyss Institute for Biologically Inspired Engineering at Harvard University reports on its work assembling tiny robots out of DNA.

Building on previous advances in what is popularly known as “DNA origami,” the Wyss team used a computer to fabricate a barrel-like structure capable of containing specific molecules for delivery to targeted cells in the body. For example, cancer cells could be targeted with molecules that cause them to self-destruct, much the way the body’s own immune system carries out its functions.

“DNA origami” allows researchers to use DNA as a construction material. They are able to fold it and weave its strands together. What’s more, since DNA is a chemical code, specific patterns or sequences in the DNA can be used to “read” a signal and “act” accordingly. In this study, researchers built a DNA “latch” or locking mechanism. Their DNA barrel kept its molecular payload safely under wraps until it arrived on the surface of the target cell. On the surface of the target cell is a protein that unlocks the DNA latch, releasing the molecule at just the right location.
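In software terms, the latch behaves like a logic gate. Here is a hedged sketch (the names are invented, and the real lock is a molecular aptamer mechanism, not code; the two-key version shown models the AND-gate behavior the researchers describe):

    # Hedged sketch of logic-gated release; the names are invented, and the
    # real mechanism is molecular (aptamer locks), not software.
    def latch_open(surface_proteins: set, required_keys: set) -> bool:
        """The barrel opens only if the cell displays every required 'key' protein."""
        return required_keys.issubset(surface_proteins)

    required = {"antigen_A", "antigen_B"}          # AND-gate behavior
    healthy_cell = {"antigen_A"}                   # missing one key
    target_cell = {"antigen_A", "antigen_B"}       # displays both keys

    assert not latch_open(healthy_cell, required)  # payload stays sealed
    assert latch_open(target_cell, required)       # payload released on target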

Caption: Cell-targeting DNA nano-robots bearing antibody-fragment payloads. Credit: Image created by Campbell Strong, Shawn Douglas, & Gaël McGill using Molecular Maya & cadnano.

"We can finally integrate sensing and logical computing functions via complex, yet predictable, nanostructures—some of the first hybrids of structural DNA, antibodies, aptamers and metal atomic clusters—aimed at useful, very specific targeting of human cancers and T-cells," said George Church, Ph.D., a Wyss core faculty member and Professor of Genetics at Harvard Medical School, who is Principal Investigator on the project.

One way in which the researchers tested their DNA nano-robots was by programming them to target and destroy cancer cells growing in culture, including leukemia and lymphoma cells. The results were promising. According to the study, “These findings demonstrate that the robots can induce a variety of tunable changes in cell behavior. Furthermore, biologically active payloads may be bound indirectly via interactions with antibody fragments, enabling applications in which the robot carries out a scavenging task before targeted payload delivery.”

The work reported here is built on advances around the world in nanotechnology and synthetic biology. What is new is the way the Wyss team combined several of these advances for the first time. For example, the release mechanism used here responds to the presence of a protein, not just to the presence of DNA or RNA. That feature alone makes this work more immediately applicable for medical purposes.

Put together, these two reports are part of a far wider panorama of basic advances in biomedical research. They stand out in part because of what they promise in terms of future treatment strategies. But more than that, they catch our attention because advances like these continue to blur the lines between ourselves and our technology.

In the first case—a wireless implanted drug-delivering chip—we are not simply injecting a medication or implanting a device. The patients in this study are hosts to a high-tech subsystem implanted within them that interacts in sophisticated ways with another human being (their physician). What’s more, that other human being—even if half a world away attending a medical conference—can send instructions that immediately cause an effect within the body (but perhaps without the knowledge) of the patient. Surely there’s a spy story here just waiting to be written. More than that, this seems to be another significant milestone on the way to the (post?-) human future.

In the second case (the DNA nanoscale robot), nothing is yet implanted, but that’s clearly a next step. What are we to make of this elegant piece of tiny engineering? It is so small that it can only be made using computers. It is built from the same sort of DNA that we have in every cell, but it's engineered to hold a desired shape and to respond to a specific signal. Then, if inserted in great numbers into the human body, it can emulate the human immune system but take it in directions far beyond evolution.

The report on the implantable chip is entitled "First-in-Human Testing of a Wirelessly Controlled Drug Delivery Microchip" and appears in the February 16 issue of Science Translational Medicine. The report on DNA robots, "A Logic-Gated Nanorobot for Targeted Transport of Molecular Payloads," appears in the February 17 issue of the journal Science. Both journals are publications of the American Association for the Advancement of Science.