Wednesday, December 4, 2013

The Surprising Story of 400,000 Year Old Human DNA

Researchers have just announced a major advance in their quest to recover DNA from ancient humans. Bones 400,000 years old contain badly damaged DNA, but experts in Leipzig, Germany, have developed new techniques to extract and piece together tiny fragments until they can read at least a small portion of the genes carried by the ancient humans who once lived in northern Spain.

Caption: The Sima de los Huesos hominins lived approximately 400,000 years ago during the Middle Pleistocene. Credit: Javier Trueba, Madrid Scientific Films.

A team led by Matthias Meyer at the Max Planck Institute for Evolutionary Anthropology in Leipzig worked together with a Spanish team of paleontologists led by Juan-Luis Arsuaga to extract tiny amounts of bone from fossil remains found at Sima de los Huesos, northern Spain’s famous “bone pit.”  This site has been excavated for more than two decades.  It has yielded at least 28 skeletons, usually classified as Homo heidelbergensis, a form of humans seen as the ancestors of the Neandertals.
 

But here is where this study broke new ground.  It turns out that the Sima de los Huesos humans were more closely related to the recently discovered Denisovans than to the Neandertals.  "The fact that the mtDNA of the Sima de los Huesos hominin shares a common ancestor with Denisovan rather than Neandertal mtDNAs is unexpected since its skeletal remains carry Neandertal-derived features," Meyer said in a press release provided by the journal Nature, which carries the report in its 4 December 2013 issue. 


What makes this finding all the more intriguing is that the Denisovans were completely unknown to us until 2010, when the Leipzig team “discovered” them by reconstructing their DNA and comparing it to Neandertals and today’s humans.  Through a spectacular technological achievement, Leipzig researchers discovered that these Denisovans lived as a distinct population some tens of thousands of years ago, when they interbred with other humans. 

"This unexpected result points to a complex pattern of evolution in the origin of Neandertals and modern humans. I hope that more research will help clarify the genetic relationships of the hominins from Sima de los Huesos to Neandertals and Denisovans" says Arsuaga. 


Caption: This is a skeleton of a Homo heidelbergensis from Sima de los Huesos, a unique cave site in Northern Spain. Credit: Javier Trueba, Madrid Scientific Films.

 

According to the most recent discovery, the Sima de los Huesos hominins seem to have shared a common ancestor with the Denisovans some 700,000 years ago.  The idea that they are more closely related to Denisovans than to Neandertals suggests that these mysterious Denisovans, totally unknown just four years ago, may have played a far bigger role in the story of human origins than ever imagined. 



It is important to point out that so far, researchers have reconstructed only the mitochondrial DNA. And even there, the work is not complete. Whether they will succeed in reconstructing the far more daunting nuclear genome of Homo heidelbergensis remains to be seen. But if past experience is any predictor, we might look for advances not just here but in other human remains from hundreds of thousands of years ago. Each technical achievement may fill in a page of our past, perhaps even re-write whole chapters. When it comes to human origins, we should expect more surprises. 

Putting this most recent news in a larger context, Svante Pääbo, the director of the Leipzig research, said this in the Nature press release: "Our results show that we can now study DNA from human ancestors that are hundreds of thousands of years old. This opens prospects to study the genes of the ancestors of Neandertals and Denisovans. It is tremendously exciting."

The article, “A mitochondrial genome sequence of a hominin from Sima de los Huesos,” appears in the 4 December 2013 issue of the journal Nature. 


Thursday, October 17, 2013

What a Small Brain Can Tell Us

New information about an early human skull sheds more light on the very first members of the human genus. The skull, found at Dmanisi, Georgia, in 2005, has now been freed from the stone casing that preserved it for the past 1.8 million years. An international team led by David Lordkipanidze of the Georgian National Museum reports its findings in the October 18, 2013 issue of the journal Science. 

Photo Caption: The Dmanisi D4500 early Homo cranium in situ. Photo courtesy of Georgian National Museum.

When the world first learned of early human remains in Georgia, the news came as a bit of a shock. These early humans seemed quite similar to others whose remains had been found in Africa and dated to the same time. That suggests they were able to travel and adapt to new settings. 

The latest analysis contains a new surprise. The skull described in the new report has an unexpectedly small brain, at or below the range usually seen as the minimum for our genus. At 546 cubic centimeters, its small brain widens our view of the variability of humans at this time. 

Does this skull, identified as Skull 5 from Dmanisi, really measure up to being in the genus Homo at all? Is it something else, like Australopithecus? The researchers argue that it is clearly part of the genus Homo for the simple reason that Skull 5 was found with other, larger-brained skulls, all clearly part of the same community. One Georgian brain was as large as 730 cc. What this suggests is that Skull 5 is part of Homo but that our definition of Homo should be broadened. 

In fact, all this diversity at one site provides support for one side in an ongoing debate. Are species defined broadly in terms of variability, or does small to moderate variation indicate separate species? This finding supports the view that, at least among early humans, a species can be quite variable. 

Not too long ago, Lordkipanidze and his team took the opposite view. They believed that these early humans from Georgia were a distinct species, which they called Homo georgicus. The new paper retracts that claim, saying that the new evidence of variation in Georgia means that these fossils fit within the widened range of variability of Homo erectus, a globally dispersed species. More precisely, they see the Georgian samples as best classified as Homo erectus ergaster georgicus, part of the species Homo erectus but distinct because of modifications over time and because of location. 

Commenting on the variation in the skulls found almost literally on top of each other at Dmanisi, co-author Christoph Zollikofer notes that the skulls “look quite different from one another, so it's tempting to publish them as different species.  Yet we know that these individuals came from the same location and the same geological time, so they could, in principle, represent a single population of a single species,” Zollikofer said in a press release issued by the journal Science. 

The key claim advanced in the article, however, is that these samples from Georgia and Africa, together with other samples from Asia, are all part of one global species.  The report describes them as Homo erectus, seen as “a single but polymorphic lineage.” 

The diversity found in Georgia also suggests that the number of individuals in that region may have been larger than first thought, possibly numbering 10,000 or so.  And the small size of Skull 5’s brain suggests that they traveled all this way before brains began to expand.

The report, “A Complete Skull from Dmanisi, Georgia, and the Evolutionary Biology of Early Homo," is published in the 18 October 2013 issue of the journal Science, published by the American Association for the Advancement of Science.    

Monday, August 12, 2013

Is Neandertal Technology Still in Use Today?

Those primitive Neandertals may not have been so primitive after all.  Some 50,000 years ago, they were using a highly crafted bone tool virtually identical to a tool in use by human leather-workers today.

The tool, called a lissoir, was made by Neandertals living in southwestern France long before the arrival of the people we like to call “anatomically modern humans.”  The discovery, reported in the August 16, 2013 online issue of PNAS, is sure to fuel the debate over the cultural sophistication of the Neandertals.

Caption: Four views of the most complete lissoir found during excavations at the Neandertal site of Abri Peyrony.  Credit: Image courtesy of the Abri Peyrony and Pech-de-l’Azé I Projects.

Ever since their discovery over 150 years ago, Neandertals have been seen as “cavemen,” primitive in every respect compared to us “modern” humans who replaced them.


But in recent decades, the cultural achievements of Neandertals have come to be recognized. Even so, the debate continues. Did they learn more advanced technology from the modern human invaders of Europe and Asia, or did they develop it on their own? The new findings lend support to the view that Neandertals were able to create and invent on their own. 

Neandertals were very likely the first to use sophisticated bone tools in Europe. The tool found in France was made from the rib of a red deer or possibly a reindeer. Making it required breaking, grinding, and polishing. It shows evidence of being used to work leather, much like similar tools today. When rubbed against an animal hide, it makes the leather soft, shiny, and more water resistant.

"For now the bone tools from these two sites are one of the better pieces of evidence we have for Neandertals developing on their own a technology previously associated only with modern humans", explains Shannon McPherron of the Max Planck Institute for Evolutionary Anthropology in Leipzig according to a press release from the Institute. 

Tools like this first appear in Africa much earlier. But this new finding raises intriguing questions. Did “modern” humans bring this technology from Africa and pass it to Neandertals prior to 50,000 years ago? Was there a technology transfer around the same time as modern/Neandertal interbreeding? Or did Neandertals invent this technology on their own and transfer it to the modern humans who began to arrive in Europe around 40,000 years ago? 

"If Neandertals developed this type of bone tool on their own, it is possible that modern humans then acquired this technology from Neandertals. Modern humans seem to have entered Europe with pointed bone tools only, and soon after started to make lissoirs. This is the first possible evidence for transmission from Neandertals to our direct ancestors," says Marie Soressi of Leiden University in The Netherlands, part of the team of researchers who made this discovery.

"Lissoirs like these are a great tool for working leather, so much so that 50 thousand years after Neandertals made these, I was able to purchase a new one on the Internet from a site selling tools for traditional crafts," says Soressi. "It shows that this tool was so efficient that it had been maintained through time with almost no change. It might be one or perhaps even the only heritage from Neandertal times that our society is still using today."

Neandertals at this time were making sophisticated stone tools. But these tools were made of bone because bone is more adaptable for certain uses. According to McPherron, "here we have an example of Neandertals taking advantage of the pliability and flexibility of bone to shape it in new ways to do things stone could not do."

The deeper question that lies behind this research is whether “modern humans” burst on the scene suddenly as a unique phenomenon of evolution, or whether the process of becoming human is more gradual and more widely distributed than we once thought.  

The research reported here was conducted by teams from Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, and the University of Leiden in the Netherlands.  The article, entitled “Neandertals made the first specialized bone tools in Europe,” appears in the August 16, 2013 online edition of the Proceedings of the National Academy of Sciences.
 




Thursday, July 25, 2013

Rapamycin: Extended Lifespan, Extended Decline?


Ever since 2009, it has been known that the drug rapamycin extends the lifespan of mice.  The journal Science identified this discovery as one of the top 10 research breakthroughs for that year.  The news was all the more exciting because rapamycin already has FDA approval for other uses.

So researchers want to know just how rapamycin extends the lifespan.  Does it actually slow the entire aging process?  Or does it just slow down certain diseases, such as cancer?  

New research testing the effects of rapamycin on mice suggests that the drug probably does not slow the aging process itself.  It does slow the development of cancer and a few other diseases.  But rapamycin is no fountain of youth.  In fact, if it were used just by itself to extend the lifespan of human beings, it might merely draw out the aging process.  In other words, it might extend the lifespan but not extend the healthspan.

Photo: Public domain through Wikimedia. Thanks to Rama.

The research was conducted by a team led by Dan Ehninger at the German Center for Neurodegenerative Diseases. It is published in the August 2013 issue of The Journal of Clinical Investigation, which is freely available online. In addition to the research article, the journal is publishing an expert commentary that warns about any drug that brings an increase in lifespan that “is accompanied by more disability and disease and a greater loss of physiological functions, i.e., a reduced quality of life.”  By itself, rapamycin could do just that.

On the bright side, the new study shows even more conclusively that rapamycin extends the lifespan of mice by the equivalent of almost a decade of human life.  It also provides a small benefit for cognitive function.  So despite the mixed results, the journal commentary advocates clinical trials involving human patients, perhaps those with dementia.  According to the journal article, the research supports “the feasibility of clinical trials to study the efficacy of rapamycin in treating diseases of the elderly, especially those that are debilitating and for which no current treatment is known, such as Alzheimer’s disease and other neurodegenerative diseases.”

Advocates of anti-aging research will see this new study as something of a setback, but it is not likely to slow down basic work in the field. Opponents of anti-aging research are likely to renew their warnings about the prospect of more years of declining health. On their view, any effort to enhance our humanity, whether by increasing cognitive ability or extending the lifespan, is always accompanied by a downside, by side effects so costly that true enhancement is impossible. The warning is serious, but advocates of human enhancement are not likely to be convinced.   

The research article is entitled “Rapamycin Extends Murine Lifespan but Has Limited Effects on Aging.”  The commentary is entitled “Rapamycin, Anti-aging, and Avoiding the Fate of Tithonus.”  Both are available free to the public in the August 2013 issue of The Journal of Clinical Investigation.



Thursday, July 18, 2013

Did Neandertals Wear Ornaments?


A small but tantalizing find provides further evidence for Neandertal culture.  Working in the foothills of the Alps just north of Venice, Italy, researchers have discovered and analyzed a small marine shell that originally came from about 60 miles away.  It was thinly coated with a dark red substance that turns out to be pure hematite and was most likely used as a pigment.  One possibility is that the shell was used as an ornament.

The paper, freely available online in the journal PLoS One, dates the shell’s pigmentation to a period just before 45,000 years ago, right before the arrival of so-called “modern” humans in Europe. 

Photo Caption: A shell possibly "painted" by Neandertals about 45,000 years ago.  Photo available from PLoS One.

According to the paper, “deliberate transport and coloring of an exotic object, and perhaps its use as pendant, was a component of Neandertal symbolic culture, well before the earliest appearance of the anatomically modern humans in Europe.”

Quoting more of the paper, “this discovery adds to the ever-increasing evidence that Neandertals had symbolic items as part of their culture.”

Debates about Neandertal culture have intensified recently, in part because of genetic evidence of interbreeding between Neandertals and the modern humans coming into Asia and Europe.  While these modern humans began their migration out of Africa about 80,000 years ago and probably interbred around 55,000 years ago, they did not reach Europe until more like 40,000 years ago.  If all these dates hold up in future research, this shell does provide a small but intriguing hint about the culture of Neandertals at just about the time of their encounter with “modern” humans. 

So who exactly is modern? The differences between ourselves (the humans we like to call “modern”) and the Neandertals are not as great as we once imagined. The paper ends with these words: “Future discoveries will only add to our appreciation of Neandertals shared capacities with us.”

The paper, entitled "An Ochered Fossil Marine Shell From the Mousterian of Fumane Cave, Italy," appears in the current issue of PLoS One and is freely available online.

Thursday, July 4, 2013

The Rise of Agriculture: New Findings, Added Complexity

In the grand story of human origins, the invention of agriculture is one of the most pivotal chapters. It is generally agreed that farming first arose in the Fertile Crescent about 12,000 years ago. But did it arise at one end of the Crescent and spread to the other? Or did it arise independently in various locations across the entire region, from modern Israel to modern Iran? 

Photo caption: Hordeum spontaneum, wild barley from Chogha Golan, Iran. [Image courtesy of TISARP]

New research suggests that agriculture arose independently at various locations. While the newly developed agricultural techniques and selected grains probably spread quickly, newly published evidence suggests that the inventive process itself was widespread.  The research, conducted by Simone Riehl from the University of Tübingen in Germany along with colleagues from the Tübingen Senckenberg Center for Human Evolution and Paleoecology, is published in the July 5, 2013 issue of the journal Science.

A key debate in human evolution is whether momentous changes such as agriculture occur in big, rapid, and isolated bursts, or whether such grand changes are the cumulative result of smaller changes widely distributed over vast areas and long periods of time.  This new evidence seems to support the view that changes are distributed and cumulative rather than rapid.

Field work at Chogha Golan, Iran, led Riehl’s team to the discovery of wild, progenitor versions of barley, lentil, and wheat. At the same site, early domesticated forms of these same plants were found, suggesting that domestication occurred onsite. Domesticated plants and animals form the core of agriculture and the economic basis for the rise of human cities and civilization.  

Tools and figurines were also found, dating from 12,000 to around 9,800 years before the present. The rise of agriculture in this region during this period set the stage for the growth of human population, the development of cities, and the rise of ever-more complex cultures.

The article is entitled "Emergence of Agriculture in the Foothills of the Zagros Mountains of Iran."  It appears in the 5 July 2013 issue of the journal Science.  

Monday, June 3, 2013

We Are What We Ate: Diet and Human Evolution

At a key moment in human evolution, our diet expanded and became more diverse, setting the stage for humans to draw on a wider range of food sources to feed expanding brains.

Four academic papers published together in the June 3, 2013 issue of the Proceedings of the National Academy of Sciences report on new methods of studying the carbon found in ancient teeth, going back more than 4 million years. Ancestors living then ate pretty much what apes eat today, a diet of mostly leaves and fruits. Then, about 3.5 million years ago, a major shift occurred. 
Caption: This is an artist's representation of Paranthropus in southern Africa more than 1 million years ago. Credit: Illustration courtesy of ArchaeologyInfo.com/Scott Bjelland.
  
The old food sources remained in use, but new sources were added. Researchers came to this conclusion by analyzing the carbon isotopes still present in ancient teeth. After examining 175 specimens from 11 different species, they concluded that a key shift occurred about 3.5 million years ago. At that point, at least some of our ancestors were supplementing the usual foods by turning to grasses or sedges, or to the animals that graze on them. These ancestors, including Australopithecus afarensis (best known from the famous “Lucy”), became more diverse in their food sources.

The earliest known evidence suggests that at about this same time, our human ancestors were making tools and using them to butcher large animals for food.  If these animals ate grasses, the carbon would have entered the human diet that way.  Another possibility is that human ancestors were simply learning to identify other types of plants as food sources compatible with human metabolism.

The main point, however, is that at this critical 3.5-million-year transition, human ancestors were becoming more variable in their diet and their behavior. Rather than being locked into one type of food source or one way of pursuing food, they were able to exploit more sources of food, nourish even bigger brains, travel and thrive in new niches, and survive climate change cycles, particularly ancient African cycles of wet and dry periods. 

"We don't know exactly what happened," said Matt Sponheimer of Colorado University and one of the researchers. "But we do know that after about 3.5 million years ago, some of these hominids started to eat things that they did not eat before, and it is quite possible that these changes in diet were an important step in becoming human."

If becoming more varied and adaptable is the same as becoming more human, then this study provides an important insight into this process. One of the papers (Wynn et al.) concludes with this sentence: “This dietary flexibility implies unique landscape use patterns and malleable foraging behavior within a narrow time frame of a single species.”  In other words, they were able to adjust quickly, seizing new opportunities and adapting to environmental changes. 



 

Thursday, May 16, 2013

Deep Brain Cognitive Enhancement: The Latest News

The search for new methods of cognitive enhancement has just reached new depths.  Researchers in Austria and the UK report exciting new evidence that a form of noninvasive deep brain stimulation enhances the brain’s ability to do arithmetic. 

"With just five days of cognitive training and noninvasive, painless brain stimulation, we were able to bring about long-lasting improvements in cognitive and brain functions," says Roi Cohen Kadosh of the University of Oxford and lead author of the report that appears in the May 16, 2013 issue of Current Biology.  His comments were provided by the journal.

Photo Credit: Photo by Ad Meskens of an original oil painting by Laurent de La Hyre (French, 1606-1656). The title of the painting is Allegory of Arithmetic (Allegorie van de rekenkunde) and it dates to about 1650. The original painting is in the Walters Art Museum, Baltimore, Maryland. It was photographed on 18 July 2007 by Ad Meskens, who has made it freely available with proper credit.

In this study, the team used a form of noninvasive deep brain stimulation known as “transcranial random noise stimulation” or TRNS.  The TRNS input was combined with more traditional math training and drills.  Twenty-five young adults, males and females, were divided into two groups, one receiving math training with the TRNS and the other receiving math training combined with a “sham” version of TRNS, a kind of placebo. 

Not only did those who received TRNS do well immediately, but the benefits lasted for at least six months.  In addition, brain monitors detected different brain activity for those receiving TRNS.  This suggests that TRNS modifies brain function.

According to Cohen Kadosh, "If we can enhance mathematics, therefore, there is a good chance that we will be able to enhance simpler cognitive functions."

In the paper’s conclusion, the authors state that TRNS “can enhance learning with respect to high-level cognitive functions, namely algorithmic manipulation and factual recall in mental arithmetic. When this learning is based on deep-level cognitive processing, as is the case for calculation arithmetic, such enhancements are extremely long-lived both behaviorally and physiologically.”

Then they sum up with these words:
Both the behavioral and physiological changes displayed extreme longevity, spanning a period of 6 months, but only when learning involved deep-level cognitive processing. By its demonstration of such longevity and, for the calculation task, generalization to new, unlearned material, the present study highlights TRNS as a promising tool for enhancing high-level cognition and facilitating learning. These findings have significant scientific and translational implications for cognitive enhancement in both healthy individuals and patients suffering from disorders characterized by arithmetic deficits.

The paper, Snowball et al., "Long-Term Enhancement of Brain Function and Cognition Using Cognitive Training and Brain Stimulation," appears in the May 16, 2013 issue of Current Biology.

Wednesday, May 15, 2013

Stem Cell Advance and Cloning Debates

An important breakthrough in stem cell medical research is likely to re-ignite an ethical debate about human cloning. 

Researchers at the Oregon Health & Science University reported on May 15 that they have succeeded for the first time in human “somatic cell nuclear transfer” or SCNT, a process that the public often refers to simply as cloning.  Oregon researchers were able to transfer the nucleus from one human cell into a donated human egg from which the nucleus had been removed, essentially the same process that led to the creation of Dolly the sheep more than fifteen years ago. 

Caption: The first step in SCNT is enucleation, the removal of nuclear genetic material (chromosomes) from a human egg. The egg is positioned with a holding pipette (on the left) and its chromosomes are visualized under a polarized microscope. A hole is made in the egg's shell (zona pellucida) using a laser, and a smaller pipette (on the right) is inserted through the opening. The chromosomes are then sucked inside the pipette and slowly removed from the egg. Credit: Cell, Tachibana et al.
 
While other teams have achieved nuclear transfer with human cells, none has been able to produce an embryo that developed long enough to yield human embryonic stem cells.  The Oregon team, led by Shoukhrat Mitalipov, achieved this long-desired goal.
 
Even more remarkably, the Oregon team achieved SCNT repeatedly and with high efficiency, in one case producing a stem cell line for every two donated eggs. Along with other researchers around the world, Mitalipov’s team has discovered many ways to fine-tune the nuclear transfer process, making it far more refined technically than when Ian Wilmut’s team first created Dolly. 

But just as we learned from Dolly, any major technical advance in somatic cell nuclear transfer is likely to trigger public controversy about cloning and about the social impact of science.  While nearly everyone applauds the goal of the Oregon research—better understanding and treatment of disease—not everyone will like the way they went about their work.

For one thing, the result of successful nuclear transfer is a kind of embryo. Mitalipov’s paper, published in the June 6, 2013 issue of the journal Cell (online on May 15), repeatedly refers to this new entity as the “SCNT embryo.”  Is an “SCNT embryo” a “real” embryo?  If an embryo is the result of fertilization, then of course an “SCNT embryo” is not a normal or real embryo.  But if an embryo is defined by its potential to develop, then an SCNT embryo probably is very close to a normal or real embryo, biologically at least.

Suppose we accept that a SCNT embryo is real enough to warrant the same protection as embryos created by IVF.  Is it legitimate to create such an embryo for the express purpose of research that will destroy this SCNT embryo?  Many people object to this, and major religious institutions such as the Catholic Church have been unambiguous in their denunciation of this research. 

On the other hand, a few religious groups have specifically endorsed this research.  One of the clearest statements of support is entitled “Cloning Research, Jewish Tradition and Public Policy.”  The statement, published in 2002, speaks for all major groups within American Judaism: 
Moreover, our tradition states that an embryo in vitro does not enjoy the full status of human-hood and its attendant protections.  Thus, if cloning technology research advances our ability to heal humans with greater success, it ought to be pursued since it does not require or encourage the destruction of life in the process.
Another statement in support comes from a study committee in the United Church of Christ, which released this statement in 1997: 
...we on the United Church of Christ Committee on Genetics do not object categorically to human pre-embryo research, including research that produces and studies cloned human pre-embryos through the 14th day of fetal development.

For more religious statements on embryo research, see God and the Embryo, especially the appendices. 

I personally agree with the statements quoted above.  So I support the research performed in Oregon.  But I have to admit that among people with religious commitments, I am in a minority.  As much as I wish it were otherwise, I expect that many will object to the idea that Mitalipov’s group has created and destroyed embryos for research.

Some will argue that the technology of induced pluripotent stem cells (iPSCs) makes the use of embryos unnecessary.  While it is true that iPSC technology is a remarkable and promising advance, so far the field has run into unexpected technical complications in its quest to produce pluripotent stem cells that function like cells from embryonic sources.  A great attraction of iPSCs—beyond the fact that no embryos are involved—is that they are a genetic match to the donor.  What the Oregon breakthrough provides is the best of both: embryonic quality in donor-specific cells. 

Others will object because they don’t like human cloning, understood now as the use of SCNT to produce a child. They will see the Oregon breakthrough as ushering in the era of human reproductive cloning, and they will see this as reason enough to ban any further advances in SCNT technology.

Far more sensible, I think, would be a moratorium on human reproductive cloning.  What the Oregon group has achieved does make it more likely that someone somewhere might try to offer cloning as a reproductive technology.  The problem is that using Mitalipov’s techniques, they might succeed in creating an embryo that survives but that is beset by many unforeseeable health problems.

If we have learned anything in the past fifteen years, it is that SCNT is a tricky and complex process. Just because Mitalipov’s team learned how to create an SCNT embryo that is healthy and viable through the blastocyst stage does not mean that anyone knows how to create an SCNT child. Too many things could go wrong, and only now are we beginning to get some idea of how these potential problems might arise.

Someday, many decades in the future, we may understand these problems so well that we can solve them technically. If that day ever comes, then those who come after us will have to ask: is a cloned child a good idea? Right now we do not even have to ask that question, because an SCNT child is an unsafe idea. 

The press release from The Oregon Health and Science University that announces this advance makes this claim:
One important distinction is that while the method might be considered a technique for cloning stem cells, commonly called therapeutic cloning, the same method would not likely be successful in producing human clones otherwise known as reproductive cloning. Several years of monkey studies that utilize somatic cell nuclear transfer have never successfully produced monkey clones. It is expected that this is also the case with humans. Furthermore, the comparative fragility of human cells as noted during this study, is a significant factor that would likely prevent the development of clones.

The Oregon release then quotes Mitalipov:
"Our research is directed toward generating stem cells for use in future treatments to combat disease," added Dr. Mitalipov. "While nuclear transfer breakthroughs often lead to a public discussion about the ethics of human cloning, this is not our focus, nor do we believe our findings might be used by others to advance the possibility of human reproductive cloning."
The article is entitled "Human Embryonic Stem Cells Derived by Somatic Cell Nuclear Transfer" and appears in the June 6, 2013 issue of the journal Cell, published online May 15.



Thursday, April 11, 2013

Lights and Brains: Injectable LEDs Interact with Brain Cells

The quest to put computers in the brain has just come a step closer.  Tiny LED lights have been implanted deep in the brains of rodents.  The LEDs themselves are the size of individual neurons.  They are packaged with other tiny sensors into an ultrathin, flexible device.  The whole device is small enough to be implanted using a needle that positions the device at precise sites deep in the brain. 

Once implanted, the device communicates directly with the brain at the level of cells. It communicates wirelessly with a module mounted above the rodent’s head, one small enough not to interfere with activity and removable when not in use. The device itself is completely contained within the brain where it was implanted, without any damage to surrounding cells. Signals sent through the device stimulate genetically modified brain cells, signaling for example for the release of neurotransmitters such as dopamine. 


Photo Credit: MicroLED device next to a human finger.  Image courtesy of University of Illinois-Urbana Champaign and Washington University-St. Louis.
 
"These materials and device structures open up new ways to integrate semiconductor components directly into the brain," said team co-leader John A. Rogers according to a press release from the University of Illinois.  "More generally, the ideas establish a paradigm for delivering sophisticated forms of electronics into the body: ultra-miniaturized devices that are injected into and provide direct interaction with the depths of the tissue."

The device itself is a feat of engineering requiring the effort of an international team based in China, Korea, and at multiple centers across the US.  By miniaturizing the device to the cellular scale and by creating a totally wireless interface, researchers overcame several challenges at once.  For example, larger implantable devices always run the risk of creating scars or lesions in the brain, which may cause serious problems.   "One of the big issues with implanting something into the brain is the potential damage it can cause," team co-leader Michael Bruchas said. "These devices are specifically designed to minimize those problems, and they are much more effective than traditional approaches."

In addition, because this device communicates and receives its power wirelessly, there are no wires or optical fibers passing from the brain to the outside world. Previous devices were larger and inflexible. They could be implanted only on the surface of brain structures, but this new device can be implanted deep within those structures and is able to interact with units as small as a single cell.

Along with the LED lights, the device includes temperature and light sensors, microscale heaters, and electrodes that can stimulate and receive brain electrical activity.  Power to the device is provided wirelessly through a radio frequency system. 

It is impossible to predict the future of efforts to connect brains and computers, but this work obviously represents a significant advance toward that end. "These cellular-scale, injectable devices represent frontier technologies with potentially broad implications," Rogers said. Being able to monitor and trigger the brains of living animals at the cellular level is likely to become a profoundly valuable tool for research. Medical research is also likely to be affected, not just in responding to patients with paralysis but also in research and perhaps even therapy for other diseases involving the brain or other organs, where these devices can also be implanted. 

Some, of course, will speculate about even wider implications of this technology. Will it open the way to controlling people by controlling their brains? Perhaps. Will it open the way for our brains to communicate with computers and the internet? There is little doubt that this step will inspire more work along those lines. 

This article is entitled "Injectable, Cellular-Scale Optoelectronics with Applications for Wireless Optogenetics" and is published in the April 12, 2013 issue of the journal Science, a publication of the American Association for the Advancement of Science. 

The Two Million Year Question

Careful studies of 2-million-year-old human-like fossils, just published in the April 12, 2013 issue of Science, raise more questions than they answer.

These papers provide highly detailed information about the teeth, rib cage, hands, and feet of this strange relative, known to scientists as Australopithecus sediba. But we still do not know the answer to the biggest question of all. How does sediba fit in the human family tree? Is sediba a direct human ancestor? If not, why is it so similar to us in some respects?

Photo Credit: The reconstructed skull and mandible of Australopithecus sediba. Reconstruction by Peter Schmid, photo by Lee R. Berger. Image courtesy of Lee R. Berger and the University of the Witwatersrand.

The teeth are mostly like those of Australopithecus africanus but also quite a bit like the earliest examples of the genus Homo.  That is surprising.  For some experts, it calls into question the standard view that Homo evolved from Australopithecus afarensis, most commonly known as “Lucy.” 

The new analysis suggests an evolutionary pathway from africanus to sediba to Homo.  In that case, Lucy is a relative but not an ancestor.  Sediba is. 

Not so fast, others insist. The first examples of Homo may go back to 2.3 million years ago, long before sediba appears at just under two million years ago. Lucy and her afarensis kin lived much earlier, early enough to be ancestral to Homo. 

Based on what we know now, the debate will continue because the facts just do not line up neatly or offer a simple story.  "Our study provides further evidence that sediba is indeed a very close relative of early humans, but we can't definitively determine its position relative to africanus,” study co-author Debbie Guatelli-Steinberg said according to a release from Ohio State University.

What these studies do provide is a remarkably complete picture of what early human-like ancestors looked like. They also provide another surprise. Despite having a foot with a narrow heel, similar to chimpanzees, sediba definitely walked upright, maybe even using a somewhat awkward gait never before known to scientists. They were clearly not knuckle-walkers, like the apes, but they were not nearly as graceful as the humans who followed. It seems they walked upright differently.  

For now, what all this suggests is that the story of our deep ancestry is more complex than we usually imagine.  Straight ancestral lines are hard to draw.  More finds may help sort things out.  But they may also add new complexity.  The way it looks, multiple forms of early human life may have existed at once.  They differed slightly from each other and also in the degree to which they resemble us.  That makes it very hard to sort out the lineages.  

Is sediba a direct human ancestor? Yes, at least according to Lee Berger, who discovered sediba in a pit in northern South Africa in 2008. Most experts, however, argue no, mainly because the dates are out of line. What difference does it make? Perhaps the biggest significance of this debate is to show us that the more we know, the more we see a complex picture of multiple species and perhaps interweaving lineages, making it all the more remarkable that we are here at all. 

This research is published as a set of six research reports in the April 12, 2013 issue of the journal Science, a publication of the American Association for the Advancement of Science. 

Friday, March 22, 2013

"Three-parent babies" and the Human Germline Modification Debate

Human germline modification is back in the news. The current round of public conversation was launched in the UK by the Human Fertilisation and Embryology Authority (HFEA). In the past few days, the media and the blogosphere have lit up with an intensifying debate. 

What HFEA wants people to consider is whether it is acceptable to use in vitro fertilization to try to avoid a specific category of genetic disease. Is it OK to help couples at risk for mitochondrial disorders by supplying donor mitochondria to the new embryo? If mom’s own mitochondrial DNA will lead to a disease, is it OK to add mitochondria from an outside donor?

PHOTO: Transmission electron microscope image of a thin section cut through an area of mammalian lung tissue. The high-magnification image shows a mitochondrion. Source: Wikimedia. Credit: Louisa Howard, PhD. This work has been released into the public domain by its author.

Many refer to this as the “three-parent baby.” And for that reason alone, they object.

Others raise the stakes in the argument. They insist that the “three-parent baby” is just the tip of the looming germline modification iceberg. What’s really coming, they claim, is the era of “designer babies,” enhanced or improved versions of ourselves, a new form of high-tech eugenics.  And with that comes more mischief.

Consider what Stuart Newman (New York Medical College) had to say in his comment in The Huffington Post. Newman starts by asking whether the procedure is really as safe as it seems. Fair question. But then Newman writes that what is really going on here is “a new form of eugenics, the improvement of humans by deliberately choosing their inherited traits.” And then, a few short paragraphs later, he’s off to the Nazis, forced sterilization, and the Nuremberg Code.

Now it may be true that the “three-parent baby” is a pretty bad idea medically. But morally, is it really the fast-track to Nazi medicine?

Or consider Marcy Darnovsky’s comments in a press release from the Center for Genetics and Society:
“Changing the genes we pass on to our children is a bright ethical line that should not be crossed,” said Marcy Darnovsky, PhD, the Center's executive director. “It has been observed by scientists around the world, adopted as law by more than 40 countries, and incorporated in several international treaties. It would be wrong for the UK to disregard this global bioethical consensus, especially when there are safe alternatives available for the very few people who would be candidates for the procedures.”
The release concludes: “The Center for Genetics and Society calls for a domestic and international moratorium on approval of any procedures involving inheritable human genetic modification…”
Or consider the comment of David King of Human Genetics Alert as quoted by the BBC: 
Dr David King, the director of Human Genetics Alert, said: "Historians of the future will point to this as the moment when technocrats crossed the crucial line, the decision that led inexorably to the disaster of genetically engineered babies and consumer eugenics."
Is the “three-parent baby” really “crossing the germline barrier”? Back in 2001, when the first “three-parent babies” were being created here in the US, Erik Parens and Eric Juengst wrote a response in the journal Science. They called it “Inadvertently Crossing the Germline.” Ever since then, many have agreed. Despite some really important distinctions, mitochondrial replacement is a kind of human germline modification.

A bit of a stretch, but OK, let’s call it that. But is that reason enough to condemn it? Is human germline modification itself morally wrong? It may be biomedically impossible. It may be excessively expensive considering all the other needs facing the world’s children. But is it intrinsically wrong? 
 
In 2008, I published an edited book that tried to take the temperature of religious opinions on the morality of germline modification. What I discovered surprised even me. Most religious scholars in my collection were not particularly troubled by the prospect of germline modification. Sure, they had their concerns—safety, social justice, over-controlling parents, an attitude of commodification. But in the end, almost without exception, they agreed: what can be religiously or morally wrong with wanting to use the latest technology to help parents have healthy children? For more on this, see Design and Destiny from MIT Press.
 
For many people, it comes as a total shock to hear that even some Vatican statements support the notion that germline modification is not inherently immoral—that, in fact, it could be “desirable.” The Vatican has specific constraints that must be met. No IVF, for one, so the “three-parent baby” strategy fails on that score. But if the means are acceptable, then the goal is laudable, at least according to this statement made by Pope John Paul II:
A strictly therapeutic intervention whose explicit objective is the healing of various maladies such as those stemming from chromosomal defects will, in principle, be considered desirable, provided it is directed to the true promotion of the personal well-being of the individual without doing harm to his integrity or worsening his conditions of life. Such an intervention would indeed fall within the logic of the Christian moral tradition.
I agree with the “three-parent” critics about the importance of the debate over human germline modification. For that very reason, I hope they tone down the rhetoric. This is not Nazi medicine.
 
There are sound moral reasons for wanting to move forward on human germline modification. Of course, there are incredibly important technical hurdles that must be overcome. Some of them, in fact, may prove impossible. If so, then of course human germline modification would be a bad idea because of the risks.
 
But if biomedical research can find its way through these technical barriers, what then? Yes, there are other objections, more religious or moral in nature, but there are also strong reasons for going forward. That, I suggest, is where the real discussion should focus.  

Thursday, March 14, 2013

Stem Cell Advance: Brain Cells Inserted in Monkey Brains

Researchers at the University of Wisconsin-Madison are reporting a significant step forward toward the day when stem cells may be used to treat brain diseases such as Parkinson’s.

Working with three rhesus monkeys, the research team created a personalized stem cell culture for each monkey. Cells taken from the skin of each monkey were reprogrammed to a state of pluripotency, becoming what are known as induced pluripotent stem (iPS) cells. Once in a state of pluripotency, the cells were guided forward through the process of differentiation until they became neurons and glial cells. Along the way, the cells in the culture were given a genetic tag so that they would glow under fluorescent light. 

Then the cells were implanted in the brains of the rhesus monkeys. Because the source of the cells was the monkeys themselves, the DNA matched and there was no immune reaction. After six months, researchers discovered that the cells were so fully integrated into the monkey brains that in many cases they could be recognized only by their green fluorescent glow.

"When you look at the brain, you cannot tell that it is a graft," says senior author Su-Chun Zhang, according to a press release from the University of Wisconsin. "Structurally the host brain looks like a normal brain; the graft can only be seen under the fluorescent microscope." 

Caption: This neuron, created in the Su-Chun Zhang lab at the University of Wisconsin–Madison, makes dopamine, a neurotransmitter involved in normal movement. The cell originated in an induced pluripotent stem cell, which derive from adult tissues. Similar neurons survived and integrated normally after transplant into monkey brains—as a proof of principle that personalized medicine may one day treat Parkinson's disease. Date: 2010.  Image: courtesy Yan Liu and Su-Chun Zhang, Waisman Center, University of Wisconsin–Madison.

The three monkeys involved in the experiment were given tiny lesions or scars in their brains to mimic Parkinson’s disease. Another lead researcher, Marina Emborg, commented on how the inserted cells integrated themselves into the brain: "After six months, to see no scar, that was the best part."


What makes this work significant is that it is the first use of induced pluripotent stem (iPS) cells involving a primate, setting the stage for further work someday involving human beings. "It's really the first-ever transplant of iPS cells from a non-human primate back into the same animal, not just in the brain," says Zhang. "I have not seen anybody transplanting reprogrammed iPS cells into the blood, the pancreas or anywhere else, into the same primate. This proof-of-principle study in primates presents hopes for personalized regenerative medicine."

One of the keys to their success is that the iPS cells themselves were not transplanted into the monkeys.  Because iPS cells are pluripotent, they can give rise to cancer or other problems.  In this work, the researchers carefully guided the iPS cells so that they were almost at the final stage of differentiation, and then made sure that their cell culture was completely purified so that no potentially cancer-causing cells would slip through.  Quoting Zhang once again: "We differentiate the stem cells only into neural cells. It would not work to transplant a cell population contaminated by non-neural cells."

Because of these precautions, the experiment succeeded in introducing new cells into the monkeys’ brains without any obvious problems. But in this experiment, too few cells were introduced to help the monkeys overcome the symptoms of Parkinson’s. Solving that problem is the obvious next step.

According to the paper, “this finding represents a significant step toward personalized medicine,” which may someday be used to treat a wide range of diseases in humans.  Because the original source of the cells was from the individual monkeys themselves, there was no immune rejection.  If the same technique can be applied to human beings, it may mean that an individualized culture of iPS cells could be created for each patient, then carefully guided forward in the process of differentiation, and then implanted to regenerate organs or tissues damaged by injury or disease.

What makes iPS cells especially attractive is that no embryos are used in their creation, and so almost no one objects to this line of medical research.  But if regenerative medicine is successful, someday it will be used not just to treat disease but to off-set the effects of aging or to enhance those who are well.  Then, we can be sure, many will object to this technology, but even more will use it.

The article, entitled “Induced Pluripotent Stem Cell-Derived Neural Cells Survive and Mature in the Nonhuman Primate Brain,” is freely available in the open access journal Cell Reports, in its March 28, 2013 issue. 

 

Wednesday, March 13, 2013

Enhancing Healthy Kids: A Warning, But Who's Listening?

The American Academy of Neurology (AAN) has just issued new guidelines calling on doctors to stop prescribing cognitive-enhancing drugs to healthy kids.

Drugs like Ritalin and Adderall are widely used, not just by adults and university students, but increasingly by children, and not just by those who have been appropriately diagnosed as experiencing difficulties with attention or focus, such as Attention Deficit Disorder. 

PHOTO: Ritalin SR (a brand-name sustained-release formulation of methylphenidate). From Wikimedia, 16 June 2006, created by Sponge. 

Previously, the AAN raised concerns about drug enhancement by adults. It concluded that there is no moral basis for objecting, provided that the patient is acting autonomously in requesting the prescription. But when it comes to prescribing for healthy children, the AAN report makes this claim: "Pediatric neuroenhancement remains a particularly unsettled and value-laden practice, often without appropriate goals or justification."  

The Report notes that enhancing children is fundamentally different from enhancing adults.  For doctors, it raises concerns for "the fiduciary responsibility of physicians caring for children, the special integrity of the doctor–child–parent relationship, the vulnerability of children to various forms of coercion, distributive justice in school settings, and the moral obligation of physicians to prevent misuse of medication."

Based on these concerns, the AAN Report advises that "the prescription of neuroenhancements is inadvisable because of numerous social, developmental, and professional integrity issues."

The primary objection raised by the AAN is that children lack the competency to act as autonomous moral agents.  If they were competent, then their request for enhancement would be honored.  Sure, children can be coerced, manipulated, confused, and ambivalent about their needs.  Kind of like the rest of us. 

Whether age brings moral competence is a good question.  But perhaps what this report shows us once again is that when secular bioethics meets enhancement technology, about all it can say is this: If you want it and if you can prove your competence, you can have it. 

The AAN report, “Pediatric neuroenhancement: Ethical, legal, social, and neurodevelopmental implications,” is published in the March 13, 2013 issue of Neurology.