Thursday, October 17, 2013

What a Small Brain Can Tell Us

New information about an early human skull sheds more light on the very first members of the human genus.  The skull, found in Dmanisi, Georgia, in 2005, has now been freed from the stone casing that preserved it for the past 1.8 million years. An international team led by David Lordkipanidze of the Georgian National Museum reports its findings in the October 18 issue of the journal Science. 

Photo Caption: The Dmanisi D4500 early Homo cranium in situ. Photo courtesy of Georgian National Museum.

When the world first learned of early human remains in Georgia, the news came as a bit of a shock.  These early humans seemed quite similar to others found in Africa and dating to the same time.  That similarity suggests they were able to travel and adapt to new settings. 

The latest analysis contains a new surprise.  The skull described in the new report has an unexpectedly small brain, at or below the size usually seen as the minimum for our genus.  At 546 cubic centimeters, this small brain widens our view of the variability of humans at this time. 

Does this skull, identified as Skull 5 from Dmanisi, really measure up to being in the genus Homo at all? Is it something else, like Australopithecus?  The researchers argue that it is clearly part of the genus Homo for the simple reason that Skull 5 is found with other, larger-brained skulls, all clearly part of the same community.  One Georgian brain was as large as 730 cc.  What this suggests is that Skull 5 is part of Homo but that our definition of Homo should be broadened. 

In fact, all this diversity at one site provides support for one side in an ongoing debate.  Are species defined broadly in terms of variability, or does small to moderate variation indicate separate species?  This finding supports the view that, at least in the case of early humans, a species can be quite variable.

Not too long ago, Lordkipanidze and his team took the opposite view.  They believed that these early humans from Georgia were a distinct species, which they called Homo georgicus.  The new paper retracts that claim, saying that the new evidence of variation in Georgia means that these fossils fit within the widened range of variability of Homo erectus, a globally dispersed species.  More precisely, they see the Georgian samples as best classified as Homo erectus ergaster georgicus, part of the species Homo erectus but distinct because of modifications over time and because of location. 

Commenting on the variation in the skulls found almost literally on top of each other at Dmanisi, co-author Christoph Zollikofer notes that the skulls “look quite different from one another, so it's tempting to publish them as different species.  Yet we know that these individuals came from the same location and the same geological time, so they could, in principle, represent a single population of a single species,” Zollikofer said in a press release issued by the journal Science. 

The key claim advanced in the article, however, is that these samples from Georgia and Africa, together with other samples from Asia, are all part of one global species.  The report describes them as Homo erectus, seen as “a single but polymorphic lineage.” 

The diversity found in Georgia also suggests that the number of individuals in that region may have been larger than first thought, possibly numbering 10,000 or so.  And the small size of Skull 5’s brain suggests that they traveled all this way before brains began to expand.

The report, “A Complete Skull from Dmanisi, Georgia, and the Evolutionary Biology of Early Homo," is published in the 18 October 2013 issue of the journal Science, published by the American Association for the Advancement of Science.    

Monday, August 12, 2013

Is Neandertal Technology Still in Use Today?

Those primitive Neandertals may not have been so primitive after all.  Some 50,000 years ago, they were using a highly crafted bone tool virtually identical to a tool in use by human leather-workers today.

The tool, called a lissoir, was made by Neandertals living in southwestern France long before the arrival of the people we like to call “anatomically modern humans.”  The discovery, reported in the August 16, 2013 online issue of PNAS, is sure to fuel the debate over the cultural sophistication of the Neandertals.

Caption: Four views of the most complete lissoir found during excavations at the Neandertal site of Abri Peyrony.  Credit: Image courtesy of the Abri Peyrony and Pech-de-l’Azé I Projects.

Ever since their discovery over 150 years ago, Neandertals have been seen as “cavemen,” primitive in every respect compared to us “modern” humans who replaced them.


But in recent decades, the cultural achievements of Neandertals have been recognized.  Even so, the debate continues.  Did they learn more advanced technology from the modern human invaders of Europe and Asia, or did they develop it on their own?  The new findings lend support to the view that Neandertals were able to create and invent on their own. 

Neandertals were very likely the first to use sophisticated bone tools in Europe.  The tool found in France was made from the rib bone of red deer or possibly reindeer.  Making it required breaking, grinding, and polishing.  It shows evidence of being used to work leather, much like similar tools today.  When rubbed against an animal hide, it makes the leather soft, shiny, and more water resistant.

"For now the bone tools from these two sites are one of the better pieces of evidence we have for Neandertals developing on their own a technology previously associated only with modern humans," explains Shannon McPherron of the Max Planck Institute for Evolutionary Anthropology in Leipzig, according to a press release from the Institute. 

Tools like this first appear in Africa much earlier.  But this new finding raises intriguing questions.  Did “modern” humans bring this technology from Africa and pass it to Neandertals prior to 50,000 years ago? Was there a technology transfer around the same time as modern/Neandertal interbreeding?  Or did Neandertals invent this technology on their own and transfer it to the modern humans who began to arrive in Europe around 40,000 years ago? 

"If Neandertals developed this type of bone tool on their own, it is possible that modern humans then acquired this technology from Neandertals. Modern humans seem to have entered Europe with pointed bone tools only, and soon after started to make lissoirs. This is the first possible evidence for transmission from Neandertals to our direct ancestors," says Marie Soressi of Leiden University in The Netherlands, part of the team of researchers who made this discovery.

"Lissoirs like these are a great tool for working leather, so much so that 50 thousand years after Neandertals made these, I was able to purchase a new one on the Internet from a site selling tools for traditional crafts," says Soressi. "It shows that this tool was so efficient that it had been maintained through time with almost no change. It might be one or perhaps even the only heritage from Neandertal times that our society is still using today."

Neandertals at this time were making sophisticated stone tools.  But these tools were made of bone because bone is more adaptable for certain uses.  According to McPherron, "here we have an example of Neandertals taking advantage of the pliability and flexibility of bone to shape it in new ways to do things stone could not do."

The deeper question that lies behind this research is whether “modern humans” burst on the scene suddenly as a unique phenomenon of evolution, or whether the process of becoming human is more gradual and more widely distributed than we once thought.  

The research reported here was conducted by teams from Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, and the University of Leiden in the Netherlands.  The article, entitled “Neandertals made the first specialized bone tools in Europe,” appears in the August 16, 2013 online edition of the Proceedings of the National Academy of Sciences.
 
Thursday, July 25, 2013

Rapamycin: Extended Lifespan, Extended Decline?


Ever since 2009, it has been known that the drug rapamycin extends the lifespan of mice.  The journal Science identified this discovery as one of the top 10 research breakthroughs for that year.  The news was all the more exciting because rapamycin already has FDA approval for other uses.

So researchers want to know just how rapamycin extends the lifespan.  Does it actually slow the entire aging process?  Or does it just slow down certain diseases, such as cancer?  

New research testing the effects of rapamycin on mice suggests that the drug probably does not slow the aging process itself.  It does slow the development of cancer and a few other diseases.  But rapamycin is no fountain of youth.  In fact, if it were used just by itself to extend the lifespan of human beings, it might merely draw out the aging process.  In other words, it might extend the lifespan but not extend the healthspan.

Photo: Public domain through Wikimedia. Thanks to Rama.

The research was conducted by a team led by Dan Ehninger and his colleagues at the German Center for Neurodegenerative Diseases. It is published in the August 2013 issue of The Journal of Clinical Investigation, which is freely available online.  In addition to the research article, the journal is publishing an expert commentary that warns about any drug that brings an increase in lifespan that “is accompanied by more disability and disease and a greater loss of physiological functions, i.e., a reduced quality of life.”  By itself, rapamycin could do just that.

On the bright side, the new study shows even more conclusively that rapamycin extends the lifespan of mice by the equivalent of almost a decade of human life.  It also provides a small benefit for cognitive function.  So despite the mixed results, the journal commentary advocates clinical trials involving human patients, perhaps those with dementia.  According to the journal article, the research supports “the feasibility of clinical trials to study the efficacy of rapamycin in treating diseases of the elderly, especially those that are debilitating and for which no current treatment is known, such as Alzheimer’s disease and other neurodegenerative diseases.”

Advocates of anti-aging research will see this new study as something of a setback, but it is not likely to slow down basic work in the field.  Opponents of anti-aging research are likely to renew their warnings about the prospect of more years of declining health: any effort to enhance our humanity, whether by increasing cognitive ability or extending the lifespan, comes with a downside, with side effects so costly that true enhancement is impossible.  The warning is serious, but advocates of human enhancement are not likely to be convinced.

The research article is entitled “Rapamycin Extends Murine Lifespan but Has Limited Effects on Aging.”  The commentary is entitled “Rapamycin, Anti-aging, and Avoiding the Fate of Tithonus.”  Both are available free to the public in the August 2013 issue of The Journal of Clinical Investigation.



Thursday, July 18, 2013

Did Neandertals Wear Ornaments?


A small but tantalizing find provides further evidence for Neandertal culture.  Working in the foothills of the Alps just north of Venice, Italy, researchers have discovered and analyzed a small marine shell that originally came from about 60 miles away.  It was thinly coated with a dark red substance that turns out to be pure hematite and was most likely used as a pigment.  One possibility is that the shell was used as an ornament.

The paper, freely available online in the journal PLoS One, dates the shell’s pigmentation to a period just before 45,000 years ago, right before the arrival of so-called “modern” humans in Europe. 

Photo Caption: A shell possibly "painted" by Neandertals about 45,000 years ago.  Photo available from PLoS One.

According to the paper, “deliberate transport and coloring of an exotic object, and perhaps its use as pendant, was a component of Neandertal symbolic culture, well before the earliest appearance of the anatomically modern humans in Europe.”

Quoting more of the paper, “this discovery adds to the ever-increasing evidence that Neandertals had symbolic items as part of their culture.”

Debates about Neandertal culture have intensified recently, in part because of genetic evidence of interbreeding between Neandertals and the modern humans coming into Asia and Europe.  While these modern humans began their migration out of Africa about 80,000 years ago and probably interbred around 55,000 years ago, they did not reach Europe until more like 40,000 years ago.  If all these dates hold up in future research, this shell does provide a small but intriguing hint about the culture of Neandertals at just about the time of their encounter with “modern” humans. 

So who exactly is modern?  The differences between ourselves (the humans we like to call “modern”) and the Neandertals are not as great as we once imagined.  The paper ends with these words: “Future discoveries will only add to our appreciation of Neandertals shared capacities with us.”

The paper, entitled "An Ochered Fossil Marine Shell From the Mousterian of Fumane Cave, Italy," appears in the current issue of PLoS One and is freely available online.

Thursday, July 4, 2013

The Rise of Agriculture: New Findings, Added Complexity

In the grand story of human origins, the invention of agriculture is one of the most pivotal chapters.  It is generally agreed that farming first arose in the Fertile Crescent about 12,000 years ago.  But did it arise at one end of the Crescent and spread to the other?  Or did it arise independently in various locations across the entire region, from modern Israel to modern Iran? 

Photo caption: Hordeum spontaneum, wild barley from Chogha Golan, Iran. [Image courtesy of TISARP]

New research suggests that agriculture arose independently at various locations. While the newly developed agricultural techniques and selected grains probably spread quickly, newly published evidence suggests that the inventive process itself was widespread.  The research, conducted by Simone Riehl from the University of Tübingen in Germany along with colleagues from the Tübingen Senckenberg Center for Human Evolution and Paleoecology, is published in the July 5, 2013 issue of the journal Science.

A key debate in human evolution is whether momentous changes such as agriculture occur in big, rapid, and isolated bursts, or whether such grand changes are the cumulative result of smaller changes widely distributed over vast areas and long periods of time.  This new evidence seems to support the view that changes are distributed and cumulative rather than rapid and isolated.

Field work in Chogha Golan, Iran, led Riehl’s team to the discovery of wild, progenitor versions of barley, lentil, and wheat.  At the same site, early domesticated forms of these same plants are found, suggesting that the domestication occurred onsite.  Domesticated plants and animals form the core of agriculture and the economic basis for the rise of human cities and civilization.  

Tools and figurines were also found, dating from 12,000 to around 9,800 years before the present. The rise of agriculture in this region during this period set the stage for the growth of human population, the development of cities, and the rise of ever-more complex cultures.

The article is entitled "Emergence of Agriculture in the Foothills of the Zagros Mountains of Iran."  It appears in the 5 July 2013 issue of the journal Science.  

Monday, June 3, 2013

We Are What We Ate: Diet and Human Evolution

At a key moment in human evolution, our diet expanded and became more diverse, setting the stage for humans to draw on a wider range of food sources to feed expanding brains.

Four academic papers published together in the June 3, 2013 issue of the Proceedings of the National Academy of Sciences report on new methods of studying the carbon found in ancient teeth, going back more than 4 million years.  Ancestors living then ate pretty much what apes eat today, a diet of mostly leaves and fruits.  Then about 3.5 million years ago, a major shift occurs. 
Caption: This is an artist's representation of Paranthropus in southern Africa more than 1 million years ago.  Credit: Illustration courtesy of ArchaeologyInfo.com/Scott Bjelland.  Usage Restrictions: None
  
The old food sources remained in use, but new sources are added.  Researchers came to this conclusion by analyzing the carbon isotopes still present in ancient teeth.  After examining 175 specimens from 11 different species, they concluded that a key shift occurred at about 3.5 million years ago.  At that point, at least some of our ancestors were supplementing the usual foods by turning to grasses or sedges—or to the animals that graze on them.  These ancestors, including Australopithecus afarensis (best known as the famous “Lucy”), became more diverse in their food sources.
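The isotope reasoning behind that conclusion can be sketched in a few lines.  Here is a minimal Python illustration using the standard delta-13C definition; the threshold values and function names are assumptions chosen for illustration, not the cutoffs used in the PNAS papers:

```python
# A minimal sketch of stable-carbon-isotope diet analysis.
# The formula is the standard delta-13C definition; the cutoffs and
# function names below are illustrative assumptions, not values taken
# from the PNAS studies.

VPDB = 0.011237  # approximate 13C/12C ratio of the VPDB reference standard

def delta13c(r_sample: float) -> float:
    """Express a sample's 13C/12C ratio in per mil relative to VPDB."""
    return (r_sample / VPDB - 1.0) * 1000.0

def diet_signal(d13c_enamel: float) -> str:
    """Roughly classify a tooth-enamel delta-13C value (per mil).

    C3 plants (trees, shrubs, most leaves and fruits) leave strongly
    negative enamel values; C4 plants (tropical grasses and sedges),
    or animals that graze on them, pull values toward zero.
    """
    if d13c_enamel <= -8.0:
        return "mostly C3 (leaves and fruits)"
    if d13c_enamel >= -2.0:
        return "mostly C4 (grasses/sedges or grazers)"
    return "mixed C3/C4 diet"

print(diet_signal(-11.0))  # an ape-like, C3-dominated signal
print(diet_signal(-5.0))   # a mixed signal, like some hominins after 3.5 Ma
```

Because C3 and C4 plants fix carbon through different photosynthetic pathways, their isotope signatures differ enough to survive for millions of years in tooth enamel, which is why this kind of measurement works at all.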

The earliest known evidence suggests that at about this same time, our human ancestors were making tools and using them to butcher large animals for food.  If these animals ate grasses, the carbon would have entered the human diet that way.  Another possibility is that human ancestors were simply learning to identify other types of plants as food sources compatible with human metabolism.

The main point, however, is that at this critical transition around 3.5 million years ago, human ancestors were becoming more variable in their diet and in their behavior.  Rather than being locked into one type of food source or one way of pursuing food, they could exploit more sources of food, nourish even bigger brains, travel and thrive in new niches, and survive cycles of climate change, particularly the ancient African cycles of wet and dry periods. 

"We don't know exactly what happened," said Matt Sponheimer of the University of Colorado, one of the researchers. "But we do know that after about 3.5 million years ago, some of these hominids started to eat things that they did not eat before, and it is quite possible that these changes in diet were an important step in becoming human."

If becoming more varied and adaptable is the same as becoming more human, then this study provides an important insight into this process.  One of the papers (Wynn et al.) concludes with this sentence: “This dietary flexibility implies unique landscape use patterns and malleable foraging behavior within a narrow time frame of a single species.”  In other words, they were able to adjust quickly, seizing new opportunities and adapting to environmental changes. 

Thursday, May 16, 2013

Deep Brain Cognitive Enhancement: The Latest News

The search for new methods of cognitive enhancement has just reached new depths.  Researchers in Austria and the UK report exciting new evidence that a form of noninvasive deep brain stimulation enhances the brain’s ability to do arithmetic. 

"With just five days of cognitive training and noninvasive, painless brain stimulation, we were able to bring about long-lasting improvements in cognitive and brain functions," says Roi Cohen Kadosh of the University of Oxford and lead author of the report that appears in the May 16, 2013 issue of Current Biology.  His comments were provided by the journal.

Photo Credit.  Photo by Ad Meskens of an original oil painting by Laurent de La Hyre (French, 1606-1656).  The title of the painting is Allegory of Arithmetic (Allegorie van de rekenkunde) and it dates to about 1650.  The original painting is in the Walters Art Museum, Baltimore, Maryland.  It was photographed on 18 July 2007 by Ad Meskens, who has made it freely available with proper credit.

In this study, the team used a form of noninvasive deep brain stimulation known as “transcranial random noise stimulation” or TRNS.  The TRNS input was combined with more traditional math training and drills.  Twenty-five young adults, males and females, were divided into two groups, one receiving math training with the TRNS and the other receiving math training combined with a “sham” version of TRNS, a kind of placebo. 

Not only did those who received TRNS do well immediately, but the benefits lasted for at least six months.  In addition, brain monitors detected different brain activity for those receiving TRNS.  This suggests that TRNS modifies brain function.

According to Cohen Kadosh, "If we can enhance mathematics, therefore, there is a good chance that we will be able to enhance simpler cognitive functions."

In the paper’s conclusion, the authors state that TRNS “can enhance learning with respect to high-level cognitive functions, namely algorithmic manipulation and factual recall in mental arithmetic. When this learning is based on deep-level cognitive processing, as is the case for calculation arithmetic, such enhancements are extremely long-lived both behaviorally and physiologically.”

Then they sum up with these words:
Both the behavioral and physiological changes displayed extreme longevity, spanning a period of 6 months, but only when learning involved deep-level cognitive processing. By its demonstration of such longevity and, for the calculation task, generalization to new, unlearned material, the present study highlights TRNS as a promising tool for enhancing high-level cognition and facilitating learning. These findings have significant scientific and translational implications for cognitive enhancement in both healthy individuals and patients suffering from disorders characterized by arithmetic deficits.

The paper, Snowball et al.: "Long-Term Enhancement of Brain Function and Cognition Using Cognitive Training and Brain Stimulation," appears in the May 16, 2013 issue of Current Biology.