
Thursday, October 17, 2013

What a Small Brain Can Tell Us

New information about an early human skull sheds more light on the very first members of the human genus.  The skull, found in Dmanisi, Georgia in 2005, has now been freed from the stone casing that has preserved it for the past 1.8 million years. An international team led by David Lordkipanidze of the Georgian National Museum reports its findings in the October 18 issue of the journal Science.

Photo Caption: The Dmanisi D4500 early Homo cranium in situ. Photo courtesy of Georgian National Museum.

When the world first learned of early human remains in Georgia, the news came as a bit of a shock.  These early humans seemed quite similar to other remains found in Africa and dating to the same time.  That suggests they were able to travel and adapt to new settings. 

The latest analysis contains a new surprise.  The skull described in the new report has an unexpectedly small brain size, at or below the range usually seen as minimal for our genus.  At 546 cubic centimeters, its small brain widens our view of the variability of humans at this time.

Does this skull, identified as Skull 5 from Dmanisi, really measure up to being in the genus Homo at all? Is it something else, like Australopithecus?  The researchers argue that it is clearly part of the genus Homo for the simple reason that Skull 5 is found with other, larger-brained skulls, all clearly part of the same community.  One Georgian brain was as large as 730 cc.  What this suggests is that Skull 5 is part of Homo but that our definition of Homo should be broadened.

In fact, all this diversity at one site provides support for one side in an ongoing debate.  Are species defined broadly, encompassing considerable variability, or does small to moderate variation indicate separate species?  This finding supports the view that, at least among early humans, a species can be quite variable.

Not too long ago, Lordkipanidze and his team took the opposite view.  They believed that these early humans from Georgia were a distinct species, which they called Homo georgicus.  The new paper retracts that claim, saying that the new evidence of variation in Georgia means that these fossils fit within the widened range of variability of Homo erectus, a globally dispersed species.  More precisely, they see the Georgian samples as best classified as Homo erectus ergaster georgicus, part of the species Homo erectus but distinct because of modifications over time and because of location.

Commenting on the variation in the skulls found almost literally on top of each other at Dmanisi, co-author Christoph Zollikofer notes that the skulls “look quite different from one another, so it's tempting to publish them as different species.  Yet we know that these individuals came from the same location and the same geological time, so they could, in principle, represent a single population of a single species,” Zollikofer said in a press release issued by the journal Science. 

The key claim advanced in the article, however, is that these samples from Georgia and Africa, together with other samples from Asia, are all part of one global species.  The report describes them as Homo erectus, seen as “a single but polymorphic lineage.” 

The diversity found in Georgia also suggests that the number of individuals in that region may have been larger than first thought, possibly numbering 10,000 or so.  And the small size of Skull 5’s brain suggests that they traveled all this way before brains began to expand.

The report, “A Complete Skull from Dmanisi, Georgia, and the Evolutionary Biology of Early Homo,” is published in the 18 October 2013 issue of the journal Science, published by the American Association for the Advancement of Science.

Thursday, July 4, 2013

The Rise of Agriculture: New Findings, Added Complexity

In the grand story of human origins, the invention of agriculture is one of the most pivotal chapters.  It is generally agreed that farming first arose in the Fertile Crescent about 12,000 years ago.  But did it arise at one end of the Crescent and spread to the other?  Or did it arise independently in various locations across the entire region, from modern Israel to modern Iran?

Photo caption: Hordeum spontaneum, wild barley from Chogha Golan, Iran. [Image courtesy of TISARP]

New research suggests that agriculture arose independently at various locations. While the newly developed agricultural techniques and selected grains probably spread quickly, newly published evidence suggests that the inventive process itself was widespread.  The research, conducted by Simone Riehl from the University of Tübingen in Germany along with colleagues from the Tübingen Senckenberg Center for Human Evolution and Paleoecology, is published in the July 5, 2013 issue of the journal Science.

A key debate in human evolution is whether momentous changes such as agriculture occur in big, rapid, and isolated bursts, or whether such grand changes are the cumulative result of smaller changes widely distributed over vast areas and long periods of time.  This new evidence seems to support the view that changes are distributed and cumulative rather than rapid.

Field work in Chogha Golan, Iran, led Riehl’s team to the discovery of wild, progenitor versions of barley, lentil, and wheat.  At the same site, early domesticated forms of these same plants are found, suggesting that the domestication occurred onsite.  Domesticated plants and animals form the core of agriculture and the economic basis for the rise of human cities and civilization.  

Tools and figurines were also found, dating from 12,000 to around 9,800 years before the present. The rise of agriculture in this region during this period set the stage for the growth of human population, the development of cities, and the rise of ever-more complex cultures.

The article is entitled "Emergence of Agriculture in the Foothills of the Zagros Mountains of Iran."  It appears in the 5 July 2013 issue of the journal Science.  

Monday, June 3, 2013

We Are What We Ate: Diet and Human Evolution

At a key moment in human evolution, our diet expanded and became more diverse, setting the stage for humans to draw on a wider range of food sources to feed expanding brains.

Four academic papers published together in the June 3, 2013 issue of the Proceedings of the National Academy of Sciences report on new methods of studying the carbon found in ancient teeth, going back more than 4 million years.  Ancestors living then ate pretty much what apes eat today: a diet of mostly leaves and fruits.  Then, about 3.5 million years ago, a major shift occurred.
Caption: This is an artist's representation of Paranthropus in southern Africa more than 1 million years ago.  Credit: Illustration courtesy ArchaeologyInfo.com/Scott Bjelland.  Usage Restrictions: None
  
The old food sources remained in use, but new sources were added.  Researchers came to this conclusion by analyzing the carbon isotopes still present in ancient teeth.  After examining 175 specimens from 11 different species, they concluded that a key shift occurred at about 3.5 million years ago.  At that point, at least some of our ancestors were supplementing the usual foods by turning to grasses or sedges—or to the animals that graze on them.  These ancestors, including Australopithecus afarensis (best known as the famous “Lucy”), became more diverse in their food sources.
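
For readers who want to see the arithmetic behind such isotope studies, here is a rough sketch, not drawn from the papers themselves, of how a carbon reading is expressed and interpreted. Carbon isotope ratios are reported in "delta" notation relative to a standard, and because C3 plants (trees, shrubs, fruits) and C4 plants (tropical grasses and sedges) carry distinctly different signatures, a simple two-endmember mixing model can estimate how much C4-based food was in a diet. The standard ratio and the endmember values below are textbook approximations, not numbers reported in these studies.

R_VPDB = 0.0112372  # approximate 13C/12C ratio of the VPDB reference standard

def delta13C(r_sample):
    # Express a 13C/12C ratio in per-mil "delta" notation relative to VPDB.
    return (r_sample / R_VPDB - 1.0) * 1000.0

# Approximate dietary endmembers (per mil): C3 plants cluster near -26,
# C4 plants near -12.5.  (Tooth enamel is enriched relative to diet by
# roughly +14 per mil, so published studies convert enamel values to
# dietary values before applying a mixing model like this one.)
DELTA_C3, DELTA_C4 = -26.0, -12.5

def c4_fraction(delta_diet):
    # Linear two-endmember mixing: fraction of the diet from C4 sources.
    f = (delta_diet - DELTA_C3) / (DELTA_C4 - DELTA_C3)
    return min(max(f, 0.0), 1.0)

print(round(delta13C(0.0110), 1))       # a raw ratio of 0.0110 is about -21.1 per mil
print(round(c4_fraction(-19.0), 2))     # a dietary value of -19 implies roughly half C4 input

The point of the sketch is simply that a single number measured in enamel can be translated, under explicit assumptions, into a statement about how much grass- or sedge-based food an ancestor was eating.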

The earliest known evidence suggests that at about this same time, our human ancestors were making tools and using them to butcher large animals for food.  If these animals ate grasses, the carbon would have entered the human diet that way.  Another possibility is that human ancestors were simply learning to identify other types of plants as food sources compatible with human metabolism.

The main point, however, is that at this critical 3.5-million-year transition, human ancestors were becoming more variable in their diet and their behavior.  Rather than being locked into one type of food source or one way of pursuing food, they could exploit more sources of food, nourish even bigger brains, travel and thrive in new niches, and survive climate change cycles, particularly ancient African cycles of wet and dry periods.

"We don't know exactly what happened," said Matt Sponheimer of Colorado University and one of the researchers. "But we do know that after about 3.5 million years ago, some of these hominids started to eat things that they did not eat before, and it is quite possible that these changes in diet were an important step in becoming human."

If becoming more varied and adaptable is the same as becoming more human, then this study provides an important insight into this process.  One of the papers (Wynn et al.) concludes with this sentence: “This dietary flexibility implies unique landscape use patterns and malleable foraging behavior within a narrow time frame of a single species.”  In other words, they were able to adjust quickly, seizing new opportunities and adapting to environmental changes.

Thursday, April 11, 2013

The Two Million Year Question

Careful studies of 2-million-year-old human-like fossils just published in the April 12, 2013 issue of Science raise more questions than they answer.

These papers provide highly detailed information about the teeth, rib cage, hands, and feet of this strange relative, known to scientists as Australopithecus sediba.  But we still do not know the answer to the biggest question of all.  How does sediba fit in the human family tree?  Is sediba a direct human ancestor?  If not, why is it so similar to us in some respects?

Photo Credit: The reconstructed skull and mandible of Australopithecus sediba. Reconstruction by Peter Schmid, Photo by Lee R. Berger. Image courtesy of Lee R. Berger and the University of the Witwatersrand.

The teeth are mostly like those of Australopithecus africanus but also quite a bit like the earliest examples of the genus Homo.  That is surprising.  For some experts, it calls into question the standard view that Homo evolved from Australopithecus afarensis, most commonly known as “Lucy.” 

The new analysis suggests an evolutionary pathway from africanus to sediba to Homo.  In that case, Lucy is a relative but not an ancestor.  Sediba is. 

Not so fast, others insist.  The first examples of Homo may go back to 2.3 million years ago, long before sediba appears at just under two million years ago.  Lucy and her afarensis kin lived much earlier, early enough to be ancestral to Homo.

Based on what we know now, the debate will continue because the facts just do not line up neatly or offer a simple story.  "Our study provides further evidence that sediba is indeed a very close relative of early humans, but we can't definitively determine its position relative to africanus,” study co-author Debbie Guatelli-Steinberg said according to a release from Ohio State University.

What these studies do provide is a remarkably complete picture of what early human-like ancestors looked like.  They also provide another surprise.  Despite having a foot with a narrow heel, similar to chimpanzees, sediba definitely walked upright, maybe even with a somewhat awkward gait never before known to scientists.  They were clearly not knuckle-walkers, like the apes, but they were not nearly as graceful as the humans who followed.  It seems they walked upright differently.

For now, what all this suggests is that the story of our deep ancestry is more complex than we usually imagine.  Straight ancestral lines are hard to draw.  More finds may help sort things out.  But they may also add new complexity.  The way it looks, multiple forms of early human life may have existed at once.  They differed slightly from each other and also in the degree to which they resemble us.  That makes it very hard to sort out the lineages.  

Is sediba a direct human ancestor?  Yes, at least according to Lee Berger, who discovered sediba in a pit in northern South Africa in 2008.  Most experts, however, argue no, mainly because the dates are out of line.  What difference does it make?  Perhaps the biggest significance of this debate is to show us that the more we know, the more we see a complex picture of multiple species and perhaps interweaving lineages, making it all the more remarkable that we are here at all.

This research is published as a set of six research reports in the April 12, 2013 issue of the journal Science, a publication of the American Association for the Advancement of Science.

Thursday, March 7, 2013

What a Smart Mouse Can Tell Us about Evolution


Just a few years ago, we thought that brains were all about neurons.  Sure, we also have glial cells, but the job of the lowly glia is to take care of the neurons, which do all the serious cognitive work. 

But why are the glia of humans and other primates so large and varied in their shape and structure?  Why are they so different from the simpler, smaller glia found in mice and other rodents?  Could the difference play a role in the evolution of human intelligence?

One way to compare a mouse and a human is to create a mouse that is part human.  That’s exactly what researchers at the University of Rochester did.  They implanted human cells into mouse brains.  More precisely, they implanted human glial progenitor cells into newborn mouse pups. 

What they got were chimeras, mice with human cells integrated into their brains.  When the researchers examined the brains of these chimeric mice, they found that the human cells proliferated and were widely present throughout the brain. Although interacting with mouse brain cells, the human cells remained distinctly human in their unusually large size and varied structures.

Photo credit:  A 23 week human culture astrocyte stained for GFAP.   From Wikimedia Commons.  Date: 24 February 2012.  Author: Bruno Pascal. 

Most surprising is that the chimeric mice were smarter than unaltered mice born in the same litters.  Human glia in a mouse brain seem to make a smarter mouse.

Why?  The answer probably involves one type of glial cell called astrocytes. Compared to other species, human brains have many more astrocytes.  Ours are larger and more varied in their structure, capable of connecting many neurons and coordinating the activity that occurs at many synapses. 

Based on this study, published in the March 7, 2013 issue of Cell Stem Cell, we now know that human astrocytes boost intelligence in chimeric mice as measured by standard testing procedures.  

This is pretty good evidence to suggest that the evolution of the larger, more complex glial cells was a critical aspect of the evolution of higher intelligence.  At least that is the conclusion drawn by one of the senior authors of the paper, Steven Goldman. “In a fundamental sense are we different from lower species,” he said, according to a press release from the University of Rochester. “Our advanced cognitive processing capabilities exist not only because of the size and complexity of our neural networks, but also because of the increase in functional capabilities and coordination afforded by human glia.”

What makes this study intriguing is that it uses stem cell technology to study brain function and to learn something important about evolution.  By implanting stem cells to create chimeric mice, researchers learn that glia play a critically important role in intelligence and that evolved changes in glial cells are a key part of the story of the rise of intelligence.

Concerning the role of glial cells in the complex brain, Maiken Nedergaard, another senior author, had this to say:  “I have always found the concept that the human brain is more capable because we have more complex neural networks to be a little too simple, because if you put the entire neural network and all of its activity together you just end up with a super computer.”

“But human cognition is far more than just processing data, it is also comprised of the coordination of emotion with memory that informs our higher abilities to abstract and learn,” Nedergaard added.

And concerning what chimeric mice have to teach us about evolution, Steven Goldman made this comment: “This study indicates that glia are not only essential to neural transmission, but also suggest that the development of human cognition may reflect the evolution of human-specific glial form and function.”

Or to quote the original paper: “These observations strongly support the notion that the evolution of human neural processing, and hence the species-specific aspects of human cognition, in part may reflect the course of astrocytic evolution.”

The paper does not address the interesting ethical questions raised by making smarter mice.  Over the past decade, ethicists have debated the moral legitimacy of chimeric animals.  One point of concern has been the creation of nonhuman animals with human brain cells.  To defend this practice, it is often said that a mouse brain with human cells is still a mouse brain.  It still has the structure or architecture of a mouse brain.  It may have human cells, but in no way is it a human brain or even a half mouse/half human brain.

This study suggests we should take a closer look at that line of thinking.  Maybe it is true that adding human neurons to a mouse brain does not change the mouse brain structure.  But this study implies that adding human astrocytes to a mouse brain may begin some small but significant change in structure and function. 

The study is clear about the fact these chimeric mice are more intelligent than the unmodified mice.  Their brains are quite literally faster. 

Once again, Goldman: “The bottom line is that these mice demonstrated an increase in plasticity and learning within their existing neural networks, essentially changing their functional capabilities.”

These animals have been cognitively “elevated,” to use a word sometimes found in the debate.  Probably no one will object to the idea of a slightly smarter mouse.  Researchers take special care to make sure these mice do not breed and produce pups of their own.  But even if they did, the added intelligence would not pass to future generations.  They would produce normal lab mice. 

Even so, this study—combining stem cell technology, neuroscience, and evolution in one elegant package—raises intriguing moral questions.  Are we smart enough to know how far we should go in creating smarter mice?

The study, entitled “Forebrain engraftment by human glial progenitor cells enhances synaptic plasticity and learning in adult mice,” appears in the March 7, 2013 issue of Cell Stem Cell.

Thursday, January 3, 2013

Past and Future Selves

Are you done changing? Are your values and personality pretty much set for life? Regardless of our age, most of us seem to think so.

According to new research, people generally recognize that they have changed over the past decade. But in the decade ahead? Overwhelmingly, people think their biggest changes are behind them. It’s as if their present state is the defining moment, when values and personality traits are fully realized and fixed forever. The research team, led by Jordi Quoidbach, called this the “End of History Illusion.”

In six studies involving more than 19,000 participants, researchers “found consistent evidence to indicate that people underestimate how much they will change in the future,” according to the study appearing in the 4 January 2013 issue of the journal Science.

Like most illusions, this one comes with a big cost. Thinking they won’t change makes it more likely they will “make decisions that their future selves regret.”

What’s most amazing about this illusion is that it seems to hold true at all ages. In fact, some of the results suggested that young people are even more convinced than their grandparents that they are done changing.

Caption: Painting, Girl in a Mirror (1632) by Paulus Moreelse, purchased by the Rijksmuseum Amsterdam with support of the Vereniging Rembrandt. In the public domain.

This much, at least, was clear to the researchers: “At every stage of adult life that we could analyze, both teenagers and grandparents seem to believe that the pace of personal change has slowed to a crawl and that they have recently become the people they will remain. History, it seems, is always ending today.”

While the researchers are clearly speaking of the history of the individual, their research raises the question of whether there’s a similar illusion when it comes to human history. For example, do we routinely underestimate the amount of technological change that lies ahead or its cultural and social impact? We acknowledge the profound cultural changes in past decades, but do we underestimate what is coming?

We marvel at the transformations of human evolution, but do we fail to imagine the changes that lie ahead? According to the researchers, "people may confuse the difficulty of imagining personal change with the unlikelihood of change itself." If that is true of the human individual, might it also be true of the human species?

The research appears as “The End of History Illusion” in the 4 January 2013 issue of the journal Science, a publication of the American Association for the Advancement of Science.

Thursday, November 15, 2012

Stone-Tipped Weapons: Older than We Thought

Stone-tipped spears have been around for at least 500,000 years, according to new research. That is about 200,000 years earlier than previously thought.

Why is that important? In part because it suggests that modern humans did not invent this technology. They did not get it from the Neandertals, nor did Neandertals get it from modern humans. Instead, it now seems that Neandertals and modern humans both used stone-tipped spears because both inherited this technology from an earlier form of human life.

It is generally believed that Neandertals and modern humans diverged about 500,000 years ago. The current view is that both came from earlier humans known as Homo heidelbergensis.

"Rather than being invented twice, or by one group learning from the other, stone-tipped spear technology was in place much earlier," according to Benjamin Schoville, who coauthored the study and is affiliated with the Institute of Human Origins at Arizona State University. "Although both Neandertals and humans used stone-tipped spears, this is the first evidence that this technology originated prior to or near the divergence of these two species," Schoville said according to a press release from his university.

Caption: A ~500,000-year-old point from Kathu Pan 1. Multiple lines of evidence indicate that points from Kathu Pan 1 were used as hafted spear tips. Scale bar = 1 cm. Credit: Jayne Wilkins. Usage Restrictions: Image may be used to illustrate coverage of this research only.

"This changes the way we think about early human adaptations and capacities before the origin of our own species," said Jayne Wilkins, a lead author from the University of Toronto. Technological advance—in this case stone-tipped spears—is now seen as more widely shared among the various forms of humanity and not so confined to anatomically modern humans like us. Creating stone-tipped spears requires more forethought and care than simpler stone tools, especially in preparing the tips for mounting to the wooden shaft of the spear. This process is called “hafting,” and the result is that a more efficient hunting weapon is created.

In this study, researchers re-examined stone points discovered more than thirty years ago. By comparing the damage to the spear tips with simulated damage re-created under laboratory conditions, researchers found evidence that strongly supports the view that the original tips were used for spears.

"When points are used as spear tips, there is a lot of damage that forms at the tip of the point, and large distinctive fractures form. The damage on these ancient stone spear points is remarkably similar to those produced with our calibrated crossbow experiment, and we demonstrate they are not easily created from other processes," said coauthor Kyle Brown, a skilled stone tool replicator from the University of Cape Town.

Brown, along with others who worked on the current paper, also collaborated on a study just released describing further refinements in stone weapons that occurred about 70,000 years ago and probably gave modern humans an advantage over Neandertals. For more on that, see Better Technology, Better Weapons.

The most recent findings that push the date of stone-tipped spears back to 500,000 years ago are published as "Evidence for Early Hafted Hunting Technology" in the November 16, 2012 issue of Science.

Wednesday, November 7, 2012

Better Technology, Better Weapons

Ongoing archeological discoveries from coastal South Africa point consistently to a technological and cultural explosion occurring there more than 70,000 years ago. The latest paper, appearing in the November 7 issue of the journal Nature, fills in more detail about remarkable advances in stone tool technology that only appear in Europe some 50,000 years later.

The new findings, reported by an international team of researchers led by Curtis Marean, help fill in a more comprehensive picture of the culture that flourished on the coast of South Africa for thousands of years. In 2009, Marean's team published a report showing how the controlled use of fire played a key role in the engineering of stone tools. The 2012 paper provides evidence that this technology was used for at least 11,000 years by the inhabitants of the coast.

"Eleven thousand years of continuity is, in reality, an almost unimaginable time span for people to consistently make tools the same way," said Marean. "This is certainly not a flickering pattern."

PHOTO: Caption: These microlith blades show a flat edge with a rounded "cutting" edge. Credit: Simen Oestmo. Used by permission of the ASU Institute of Human Origins for the purposes of illustrating coverage of the accompanying article.

One possibility suggested by this research is that the 70,000-year-old technology found in South Africa was brought out of Africa by modern humans. If so, it may help explain why Neandertals disappeared as modern humans entered Europe and Asia. Advances in technology made it possible to create light-weight points for spears or arrows, most likely used for small spears launched by spear-throwing devices known as atlatls, which effectively extend the length of the throwing arm.

"When Africans left Africa and entered Neanderthal territory they had projectiles with greater killing reach, and these early moderns probably also had higher levels of pro-social (hyper-cooperative) behavior. These two traits were a knockout punch. Combine them, as modern humans did and still do, and no prey or competitor is safe," said Marean. "This probably laid the foundation for the expansion out of Africa of modern humans and the extinction of many prey as well as our sister species such as Neanderthals."

If there is any truth to this conjecture, it is a sobering truth. This technological advance makes it easier to kill.

The new paper reports on findings at the Pinnacle Point excavation site, a mere 50 miles or so from Blombos Cave, home to similar findings and to the first "chemical engineering laboratory" for the production of the pigment ochre. Whoever lived there was technologically and culturally advanced, with all the ambiguities that implies.

The paper, "An Early and Enduring Advanced Technology Originating 71,000 Years Ago in South Africa," appears in the November 7 issue of the journal Nature.

Thursday, October 4, 2012

Human-Neandertal Interbreeding: When and Where?

Comparison between Neandertal and anatomically modern human genomes shows a history of interbreeding. Some living human beings—those with ancestry in Europe and Asia—carry the results of that interbreeding in their DNA. Those with ancestry in sub-Saharan Africa typically do not.

We also know that Neandertals lived in Eurasia from 230,000 until about 30,000 years ago. Where they came from or why they disappeared remains an open question. And we know that anatomically modern humans first appear in Africa at least 200,000 years ago. Some of them made their way to Asia and Europe sometime in the last 100,000 years.

So when did modern human/Neandertal interbreeding last occur? Did it occur deep in our past, before modern humans and Neandertal ancestors left Africa? Or did it occur after both left Africa, sometime—in other words—within the past 100,000 years?

A new study claims to find evidence that the interbreeding occurred outside Africa. On the basis of careful analysis of the shared DNA, researchers argue that the most recent interbreeding occurred sometime between 37,000 and 86,000 years ago.
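
The logic behind a date range like that can be illustrated with a back-of-the-envelope calculation. To be clear, this is only an illustration of the general principle, not the statistical machinery of the PLOS Genetics paper: recombination chops inherited chromosome segments into shorter pieces every generation, so the older the interbreeding, the shorter the Neandertal segments surviving in modern genomes. Under a simple model, segments from an admixture event t generations ago have an expected length of roughly 1/t Morgans. The generation time below is an assumed round number, not a value taken from the study.

# Hedged illustration: how segment length relates to the age of admixture.
# A textbook approximation, not the inference procedure of the actual study.

GEN_TIME_YEARS = 29.0  # assumed human generation time; the true value is debated

def expected_segment_length_cM(years_ago):
    # Expected length (in centimorgans) of segments inherited from an
    # admixture event that many years ago: about 1/t Morgans = 100/t cM.
    generations = years_ago / GEN_TIME_YEARS
    return 100.0 / generations

for years in (37_000, 50_000, 86_000):
    print(years, "years ago ->", round(expected_segment_length_cM(years), 3), "cM")

Older admixture leaves measurably shorter shared segments, which is what lets researchers put bounds like 37,000 to 86,000 years on the last interbreeding.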

Caption: Reconstruction of a Neandertal, 2006, by Stefan Scheer, from Stefanie Krull, Neanderthal Museum Picture Library, Mettmann, Germany

If so, it is pretty strong evidence that the interbreeding occurred after anatomically modern humans left Africa. This may have occurred in the Middle East, researchers point out, but probably not just at the beginning of the modern human migration out of Africa. The most recent interbreeding, they conclude, came well after that 100,000-year mark, suggesting “a more recent period, possibly when modern humans carrying Upper Paleolithic technologies expanded out of Africa.”

In that case, the conceptual challenge posed by modern human/Neandertal interbreeding remains clearly in front of us. What is the human species? Were Neandertals human? And what are we to make of our new insight into modern human diversity? All puzzling questions, to put it mildly.

The article, "The Date of Interbreeding between Neandertals and Modern Humans," is published in the current issue of PLOS Genetics, where it is available free to the public.

Thursday, August 30, 2012

Denisovan DNA in Focus

Using new techniques to study ancient DNA, scientists have unraveled the genetic details of a young girl who lived in central Asia around 50,000 years ago. She is the only individual of her kind, a unique branch of the human family called the Denisovans, named for the cave where her remains were found in 2008.

What makes the research all the more startling is that only two teeth and one pea-size bone fragment have been found. But from those tiny fragments of humanity, the story of the Denisovans is being pieced together.

The new techniques were developed by Matthias Meyer, working at the Department of Evolutionary Genetics, Max Planck Institute for Evolutionary Anthropology in Leipzig, a research program led by Svante Pääbo. DNA extracted from the bone fragment was separated into two strands that were amplified and analyzed separately, many times over, until a highly reliable sequence was determined.

Laboratory for the extraction of ancient DNA. [Image courtesy of Max Planck Institute for Evolutionary Anthropology].

Researchers claim that the result is as complete and accurate as genome sequences obtained from living human beings. Already, the new technique is being used to study other ancient remains, including samples of Neandertal DNA. Denisovans and Neandertals, distinct but closely related forms of humanity, overlapped with anatomically modern humans (AMH) and interbred with them.

New methods in genetics, including the technical breakthrough described in this paper, are opening new windows on the human family tree, which resembles an inter-grown vine more than a straight line of branches.

So accurate is the genetic analysis that researchers can reach some conclusions about other Denisovans, even though no samples exist for them. For one thing, despite their wide geographic spread, they apparently never reached high numbers. Their DNA lives on today in the faint echo of ancient interbreeding found in the uniquely-Denisovan sequences carried by those who live in the islands of southeast Asia.

No one knows what Denisovans looked like, but they probably resembled us in many ways. The Denisovan girl whose DNA was studied carried genes that are associated today with brown hair, brown eyes, and dark skin. Like us they had 23 pairs of chromosomes (compared to chimps with 24), making interbreeding more readily possible.

Denisova molar, distal. [Image courtesy of Max Planck Institute for Evolutionary Anthropology].

One of the more tantalizing aspects of the report is the light it sheds not on the Denisovans or the Neandertals but on us anatomically modern human beings who live today. Why did we survive and flourish culturally when they did not?

One explanation may lie in the genetic differences between us and them, which can be studied for the first time in detail. In this paper, researchers identify specific changes in genes that are associated with brain complexity, synaptic connections, and speech development. According to the paper, “it is thus tempting to speculate that crucial aspects of synaptic transmission may have changed in modern humans.” In other words, tiny differences in DNA led to still relatively small differences in brain function that led to huge differences in culture.

Future technical advances will continue to shed new light on the complex story of recent human ancestry. By gaining ever-higher clarity on the genetic differences between Neandertals, Denisovans, and modern humans, we will come to know the story of our humanity in greater detail.

The paper ends with this reflection: “This [work] should ultimately aid in determining how it was that modern humans came to expand dramatically in population size as well as cultural complexity while archaic humans eventually dwindled in numbers and became physically extinct.”

The paper, “A High-Coverage Genome Sequence from an Archaic Denisovan Individual,” is published in the 30 August 2012 issue of Science, a publication of the American Association for the Advancement of Science.

Wednesday, July 18, 2012

Neandertal Medicine

Neandertals not only ate their vegetables. They used specific plants—even ones that tasted bitter—to treat their ailments. That’s the latest finding from the international team of researchers studying Neandertal remains at the El Sidrón archeological site in northern Spain. Discovered in 1994, El Sidrón has yielded thousands of samples from at least 13 Neandertal individuals.

Using newer techniques of microanalysis, the team studied the dental plaque recovered from the teeth of five individuals dating to about 50,000 years ago. Lodged in the plaque were tiny microfossil remains of various plants, providing evidence that Neandertals supplemented their diet of meat with a wide range of grains, herbs, and vegetables. The study is published this week in Naturwissenschaften (The Science of Nature).

CAPTION: Researchers working in El Sidrón Cave. Credit: CSIC Comunicación.

"The varied use of plants we identified suggests that the Neanderthal occupants of El Sidrón had a sophisticated knowledge of their natural surroundings which included the ability to select and use certain plants for their nutritional value and for self-medication. While meat was clearly important, our research points to an even more complex diet than has previously been supposed," according to Karen Hardy, a leader in the research team, according to a press release from the University of York.

Neandertals disappeared from Europe and Asia somewhere around 30,000 years ago, often sharing regions with modern humans for thousands of years. Only recently has it become clear that they depended heavily on plants as well as meat for their food.

"The evidence indicating this individual was eating bitter-tasting plants such as yarrow and camomile with little nutritional value is surprising. We know that Neanderthals would find these plants bitter, so it is likely these plants must have been selected for reasons other than taste," said Dr Stephen Buckley, a member of the research team.

The clear implication of the study—that Neandertals recognized the medicinal value of certain plants—provides further evidence of the sophistication of Neanderthal culture and technology. The full scope of Neandertal cultural interaction with modern humans remains an open question.

"El Sidrón has allowed us to banish many of the preconceptions we had of Neanderthals. Thanks to previous studies, we know that they looked after the sick, buried their dead and decorated their bodies. Now another dimension has been added relating to their diet and self-medication," according to Antonio Rosas, also on the research team.

CAPTION: Microscopically visible material entrapped in dental calculus samples – filamentous and cocci bacteria. Credit: Karen Hardy/Naturwissenschaften.

The article, "Neanderthal medics? Evidence for food, cooking and medicinal plants entrapped in dental calculus," is published in the current issue of Naturwissenschafen.

Thursday, June 14, 2012

Art in an Age of Neandertals

The oldest confirmed date for cave art has just been pushed back again. New research re-dates paintings in caves in northern Spain to a staggering 40,800 years ago, right at the time when anatomically modern humans (AMHs) like us were just arriving in the region and encountering Neandertals.

In fact, this art is so ancient that it raises the haunting possibility that Neandertals were the painters. If so, then modern humans are not the only form of humanity to create cave art. For now, however, the question of who painted this art is a matter of speculation, something that might be settled by further research.

In work published in the June 15, 2012 issue of Science, researchers studied 50 paintings in eleven caves in the northernmost part of Spain, including the UNESCO World Heritage sites of Altamira, El Castillo and Tito Bustillo. The work was conducted by an international team led by Alistair Pike of the University of Bristol.

The Corredor de los Puntos, El Castillo Cave, Spain. Red disks here have been dated to 34,000-36,000 years ago, and elsewhere in the cave to 40,600 years, making them examples of Europe's earliest cave art. Image courtesy of Pedro Saura.

This research comes on the heels of a re-dating of cave painting in France, recently pushed back to 37,000 years. The latest study adds almost another 4,000 years to the confirmed date of the oldest art. What’s the combined effect of the two studies? In just the past month, our view of the antiquity of art has jumped by nearly 10,000 years, prompting us to wonder how much further back it might go. After all, it is known that AMHs mixed pigments as far back as 100,000 years ago.

Using a new method called uranium-thorium dating, Pike’s research team took a closer look at an old find. They extracted tiny samples of naturally forming deposits that covered the paintings. By dating the deposits, scientists are able to discover the date before which the paint was applied. The date of more than forty thousand years ago, therefore, is a minimum date, suggesting that some of the paintings—here or elsewhere—may be even older.
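
The age calculation at the heart of uranium-thorium dating is conceptually simple, even though the published analysis must correct for complications such as detrital thorium. As a rough sketch (a simplified textbook form, not the authors' full procedure): if the calcite crust formed with essentially no thorium-230 of its own, the measured 230Th/238U activity ratio grows toward equilibrium at a known rate, and a minimum age falls out of the decay equation. The half-life and the example ratio below are approximations chosen for illustration.

import math

# Hedged sketch of a closed-system U-Th age calculation.  Assumes no initial
# 230Th and 234U/238U near secular equilibrium; real cave-art dating corrects
# for detrital thorium and uses the full disequilibrium equations.

TH230_HALF_LIFE_YEARS = 75_580.0                  # approximate half-life of thorium-230
LAMBDA_230 = math.log(2) / TH230_HALF_LIFE_YEARS  # decay constant per year

def u_th_age(activity_ratio_230_238):
    # Age in years from the measured (230Th/238U) activity ratio (0 < ratio < 1).
    return -math.log(1.0 - activity_ratio_230_238) / LAMBDA_230

print(round(u_th_age(0.31)))  # a ratio of about 0.31 corresponds to roughly 40,500 years

Because the crust grew on top of the paint, the calculated age dates the deposit, and the painting underneath can only be older.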

The specific painting that exceeds 40,800 years is a simple red disk, seemingly primitive when compared to paintings made later in the same caves. More striking are the handprint paintings on the wall of El Castillo cave, made by blowing paint and common in early cave art but now dated to 37,300 years ago.

Commenting on the age of the oldest painting, Pike pointed out the tight fit between the painting and the arrival of AMHs in northern Spain: “Evidence for modern humans in Northern Spain dates back to 41,500 years ago, and before them were Neanderthals. Our results show that either modern humans arrived with painting already part of their cultural activity or it developed very shortly after, perhaps in response to competition with Neanderthals – or perhaps the art is Neanderthal art,” Pike said in a press release issued by the University of Bristol.

The Panel of Hands, El Castillo Cave, Spain. A hand stencil has been dated to earlier than 37,300 years ago and a red disk to earlier than 40,600 years ago, making them the oldest cave paintings in Europe. Image courtesy of Pedro Saura.

Pike also speculated further on the possibility that researchers may someday identify some European cave art as Neandertal. He suggested that perhaps, “cave painting started before the arrival of modern humans, and was done by Neanderthals. That would be a fantastic find as it would mean the hand stencils on the walls of the caves are outlines of Neanderthals' hands, but we will need to date more examples to see if this is the case."

The article is entitled “U-series dating of Palaeolithic Art in 11 Caves in Spain” and appears in the June 15, 2012 issue of the journal Science.

Monday, May 14, 2012

How Old Is Art?

Confirmed dates for the world’s oldest art just got older, according to the report of an international research team published in the May 14 issue of the Proceedings of the National Academy of Sciences.

Dating back about 37,000 years, the art consists of engravings made in stone that has since fallen from the ceiling of a cave at Abri Castanet in southwestern France. While not as visually arresting as the more famous cave art found at Chauvet, the Castanet engravings are both older and represent what is very likely an earlier stage in the history of the Aurignacian culture, which spanned 40,000 to about 28,000 years ago. Some of the Chauvet paintings are now confirmed at between 30,000 and 32,000 years ago.

Credit: HTO. A replica of a painting, now in the public domain.

The Castanet engravings are simpler artistically and were located in the general living area of the cave. The Aurignacian culture that created both the paintings and the engravings is known for its many forms of art. According to New York University anthropology professor Randall White, one of the study's co-authors, the Aurignacians "had relatively complex social identities communicated through personal ornamentation, and they practiced sculpture and graphic arts."

"But unlike the Chauvet paintings and engravings, which are deep underground and away from living areas, the engravings and paintings at Castanet are directly associated with everyday life, given their proximity to tools, fireplaces, bone and antler tool production, and ornament workshops," White said in press release issued by NYU.

With more refined archeological techniques, the story of the rise of human symbolic culture is likely to become more complex and more ancient. While there may well have been bursts of cultural creativity in which symbolic advance occurred rapidly, additional findings may also suggest a more steady rise in the story of human art. The study, entitled “Context and dating of Aurignacian vulvar representations from Abri Castanet, France,” appears in the May 14, 2012 edition of PNAS.

Thursday, May 3, 2012

Human Intelligence: Does It Depend on a Genetic Error?

What makes humans different from the great apes? What makes our brains larger and more complex? We know that our DNA is remarkably similar to that of other mammals. What subtle genetic changes can explain such huge behavioral differences? One surprising possibility is that our brains are bigger and more complex not so much because of new genes but because of gene duplication.

One gene in particular—SRGAP2—plays a role in how brain cells migrate. It is found widely in mammals of all sorts, from mice to humans. In the great apes, the more archaic form of SRGAP2 results in a relatively slow spread of neurons throughout the brain. Twice in the ancient past, however, SRGAP2 was duplicated, first about 3.4 million years ago and then again around 2.4 million years ago. The second duplication occurred right around the time when the genus Homo separated from Australopithecus. It appears that as a result of these duplications, brains in the Homo lineage—including our own as Homo sapiens—are both large and complex in their number of neuronal connections and in their ability to process information.

A key piece of supporting evidence comes from recent discoveries of the role of SRGAP2 in the development of the human neocortex. When the distinctly human SRGAP2 variants are missing, normal human brain development is impaired. This research appears in two papers appearing May 3, 2012 in the journal Cell. According to one of the papers, “It is intriguing that the general timing of the potentially functional copies…corresponds to the emergence of the genus Homo from Australopithecus (2-3 mya). This period of human evolution has been associated with the expansion of the neocortex and the use of stone tools, as well as dramatic changes in behavior and culture.”

Caption: A team led by Scripps Research Institute scientists has found evidence that, as humans evolved, an extra copy of a brain-development gene allowed neurons to migrate farther and develop more connections. Credit: Photo courtesy of The Scripps Research Institute. Usage Restrictions: None

The uniquely human duplications work in surprising ways, especially the second duplication.  The original SRGAP2 remains present in humans today, along with the duplicated versions.  The second duplication—SRGAP2C—has the effect of interfering with the original SRGAP2.  The reason SRGAP2C interferes with SRGAP2 rather than boosting it is that the duplicated version is incomplete—in other words, an advantageous copying error.

According to one of the studies, once SRGAP2C appeared about 2.4 million years ago, it created a “dominant negative interaction equivalent to a knockdown of the ancestral copy…The incomplete nature of the segmental duplication was, therefore, ideal to establish the new function by virtue of its structure,” acting in a way that was “instantaneous” in terms of evolution.

"This innovation couldn't have happened without that incomplete duplication," according to Evan Eichler, another leader in the research team. "Our data suggest a mechanism where incomplete duplication of this gene created a novel function 'at birth'."

Even though SRGAP2 duplications seem to play a significant role in distinguishing human beings from the apes, other duplications and mutations are very likely to be involved in the story of human evolution. "There are approximately 30 genes that were selectively duplicated in humans," said Franck Polleux, one of the lead researchers involved in the study, in a press release from the journal. "These are some of our most recent genomic innovations."

Rather than standard mutations, "episodic and large duplication events could have allowed for radical – potentially earth-shattering – changes in brain development and brain function," according to Eichler. For these reasons, this is one of the most intriguing areas for research into the origins of human intelligence.

Whether other duplications—including incomplete duplications or erroneous copies—also help explain our complex brains is something that will be discovered in the next few years.

But what is surprising and somewhat sobering, just based on this SRGAP2 discovery, is how our much-vaunted human uniqueness seems to hang on such a fine thread. If the SRGAP2 duplication is even partly responsible for our complex brains, should we think that our intelligence arose because of a copying error or an incomplete duplication? Is the rise of intelligence and consciousness—truly one of the great events in the story of cosmic evolution—really just based in part on a fluke of nature? Religious or not, hardly anyone is likely to think that thinking is sheer accident.

The papers, Charrier et al.: "Inhibition of SRGAP2 function by its human-specific paralogs induces neoteny during spine maturation" and Dennis et al.: "Human-specific evolution of novel SRGAP2 genes by incomplete segmental duplication," appear in the journal Cell.

Thursday, April 26, 2012

Spreading Farming, Spreading Genes

Agriculture probably originated in the Middle East about 11,000 years ago. Over the next six thousand years, it spread to other parts of the globe, including northern Europe, gradually replacing hunting and gathering as the primary means of human survival.

How did it spread? Were hunter-gatherers converted to the efficiencies of agriculture? Or did farmers from the south spread north, bringing their agriculture with them?

A new study suggests that farming spread because farmers moved. The movement was slow, taking five to six thousand years to reach Scandinavia. Early in the process, farmers of southern ancestry lived side by side with their more northerly human cousins, who still lived by hunting and gathering. Eventually, after a thousand years or so, farmers interbred with hunter-gatherers and farming became the dominant way of life.

The new study, which appears in the April 27 issue of Science, is based on an analysis of four skeletons, all found in Sweden and dating from about 5,000 years ago. Three were hunter-gatherers and one was a farmer. All of them lived their entire lives close to where they were buried, the hunter-gatherers in flat graves and the farmer under a stone megalith like the one pictured below.

Caption: Several hundred megalith tombs are known from the Falbygden area, including Gökhem and Valle parishes in Östergötland, Sweden. Credit: Göran Burenhult

"We know that the hunter-gatherer remains were buried in flat-bed grave sites, in stark contrast to the megalithic sites that the farmers built," said Mattias Jakobsson, a senior author from Uppsala University. "The farmer we analyzed was buried under such a megalith, and that's just one difference that helps distinguish the two cultures," Jakobsson said in a press release issued by the journal.

What is most significant in this study comes from an analysis of the human DNA extracted from the four skeletons. By studying their DNA, researchers found that the farmer belonged to a community with ancestral roots in the eastern Mediterranean, most closely resembling today's Greeks and Cypriots. The hunter-gatherers, on the other hand, were more like today’s northern Europeans, most closely resembling today's Finns. "What is interesting and surprising is that Stone Age farmers and hunter-gatherers from the same time had entirely different genetic backgrounds and lived side by side for more than a thousand years, to finally interbreed," Jakobsson said.

Caption: The skeleton belongs to a young female in her 20s, and can be dated to around 4,700 years ago. Credit: Göran Burenhult

"The results suggest that agriculture spread across Europe in concert with a migration of people," added Pontus Skoglund, also of Uppsala University. "If farming had spread solely as a cultural process, we would not expect to see a farmer in the north with such genetic affinity to southern populations."

The article, entitled "Origins and Genetic Legacy of Neolithic Farmers and Hunter-Gatherers in Europe," appears in the April 27 issue of Science, published by the American Association for the Advancement of Science.

Monday, April 2, 2012

A Million Years of Fire

One of our newest technologies has just shed new light on one of our oldest.

When did our human ancestors learn to control and use fire? Armed with the latest high-tech tools, an international team of researchers has pushed the date back to 1 million years ago. That’s 300,000 years earlier than previous unambiguous dates.

The massive Wonderwerk Cave is in northern South Africa on the edge of the Kalahari. Previous excavations have shown extensive human occupation. Using the new techniques of micromorphological analysis and Fourier transform infrared microspectroscopy (mFTIR), researchers analyzed cave sediments at a far more detailed level than possible before.

Caption: This is a panoramic view of the entrance to Wonderwerk Cave, South Africa. Credit: H. Ruther. Usage Restrictions: None

In the cave sediments researchers found bits of ash from plants along with fragments of burned bone. Did the wind blow burning debris into the cave? The evidence—collected about 100 feet from the current opening of the cave—supports the conclusion that the fire burned in the cave. Also part of the proof: the surrounding surfaces are discolored.

“The analysis pushes the timing for the human use of fire back by 300,000 years, suggesting that human ancestors as early as Homo erectus may have begun using fire as part of their way of life,” anthropologist Michael Chazan said in a press release from the University of Toronto.

According to the paper, "Through the application of micromorphological analysis and Fourier transform infrared microspectroscopy (mFTIR) of intact sediments and examination of associated archaeological finds— fauna, lithics, and macrobotanical remains—we provide unambiguous evidence in the form of burned bone and ashed plant remains that burning events took place in Wonderwerk Cave during the early Acheulean occupation, approximately 1.0 Ma. To date, to the best of our knowledge, this is the earliest secure evidence for burning in an archaeological context."

Caption: Interior of Wonderwerk Cave. Images courtesy of M. Chazan.

"The control of fire would have been a major turning point in human evolution," says Chazan. "The impact of cooking food is well documented, but the impact of control over fire would have touched all elements of human society. Socializing around a camp fire might actually be an essential aspect of what makes us human."

How important are fire and cooking for human evolution? A recent book, Catching Fire: How Cooking Made Us Human by Richard Wrangham, argues that cooking is essential to our humanity. Now in the paper published on April 2, the team concludes that its study “is the most compelling evidence to date offering some support for the cooking hypothesis of Wrangham.”

Their work is published as “Microstratigraphic evidence of in situ fire in the Acheulean strata of Wonderwerk Cave, Northern Cape Province, South Africa,” in the April 2, 2012 issue of the Proceedings of the National Academy of Sciences.

Monday, March 19, 2012

Space Age View of Stone Age Settlements

Thousands of early human settlements have been located using computers and satellite images, according to a paper published on March 19 in the Proceedings of the National Academy of Sciences.

Researchers used computers to analyze satellite images of a 23,000 square kilometer region in the Upper Khabur Basin of northeastern Syria. The finding? They believe they can identify 14,312 possible sites of human settlements dating back eight thousand years. The region, a relatively small area roughly 100 miles on a side, was nearly 3% occupied at various points over the intervening millennia.

Harvard archeologist Jason Ur collaborated with MIT researcher Bjoern Menze to develop a system that identified settlements based on several factors. Old sites tend to leave mounds that show distinctive shapes and colors from the collapse of building materials such as mud bricks.
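
To make the idea concrete, here is a toy sketch of that kind of scoring step. The feature names and thresholds below are invented for illustration and are not the features or software the researchers actually used; the general approach is simply to score each candidate location on properties that ancient mounds tend to share and flag the ones that score highly.

# Toy illustration of scoring candidate sites from image-derived features.
# Feature names and thresholds are hypothetical, not from the published pipeline.

from dataclasses import dataclass

@dataclass
class Candidate:
    relief_m: float        # height of the mound above the surrounding terrain
    roundness: float       # 0..1, how close the footprint is to a circle
    soil_signature: float  # 0..1, spectral similarity to decayed mud brick

def looks_like_settlement(c: Candidate) -> bool:
    # Flag candidates whose shape, relief, and color resemble ancient mounds.
    score = 0
    score += c.relief_m >= 2.0        # mounds rise above the plain
    score += c.roundness >= 0.6       # collapse tends to leave rounded footprints
    score += c.soil_signature >= 0.7  # anthropogenic soils reflect light distinctively
    return score >= 2                 # require at least two of the three cues

print(looks_like_settlement(Candidate(relief_m=4.5, roundness=0.8, soil_signature=0.9)))  # True
print(looks_like_settlement(Candidate(relief_m=0.3, roundness=0.2, soil_signature=0.4)))  # False

Run over every pixel neighborhood in a 23,000 square kilometer image, even a crude rule like this can turn months of ground survey into a ranked list of places worth visiting.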

“With these computer science techniques, however, we can immediately come up with an enormous map which is methodologically very interesting, but which also shows the staggering amount of human occupation over the last 7,000 or 8,000 years,” Ur said in a press release issued by Harvard.

"What's more, anyone who comes back to this area for any future survey would already know where to go," he continued. "There's no need to do this sort of initial reconnaissance to find sites. This allows you to do targeted work, so it maximizes the time we have on the ground."

The article, “Mapping patterns of long-term settlement in Northern Mesopotamia at a large scale,” appears in the March 19 issue of PNAS.

Wednesday, March 14, 2012

Red Deer People? Really?

Did our human family tree just grow a new branch? That seems to be the tentative conclusion reported today in the online journal PLoS ONE.

A first analysis of human remains from two caves in southwest China has prompted researchers to make some astounding claims: These "Red Deer People" are not anatomically modern humans (AMH). Their remains date from 14,500 to 11,500 years ago, far more recent than anything similar ever found on the mainland of Asia. They shared their territory with modern humans just at the time when early agriculture was being developed. And—even more puzzling—they shared anatomical features with modern and archaic humans.

Caption: An artist's reconstruction of fossils from two caves in southwest China that have revealed a previously unknown Stone Age people and give a rare glimpse of a recent stage of human evolution with startling implications for the early peopling of Asia. The fossils are of a people with a highly unusual mix of archaic and modern anatomical features and are the youngest of their kind ever found in mainland East Asia, dated to just 14,500 to 11,500 years old.

Credit: Art copyright by Peter Schouten.

Who were they? The international team of researchers speaks of these early humans as the “Red Deer People,” named for the extinct species of deer they hunted and for Maludong, or “Red Deer Cave,” where some of the remains were discovered. The team was led by Professor Darren Curnoe of the University of New South Wales and Professor Ji Xueping of the Yunnan Institute of Cultural Relics and Archeology.

But researchers hesitate to draw any conclusions about species. "These new fossils might be of a previously unknown species, one that survived until the very end of the Ice Age around 11,000 years ago," says Professor Curnoe in a press release issued by UNSW. "Alternatively, they might represent a very early and previously unknown migration of modern humans out of Africa, a population who may not have contributed genetically to living people."

Although the remains were first discovered in 1979, they remained encased in rock until 2009. While the researchers have been able to compare anatomical features with modern and archaic human remains, they have not been able to extract DNA from the samples. According to the paper, “our ongoing attempts to extract DNA from a specimen from Maludong have so far proven unsuccessful owing to a lack of recoverable genetic material.”

"The discovery of the red-deer people opens the next chapter in the human evolutionary story – the Asian chapter – and it's a story that's just beginning to be told," says Professor Curnoe.

The paper is entitled "Human Remains from the Pleistocene-Holocene Transition of Southwest China Suggest a Complex Evolutionary History for East Asians" and appears in the March 14, 2012 issue of PLoS ONE.

Tuesday, December 6, 2011

Evolutionary Fast-Track for Human Brains

More than 35 years ago, Allan Wilson and Mary-Claire King made an astonishing proposal. Maybe what separates humans and chimps is not just our genes. Maybe it’s also how our genes are expressed or regulated.

Research published in today’s issue of PLoS Biology builds on decades of intervening advances in evolution and genetics and takes the question much further. It argues that the difference in cognitive ability between humans and nonhuman primates is explained in large part by differences in gene expression, especially during the critical periods when young brains are being formed.

Humans share many of their genes with other species, especially chimps. In fact, we share so many genes that it is hard to explain how we can be so different in terms of cognitive ability. If genes alone make the difference, how can so little genetic difference account for the gap between chimp and human brains? And how could a mere six million years of human-chimp divergence provide enough time for so much genetic change to accumulate?

The answer seems to lie in the relatively rapid evolution of gene expression. While the genes themselves evolved slowly, the regulation of their expression evolved much more rapidly. It is not just the evolution of genes but the evolution of gene expression that drives the rapid divergence between human and chimp brains.

This is especially true of the genes that control the development of the prefrontal cortex. In other words, there has been relatively rapid evolution in the genetic mechanisms that regulate the genes directly responsible for the early-childhood neural development of the critically important prefrontal cortex, which is involved in abstract thinking, planning, social intelligence, and working memory.

According to the article, “humans display a 3-5 times faster evolutionary rate in divergence in developmental patterns, compared to chimpanzees.” Most important, however, is the way this research identifies specific regulators that have evolved rapidly since human-chimp divergence. These regulators are “micro-RNAs,” some of which are specifically identified in the article, with the claim that “changes in the expression of a few key regulators may have been a major driving force behind rapid evolution of the human brain.”
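
To make the idea of a faster rate of divergence concrete, here is a toy calculation in Python. The numbers are simulated and this is not the authors' data or statistical method; it simply shows how an outgroup such as the macaque can serve as a reference for asking how far the human and chimpanzee expression trajectories have each drifted over developmental time.

```python
# Toy illustration of asymmetric expression divergence for a single gene.
# The macaque serves as an outgroup: if the human lineage evolved faster,
# the human trajectory should sit farther from the macaque trajectory
# than the chimp trajectory does.
import numpy as np

rng = np.random.default_rng(1)
ages = np.linspace(0, 1, 20)               # normalized developmental time

ancestral = np.sin(2 * np.pi * ages)       # shared ancestral time course
macaque = ancestral + rng.normal(0, 0.02, ages.size)
chimp = ancestral + 0.1 * ages + rng.normal(0, 0.02, ages.size)   # small lineage-specific shift
human = ancestral + 0.4 * ages + rng.normal(0, 0.02, ages.size)   # larger lineage-specific shift

def divergence(a, b):
    """Root-mean-square difference between two expression trajectories."""
    return np.sqrt(np.mean((a - b) ** 2))

ratio = divergence(human, macaque) / divergence(chimp, macaque)
print(f"Human-vs-chimp divergence ratio in this simulation: {ratio:.1f}")
```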

According to the study’s senior author, Philipp Khaitovich, this finding suggests that "identifying the exact genetic changes that made us think and act like humans might be easier than we previously imagined." Khaitovich was quoted in a press release issued by the journal PLoS Biology.

The article is entitled "Micro-RNA-Driven Developmental Remodeling in the Brain Distinguishes Humans from Other Primates" and appears in the December 6 issue of PLoS Biology, where it is freely available to the public.

Wednesday, November 30, 2011

The Great Migration: Tools Mark the Trail

Anatomically modern humans (AMH)—people who looked pretty much like us—migrated out of Africa tens of thousands of years ago and settled across Asia and Europe.

Just who were these people, how long ago did they migrate, and what route did they first take? These are some of the biggest questions in archeology. Now at last researchers seem to be closing in on concrete answers.

In a report published in the November 30 issue of the open-access journal PLoS ONE, an international research team led by Jeffrey Rose presents its analysis of recent work in southern Oman, located on the southeastern corner of the Arabian peninsula.

For years, researchers have debated the earliest migration route. Was it across the Red Sea to the Arabian boot heel (sea levels being much lower then)? Or was it north from Egypt along the Mediterranean?

Rose and his team found evidence suggesting that AMH residents of the Nile valley migrated—with their distinctive tool technology—to present-day Oman. Their analysis of over 100 sites in Oman convinced the researchers that the tool culture was the same in both settings. In other words, one culture spanned two continents, clearly supporting the idea of human migration.

Scientists have long known about the Nile valley culture, which they call “Nubian.” The breakthrough reported here is the strong evidence that Nubian toolmakers made their way out of Africa to Arabia, bringing their characteristic stonecutting techniques with them.

The date of migration, according to the report, is at least 106,000 years ago, perhaps earlier.

No human remains were found with the stone tools. This leaves open the possibility that some other humans—“archaic” and not anatomically modern—may have made the tools. The researchers dismiss this idea on the grounds that AMH seem to have been the only form of human present in North Africa at the time of the migration.

“After a decade of searching in southern Arabia for some clue that might help us understand early human expansion, at long last we've found the smoking gun of their exit from Africa,” according to Rose, a Research Fellow at the University of Birmingham.

Another surprise contained in the report is that the stone tools were found inland rather than right along the coast. “For a while,” remarks Rose, “South Arabia became a verdant paradise rich in resources – large game, plentiful freshwater, and high-quality flint with which to make stone tools,” according to a press release issued by PLoS One. One possibility is that the “southern route” out of Africa along the southern Arabian peninsula was not so much a coastal expressway to Asia and Europe as it was a settling of the interior of Arabia.

The report, “The Nubian Complex of Dhofar, Oman: An African Middle Stone Age Industry in Southern Arabia,” appears in the November 30, 2011 issue of PLoS ONE.