
Thursday, July 18, 2013

Did Neandertals Wear Ornaments?


A small but tantalizing find provides further evidence for Neandertal culture.  Working in the foothills of the Alps just north of Venice, Italy, researchers have discovered and analyzed a small marine shell that originally came from about 60 miles away.  It was thinly coated with a dark red substance that turns out to be pure hematite and was most likely used as a pigment.  One possibility is that the shell was used as an ornament.

The paper, freely available online in the journal PLoS One, dates the shell’s pigmentation to about 45,000 years ago, just before the arrival of so-called “modern” humans in Europe.

Photo Caption: A shell possibly "painted" by Neandertals about 45,000 years ago.  Photo available from PLoS One.

According to the paper, “deliberate transport and coloring of an exotic object, and perhaps its use as pendant, was a component of Neandertal symbolic culture, well before the earliest appearance of the anatomically modern humans in Europe.”

Quoting more of the paper, “this discovery adds to the ever-increasing evidence that Neandertals had symbolic items as part of their culture.”

Debates about Neandertal culture have intensified recently, in part because of genetic evidence of interbreeding between Neandertals and the modern humans coming into Asia and Europe.  While these modern humans began their migration out of Africa about 80,000 years ago and probably interbred around 55,000 years ago, they did not reach Europe until more like 40,000 years ago.  If all these dates hold up in future research, this shell does provide a small but intriguing hint about the culture of Neandertals at just about the time of their encounter with “modern” humans. 

So who exactly is modern?  The differences between ourselves (the humans we like to call “modern”) and the Neandertals are not as great as we once imagined.  The paper ends with these words: “Future discoveries will only add to our appreciation of Neandertals shared capacities with us.”

The paper, entitled "An Ochered Fossil Marine Shell From the Mousterian of Fumane Cave, Italy," appears in the current issue of PLoS One and is freely available online.

Thursday, March 7, 2013

What a Smart Mouse Can Tell Us about Evolution


Just a few years ago, we thought that brains were all about neurons.  Sure, we also have glial cells, but the job of the lowly glia is to take care of the neurons, which do all the serious cognitive work. 

But why are the glia of humans and other primates so large and varied in their shape and structure?  Why are they so different from the simpler, smaller glia found in mice and other rodents?  Could the difference play a role in the evolution of human intelligence?

One way to compare a mouse and a human is to create a mouse that is part human.  That’s exactly what researchers at the University of Rochester did.  They implanted human cells into mouse brains.  More precisely, they implanted human glial progenitor cells into newborn mouse pups. 

What they got were chimeras, mice with human cells integrated into their brains.  When the researchers examined the brains of these chimeric mice, they found that the human cells proliferated and were widely present throughout the brain. Although interacting with mouse brain cells, the human cells remained distinctly human in their unusually large size and varied structures.

Photo credit: A human astrocyte, cultured for 23 weeks and stained for GFAP. From Wikimedia Commons. Date: 24 February 2012. Author: Bruno Pascal.

Most surprising is that the chimeric mice were smarter than unaltered mice born in the same litters.  Human glia in a mouse brain seem to make a smarter mouse.

Why?  The answer probably involves one type of glial cell, the astrocyte. Compared to other species, human brains have many more astrocytes.  Ours are larger and more varied in their structure, capable of connecting many neurons and coordinating the activity that occurs at many synapses.

Based on this study, published in the March 7, 2013 issue of Cell Stem Cell, we now know that human astrocytes boost intelligence in chimeric mice as measured by standard testing procedures.  

This is pretty good evidence to suggest that the evolution of the larger, more complex glial cells was a critical aspect of the evolution of higher intelligence.  At least that is the conclusion drawn by one of the senior authors of the paper, Steven Goldman. “In a fundamental sense, we are different from lower species,” he said, according to a press release from the University of Rochester. “Our advanced cognitive processing capabilities exist not only because of the size and complexity of our neural networks, but also because of the increase in functional capabilities and coordination afforded by human glia.”

What makes this study intriguing is that it uses stem cell technology to study brain function and to learn something important about evolution.  By implanting stem cells to create chimeric mice, researchers learn that glia play a critically important role in intelligence and that evolved changes in glial cells are a key part of the story of the rise of intelligence.

Concerning the role of glial cells in the complex brain, Maiken Nedergaard, another senior author, had this to say:  “I have always found the concept that the human brain is more capable because we have more complex neural networks to be a little too simple, because if you put the entire neural network and all of its activity together, you just end up with a super computer.”

“But human cognition is far more than just processing data, it is also comprised of the coordination of emotion with memory that informs our higher abilities to abstract and learn,” Nedergaard added.

And concerning what chimeric mice have to teach us about evolution, Steven Goldman made this comment: “This study indicates that glia are not only essential to neural transmission, but also suggests that the development of human cognition may reflect the evolution of human-specific glial form and function.”

Or to quote the original paper: “These observations strongly support the notion that the evolution of human neural processing, and hence the species-specific aspects of human cognition, in part may reflect the course of astrocytic evolution.”

The paper does not address the interesting ethical questions raised by making smarter mice.  Over the past decade, ethicists have debated the moral legitimacy of chimeric animals.  One point of concern has been the creation of nonhuman animals with human brain cells.  To defend this practice, it is often said that a mouse brain with human cells is still a mouse brain.  It still has the structure or architecture of a mouse brain.  It may have human cells, but in no way is it a human brain or even a half mouse/half human brain.

This study suggests we should take a closer look at that line of thinking.  Maybe it is true that adding human neurons to a mouse brain does not change the mouse brain structure.  But this study implies that adding human astrocytes to a mouse brain may introduce a small but significant change in structure and function.

The study is clear about the fact that these chimeric mice are more intelligent than the unmodified mice.  Their brains are quite literally faster.

Once again, Goldman: “The bottom line is that these mice demonstrated an increase in plasticity and learning within their existing neural networks, essentially changing their functional capabilities.”

These animals have been cognitively “elevated,” to use a word sometimes found in the debate.  Probably no one will object to the idea of a slightly smarter mouse.  Researchers take special care to make sure these mice do not breed and produce pups of their own.  But even if they did, the added intelligence would not pass to future generations.  They would produce normal lab mice. 

Even so, this study—combining stem cell technology, neuroscience, and evolution in one elegant package—raises intriguing moral questions.  Are we smart enough to know how far we should go in creating smarter mice?

The study, entitled “Forebrain engraftment by human glial progenitor cells enhances synaptic plasticity and learning in adult mice,” appears in the March 7, 2013 issue of Cell Stem Cell.


Tuesday, January 22, 2013

Asians, Europeans, and Neandertals

New research suggests that Europeans and Asians diverged at least 40,000 years ago, starting a process leading to the subtle differences that distinguish people to this day.

Working with bones discovered in 2003, researchers at the Max Planck Institute for Evolutionary Anthropology in Leipzig were able to reconstruct portions of DNA from an individual who lived in China about 40,000 years ago. Earlier analysis of the bones suggested that this individual showed “archaic” features, somewhat like Neandertal bones.

Credit: A Photograph of China's Empress Dowager, taken in the 1890s by Xunling, the Imperial Court Photographer. In the public domain.

The Max Planck team, led by Svante Pääbo, is well-known for work in producing the virtually complete Neandertal genome. In addition, using just a tiny fragment of a finger bone, this team produced the genome of a previously unknown form of humanity, called the Denisovans.

In their earlier work, they discovered that Europeans and Asians are descended in part from Neandertals, who disappeared about 30,000 years ago. In addition, some Asians, especially those living on the islands south of Asia, are partly descended from the Denisovans.

One of the reasons why the team was interested in this new sample was to look more deeply into the relationship between Europeans and Asians and to ask what role Neandertal and Denisovan interbreeding might have played.

Comparing the newly reconstructed DNA sequence from the 40,000-year-old bones with present-day genomes, they found they were looking at an individual who was also descended from Neandertals, pretty much the way Europeans and Asians are today. They also learned that this individual showed no evidence of Denisovan interbreeding.

What this means, they suggest, is that 40,000 years ago, an early version of anatomically modern Eurasians lived in China, near Beijing. While this human community was very much like the humans moving into Europe at about the same time, these two lineages were beginning a process of divergence.

On the basis of additional comparisons, the team concluded that the early-modern human community in China about 40,000 years ago was closely related to today’s Native Americans.

The report is also significant because it shows the power of new approaches to DNA extraction and sequencing. In their raw form, the samples extracted from the bones contained mostly DNA from microorganisms. In fact the human DNA was less than one-tenth of one percent of the total DNA. Even so, researchers were able to establish reliable human sequences, suitable for comparison with other human genomes.

What does that mean? At the very least, it means that many more discoveries like this lie ahead. The new technology means that old findings take on new significance.

The research appears online January 22, 2013, in the Proceedings of the National Academy of Sciences, as "DNA analysis of an early modern human from Tianyuan Cave, China."

Wednesday, November 7, 2012

Better Technology, Better Weapons

Ongoing archeological discoveries from coastal South Africa point consistently to a technological and cultural explosion occurring there more than 70,000 years ago. The latest paper, appearing in the November 7 issue of the journal Nature, fills in more detail about remarkable advances in stone tool technology that only appear in Europe some 50,000 years later.

The new findings, reported by an international team of researchers led by Curtis Marean, help fill in a more comprehensive picture of the culture that flourished on the coast of South Africa for thousands of years. In 2009, Marean's team published a report showing how the controlled use of fire played a key role in the engineering of stone tools. The 2012 paper provides evidence that this technology was used for at least 11,000 years by the inhabitants of the coast.

"Eleven thousand years of continuity is, in reality, an almost unimaginable time span for people to consistently make tools the same way," said Marean. "This is certainly not a flickering pattern."

Photo caption: These microlith blades show a flat edge with a rounded "cutting" edge. Credit: Simen Oestmo. Used by permission of the ASU Institute of Human Origins for the purposes of illustrating coverage of the accompanying article.

One possibility suggested by this research is that the 70,000 year old technology found in South Africa was brought out of Africa by modern humans. If so, it may help explain why Neandertals disappeared as modern humans entered Europe and Asia. Advances in technology made it possible to create light-weight points for spears or arrows, most likely used for small spears launched by spear-throwing devices known as atlatls, which effectively extend the length of the throwing arm.

"When Africans left Africa and entered Neanderthal territory they had projectiles with greater killing reach, and these early moderns probably also had higher levels of pro-social (hyper-cooperative) behavior. These two traits were a knockout punch. Combine them, as modern humans did and still do, and no prey or competitor is safe," said Marean. "This probably laid the foundation for the expansion out of Africa of modern humans and the extinction of many prey as well as our sister species such as Neanderthals."

If there is any truth to this conjecture, it is a sobering truth. This technological advance makes it easier to kill.

The new paper reports on findings at the Pinnacle Point excavation site, some 50 miles from Blombos Cave, home to similar findings and to the first "chemical engineering laboratory" for the production of the pigment ochre. Whoever lived there was technologically and culturally advanced, with all the ambiguities that implies.

The paper, "An Early and Enduring Advanced Technology Originating 71,000 Years Ago in South Africa," appears in the November 7 issue of the journal Nature.

Wednesday, July 18, 2012

Neandertal Medicine

Neandertals not only ate their vegetables. They used specific plants—even ones that tasted bitter—to treat their ailments. That’s the latest finding from the international team of researchers studying Neandertal remains at the El Sidrón archeological site in northern Spain. Discovered in 1994, El Sidrón has yielded thousands of samples from at least 13 Neandertal individuals.

Using newer techniques of microanalysis, the team studied dental plaque recovered from the teeth of five individuals dating to about 50,000 years ago. Lodged in the plaque were tiny microfossil remains of various plants, providing evidence that Neandertals supplemented their diet of meat with a wide range of grains, herbs, and vegetables. The study is published this week in Naturwissenschaften (The Science of Nature).

CAPTION: Researchers working in El Sidrón Cave. Credit: CSIC Comunicación.

"The varied use of plants we identified suggests that the Neanderthal occupants of El Sidrón had a sophisticated knowledge of their natural surroundings which included the ability to select and use certain plants for their nutritional value and for self-medication. While meat was clearly important, our research points to an even more complex diet than has previously been supposed," according to Karen Hardy, a leader in the research team, according to a press release from the University of York.

Neandertals disappeared from Europe and Asia somewhere around 30,000 years ago, often sharing regions with modern humans for thousands of years. Only recently has it become clear that they depended heavily on plants as well as meat for their food.

"The evidence indicating this individual was eating bitter-tasting plants such as yarrow and camomile with little nutritional value is surprising. We know that Neanderthals would find these plants bitter, so it is likely these plants must have been selected for reasons other than taste," said Dr Stephen Buckley, a member of the research team.

The clear implication of the study—that Neandertals recognized the medicinal value of certain plants—provides further evidence of the sophistication of Neanderthal culture and technology. The full scope of Neandertal cultural interaction with modern humans remains an open question.

"El Sidrón has allowed us to banish many of the preconceptions we had of Neanderthals. Thanks to previous studies, we know that they looked after the sick, buried their dead and decorated their bodies. Now another dimension has been added relating to their diet and self-medication," according to Antonio Rosas, also on the research team.

CAPTION: Microscopically visible material entrapped in dental calculus samples – filamentous and cocci bacteria. Credit: Karen Hardy/Naturwissenschaften.

The article, "Neanderthal medics? Evidence for food, cooking and medicinal plants entrapped in dental calculus," is published in the current issue of Naturwissenschaften.

Monday, May 14, 2012

How Old Is Art?

Confirmed dates for the world’s oldest art just got older, according to the report of an international research team published in the May 14 issue of the Proceedings of the National Academy of Sciences.

Dating back about 37,000 years, the art consists of engravings made in stone that has since fallen from the ceiling of a cave at Abri Castanet in southwestern France. While not as visually arresting as the more famous cave art found at Chauvet, the Castanet engravings are both older and represent what is very likely an earlier stage in the history of the Aurignacian culture, which spanned 40,000 to about 28,000 years ago. Some of the Chauvet paintings are now confirmed at between 30,000 and 32,000 years ago.

Credit: HTO. A replica of a painting, now in the public domain.

The Castanet engravings are artistically simpler and were located in the general living area of the cave. The Aurignacian culture that created both the paintings and the engravings is known for its many forms of art. According to New York University anthropology professor Randall White, one of the study's co-authors, the Aurignacians "had relatively complex social identities communicated through personal ornamentation, and they practiced sculpture and graphic arts."

"But unlike the Chauvet paintings and engravings, which are deep underground and away from living areas, the engravings and paintings at Castanet are directly associated with everyday life, given their proximity to tools, fireplaces, bone and antler tool production, and ornament workshops," White said in press release issued by NYU.

With more refined archeological techniques, the story of the rise of human symbolic culture is likely to become more complex and more ancient. While there may well have been bursts of cultural creativity in which symbolic advance occurred rapidly, additional findings may also suggest a more steady rise in the story of human art. The study, entitled “Context and dating of Aurignacian vulvar representations from Abri Castanet, France,” appears in the May 14, 2012 edition of PNAS.

Thursday, May 3, 2012

Human Intelligence: Does It Depend on a Genetic Error?

What makes humans different from the great apes? What makes our brains larger and more complex? We know that our DNA is remarkably similar to that of other mammals. What subtle genetic changes can explain such huge behavioral differences? One surprising possibility is that our brains are bigger and more complex not so much because of new genes but because of gene duplication.

One gene in particular—SRGAP2—plays a role in how brain cells migrate. It is found widely in mammals of all sorts, from mice to humans. In the great apes, the more archaic form of SRGAP2 results in a relatively slow spread of neurons throughout the brain. Twice in the ancient past, however, SRGAP2 was duplicated, first about 3.4 million years ago and then again around 2.4 million years ago. The second duplication occurred right around the time when the genus Homo separated from Australopithecus. It appears that as a result of these duplications, brains in the Homo lineage—including our own as Homo sapiens—are both large and complex in their number of neuronal connections and in their ability to process information.

A key piece of supporting evidence comes from recent discoveries of the role of SRGAP2 in the development of the human neocortex. When the distinctly human SRGAP2 variants are missing, normal human brain development is impaired. This research appears in two papers appearing May 3, 2012 in the journal Cell. According to one of the papers, “It is intriguing that the general timing of the potentially functional copies…corresponds to the emergence of the genus Homo from Australopithecus (2-3 mya). This period of human evolution has been associated with the expansion of the neocortex and the use of stone tools, as well as dramatic changes in behavior and culture.”

Caption: A team led by Scripps Research Institute scientists has found evidence that, as humans evolved, an extra copy of a brain-development gene allowed neurons to migrate farther and develop more connections. Credit: Photo courtesy of The Scripps Research Institute.

The uniquely human duplications work in surprising ways, especially the second duplication. The original SRGAP2 remains present in humans today, along with the duplicated versions. The second duplication—SRGAP2C—has the effect of interfering with the original SRGAP2. The reason SRGAP2C interferes with SRGAP2 rather than boosting it is that the duplicated version is incomplete—in other words, an advantageous copying error.

According to one of the studies, once SRGAP2C appeared about 2.4 million years ago, it created a “dominant negative interaction equivalent to a knockdown of the ancestral copy…The incomplete nature of the segmental duplication was, therefore, ideal to establish the new function by virtue of its structure,” acting in a way that was “instantaneous” in terms of evolution.

"This innovation couldn't have happened without that incomplete duplication," according to Evan Eichler, another leader in the research team. "Our data suggest a mechanism where incomplete duplication of this gene created a novel function 'at birth'."

Even though SRGAP2 duplications seem to play a significant role in distinguishing human beings from the apes, other duplications and mutations are very likely to be involved in the story of human evolution. "There are approximately 30 genes that were selectively duplicated in humans," said Franck Polleux, one of the lead researchers involved in the study, in a press release from the journal. "These are some of our most recent genomic innovations."

Rather than standard mutations, "episodic and large duplication events could have allowed for radical – potentially earth-shattering – changes in brain development and brain function," according to Eichler. For these reasons, this is one of the most intriguing areas for research into the origins of human intelligence.

Whether other duplications—including “incomplete” duplications or erroneous copies—also explain our complex brains is something that will be discovered in the next few years.

But what is surprising and somewhat sobering, just based on this SRGAP2 discovery, is how our much-vaunted human uniqueness seems to hang on such a fine thread. If the SRGAP2 duplication is even partly responsible for our complex brains, should we think that our intelligence arose because of a copying error or an incomplete duplication? Is the rise of intelligence and consciousness—truly one of the great events in the story of cosmic evolution—really just based in part on a fluke of nature? Religious or not, hardly anyone is likely to think that thinking is sheer accident.

The papers, Charrier et al.: "Inhibition of SRGAP2 function by its human-specific paralogs induces neoteny during spine maturation" and Dennis et al.: "Human-specific evolution of novel SRGAP2 genes by incomplete segmental duplication," appear in the journal Cell.

Thursday, April 19, 2012

Synthetic Biology: Is There Life beyond DNA?

Life as we know it is based on DNA and RNA. Could it have been otherwise? Might other worlds have life based on a different “genetic” system? We may never know for sure.

But we do know that synthetic biology is moving briskly toward the goal of engineered life beyond DNA and RNA.

Recall that in “DNA” and “RNA,” the “NA” part stands for “nucleic acid.” It’s the four bases strung along each nucleic acid that carry the genetic information in a chemical code. The “D” and the “R,” however, stand for the sugars (deoxyribose and ribose) that hold the bases in place, allowing them to form pairs and to copy themselves. Can other sugars work as well?

Recent work in synthetic biology has led beyond DNA and RNA to xeno-nucleic acids or “XNAs.” Now, using six different forms of XNAs, an international team of researchers led by Vitor Pinheiro reports success in getting XNAs to store and propagate information. One of their XNAs actually “evolved” by responding to imposed selective constraints. Their work is published in the April 20, 2012 issue of the journal Science.

Caption: Courtesy--National Human Genome Research Institute

In a commentary on the research, Gerald F. Joyce writes in Science that this work has implications for the “understanding of life itself.” In addition, it opens new insight into the possible origins of life on our planet or elsewhere in the cosmos.

At the same time, far more work lies ahead before synthetic biologists create XNA-based life. Pinheiro’s team was able to get their synthetic XNA “genes” to interact with DNA, but “they have not yet realized a synthetic genetic system.” One big challenge is in getting XNA sequences to copy themselves the way DNA does. Some XNAs can copy themselves to DNA and back again to XNA, but XNA-to-XNA copying is not reliable.

According to Joyce, however, “future studies are likely to yield improvements of the various XNA-to-XNA copying reaction.” If that happens, synthetic biology will take yet another step toward “synthetic genetics.”

All this prompts a warning from Joyce: “Synthetic biologists are beginning to frolic on the worlds of alternative genetics but must not tread into areas that have the potential to harm our biology.” As ever, greater knowledge brings greater risks. More than ever, public awareness and careful thought are needed.

The research article, "Synthetic Genetic Polymers Capable of Heredity and Evolution" and the commentary, "Toward an Alternative Biology," are both published in the April 20, 2012 issue of Science, the journal of the American Association for the Advancement of Science.

Monday, April 2, 2012

A Million Years of Fire

One of our newest technologies has just shed new light on one of our oldest.

When did our human ancestors learn to control and use fire? Armed with the latest high tech tools, an international team of researchers has pushed the date back to 1 million years. That’s 300,000 years earlier than previous unambiguous dates.

The massive Wonderwerk Cave is in northern South Africa on the edge of the Kalahari. Previous excavations have shown extensive human occupation. Using the new techniques of micromorphological analysis and Fourier transform infrared microspectroscopy (mFTIR), researchers analyzed cave sediments at a far more detailed level than possible before.

Caption: This is a panoramic view of the entrance to Wonderwerk Cave, South Africa. Credit: H. Ruther.

In the cave sediments researchers found bits of ash from plants along with fragments of burned bone. Did the wind blow burning debris into the cave? The evidence—collected about 100 feet from the current opening of the cave—supports the conclusion that the fire burned in the cave. Also part of the proof: the surrounding surfaces are discolored.

“The analysis pushes the timing for the human use of fire back by 300,000 years, suggesting that human ancestors as early as Homo erectus may have begun using fire as part of their way of life," anthropologist Michael Chazan said in a press release from the University of Toronto.

According to the paper, "Through the application of micromorphological analysis and Fourier transform infrared microspectroscopy (mFTIR) of intact sediments and examination of associated archaeological finds— fauna, lithics, and macrobotanical remains—we provide unambiguous evidence in the form of burned bone and ashed plant remains that burning events took place in Wonderwerk Cave during the early Acheulean occupation, approximately 1.0 Ma. To date, to the best of our knowledge, this is the earliest secure evidence for burning in an archaeological context."

Caption: Interior of Wonderwerk Cave. Images courtesy of M. Chazan.

"The control of fire would have been a major turning point in human evolution," says Chazan. "The impact of cooking food is well documented, but the impact of control over fire would have touched all elements of human society. Socializing around a camp fire might actually be an essential aspect of what makes us human."

How important are fire and cooking for human evolution? A recent book, Catching Fire: How Cooking Made Us Human by Richard Wrangham, argues that cooking is essential to our humanity. Now in the paper published on April 2, the team concludes that its study “is the most compelling evidence to date offering some support for the cooking hypothesis of Wrangham.”

Their work is published as “Microstratigraphic evidence of in situ fire in the Acheulean strata of Wonderwerk Cave, Northern Cape Province, South Africa,” in the April 2, 2012 issue of the Proceedings of the National Academy of Sciences.

Wednesday, January 11, 2012

Are We Alone?

Our sun is a star, one of hundreds of billions of stars.

In the past 16 years, scientists have found more than 700 planets orbiting a few of the stars beyond our sun. These distant planets are often called exoplanets, short for extra-solar planets.

Just how many exoplanets are there in our Milky Way galaxy? Were researchers just extremely lucky, looking where exoplanets exist? Or do they exist nearly everywhere?

Research published in the January 11, 2012 issue of Nature supports the idea that planets are as common as stars. Each of the 100 billion stars in our galaxy has on average at least one planet.

If life exists out there, it most likely exists on planets that are roughly like our earth: not too big, neither too far from their sun (perpetual winter) nor too close (blazing heat). Of all the planets, how many are roughly on the scale of earth? Half of them at least, maybe more.
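To see where the “tens of billions of life-friendly planets” mentioned below comes from, here is the back-of-the-envelope arithmetic (a minimal sketch in Python; the inputs are the rough figures quoted in this post, not precise measurements):

```python
# Rough, illustrative estimate only: the inputs are the approximate
# figures quoted in this post, not precise measurements.
stars_in_galaxy = 100e9        # ~100 billion stars in the Milky Way
planets_per_star = 1.0         # "at least one planet" per star, on average
earth_scale_fraction = 0.5     # "half of them at least, maybe more"

earth_scale_planets = stars_in_galaxy * planets_per_star * earth_scale_fraction
print(f"Roughly {earth_scale_planets / 1e9:.0f} billion Earth-scale planets")
# Prints: Roughly 50 billion Earth-scale planets
```

Since each input is a lower bound, the real number could easily be higher.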

Caption: This artist's cartoon view gives an impression of how common planets are around the stars in the Milky Way. The planets, their orbits and their host stars are all vastly magnified compared to their real separations. A six-year search that surveyed millions of stars using the microlensing technique concluded that planets around stars are the rule rather than the exception. The average number of planets per star is greater than one. Credit: ESO/M. Kornmesser

Arnaud Cassan of the Institut d’Astrophysique de Paris, lead author of the paper, explained in a press release issued by the European Southern Observatory (ESO): “We have searched for evidence for exoplanets in six years of microlensing observations. Remarkably, these data show that planets are more common than stars in our galaxy. We also found that lighter planets, such as super-Earths or cool Neptunes, must be more common than heavier ones.”

"We used to think that the Earth might be unique in our galaxy. But now it seems that there are literally billions of planets with masses similar to Earth orbiting stars in the Milky Way," according to Daniel Kubas, co-lead author of the paper.

If there are tens of billions of life-friendly planets just in our own galaxy, how likely is it that life exists out there somewhere? And if life exists, has intelligence evolved?

These are ancient religious and philosophical questions. The latest science certainly tilts the debate in favor of life. Sure, it’s possible that earth beat the odds: of the tens of billions of life-friendly planets, only ours has life.

Right now, there’s no evidence either way. All that science can tell us is that the planets are there, ready for the spark of life to get started.

So we are left to gaze at the night sky and to wonder as never before. For each star, there’s a planet. Is anyone out there looking back?

In Christian theology, one of the earliest statements about the Holy Spirit is that the Spirit is the “giver of life.” Hildegard of Bingen (Symphonia) put it this way: “God our life is the life of all.” Or consider John Calvin, who says the Spirit is "everywhere diffused, sustains all things, causes them to grow, and quickens them in heaven and on earth."

To believe in God is to believe in the life-giving presence of God, not just here but everywhere. Thanks to this research, it turns out that the Spirit has many more planets on which to give life.

The article, "One or more bound planets per Milky Way star from microlensing observations", by A. Cassan et al., appears in the 12 January issue of the journal Nature.

Monday, January 9, 2012

Genes, Hybrids, and Giant Tortoises

Charles Darwin visited the Galápagos Islands in 1835. As he moved from island to island, he saw the subtle differences between finches, tortoises, and other animals. These observations helped lead Darwin to the theory of evolution as an expanding “tree of life,” which he first sketched in a notebook entry just two years later, in 1837.

The great tortoises of the Galápagos could not fail to impress. The greatest of all, the tortoise Chelonoidis elephantopus, can live to be a hundred years old and grow to six feet and almost 900 pounds.

Until now, it was believed that whalers hunted the great C. elephantopus to extinction shortly after Darwin’s visit. Now, however, new research suggests that a few of the great tortoises may still be alive.

Caption: C. becki tortoises are native to Isabela Island in the Galápagos chain and have a more dome-shaped shell. Credit: Courtesy Yale University.

Researchers have found what they believe are direct offspring of purebred C. elephantopus tortoises. By testing the genes of living tortoises, researchers concluded that they were studying hybrids. One parent was from a related species, C. becki. But the other parent was clearly C. elephantopus. And since the living tortoises were still quite young, researchers were drawn to the obvious conclusion that the C. elephantopus parent lived until a few decades ago and may still be roaming the slopes of Isabela Island.

So now it’s a race against time to find surviving purebred C. elephantopus tortoises in hopes that enough of them still exist so the species—truly one of the great animal species—can be brought back from what seemed like extinction. According to the report, “purebred tortoises of the recently ‘extinct’ C. elephantopus from Floreana Island are very likely still alive today.”

Caption: This tortoise is a hybrid of C. becki and C. elephantopus, a species native to Floreana Island some 200 miles away and thought to be extinct. Genetic analysis of the tortoise population on Isabela Island suggests purebred individuals of C. elephantopus must still be alive on Isabela. Credit: Courtesy of Yale University

One interesting parallel. Using a similar approach, researchers have recently concluded that human beings are also hybrids. For example, many of us contain genes from our Neandertal ancestors. The big difference, of course, is that our interbreeding occurred tens of thousands of years ago. In either case, hybridization or interbreeding occurs when the twigs at the end of Darwin's tree of life come together. As evolutionary biologists are discovering, speciation (or branching) is critical to evolution, but so is interbreeding or hybridization.

According to the report, “To our knowledge, this is the first rediscovery of a species by way of tracking the genetic footprints left in the genomes of its hybrid offspring.” The report, "Genetic rediscovery of an ‘extinct’ Galápagos giant tortoise species," appears in the January 9, 2012 issue of Current Biology.

Thursday, December 8, 2011

Are You as Empathetic as a Rat?

Empathy is the capacity to share the emotional state of another. Politicians claim to have it when they say “I feel your pain.”

Even if they do not always show it, human beings are clearly capable of empathy. Other primates such as chimps have been observed acting in a way that is best explained by empathy. Rather than acting for their own benefit, they sometimes act because they share the feeling or distress of another chimp. Such behavior is said to be “empathy-driven.”

Once it was thought that only human beings could feel empathy. Now researchers are finding that empathy-driven behavior is more widespread than previously imagined. Not just other primates but even rodents, it seems, are biologically capable of empathy. For all the differences between the human and the rat brain, we share fundamental circuits that make it possible to feel the emotions of another, particularly when the other is in pain or distress.

In a simple experiment reported in the December 9 issue of the journal Science, researchers provide solid evidence that the much-maligned rat is capable of acting in a way that is most easily explained by empathy.

"This is the first evidence of helping behavior triggered by empathy in rats," said Jean Decety, a member of the research team at the University of Chicago. "There are a lot of ideas in the literature showing that empathy is not unique to humans, and it has been well demonstrated in apes, but in rodents it was not very clear. We put together in one series of experiments evidence of helping behavior based on empathy in rodents, and that's really the first time it's been seen," Decety said in a release issued by the University.

In order to act in a way that is empathy-driven, an animal must be capable of “emotional contagion.” To test whether rats have this capacity, an experiment was designed by Chicago psychology graduate student Inbal Ben-Ami Bartal. Two rats were placed in an enclosure, one of them roaming freely while the other was locked inside a tube. The free rat, in time, could discover how to open the lock, but there was no reward for doing so.

The experiment was designed to observe whether rats show they are capable of emotional contagion. Was the free rat biologically capable of emotional concern, or what the paper defines as “an other-oriented emotional response elicited by and congruent with the perceived welfare of an individual in distress”?

PHOTO: ©Science/AAAS.

The free rats not only learned to open the container but did so repeatedly when it held another rat, something they did not do if it was empty or if it contained a stuffed animal.

Even more striking was their behavior when chocolate chips were involved. In one variation on the experiment, two enclosures were used, one with an enclosed rat and the other with five pieces of chocolate. The free rat has a choice: free the cagemate or eat the chocolate first. In the absence of empathy, the free rat will make the selfish choice. But at least half the time, the rat freed its cagemate first. According to the report, “these results show that the value of freeing a trapped cagemate is on par with that of accessing chocolate chips.”

"On its face, this is more than empathy, this is pro-social behavior," said Jeffrey Mogil of McGill University, who was not involved in the study. "It's more than has been shown before by a long shot.”

Without claiming to know what rats think, the authors conclude their report with their opinion that “the free rat was not simply empathetically sensitive to another rat’s distress but acted intentionally to liberate a trapped” member of their own species.

If rats are indeed capable of empathetic feelings, then it becomes clear that the biological substrate for shared emotion is deep in our evolutionary past and deep in the earlier parts of our brains. Far from being uniquely human, empathy seems to be widely shared. What is uniquely human, perhaps, is the way we override it with self-interest.

As I prepared this post, I was interrupted several times by others who were speaking of the history of racism in America and particularly the history of slavery. When I saw the pictures of rats in their enclosure, my mind went to chains and slave ships. If empathy is so deep in our mammalian evolution, so deeply rooted in our brains, what extraordinary rationalizations do we conjure up to negate it?

The paper, "Empathy and Pro-Social Behavior in Rats," is published Dec. 9 by the journal Science. http://www.sciencemag.org/content/334/6061/1427.abstract

Tuesday, December 6, 2011

Evolutionary Fast-Track for Human Brains

More than 35 years ago, Allan Wilson and Mary-Claire King made an astonishing proposal. Maybe what separates humans and chimps is not just our genes. Maybe it’s also how our genes are expressed or regulated.

Research published in today’s issue of PLoS Biology builds on decades of intervening advances in evolution and genetics and takes the question much further. The difference between humans and nonhuman primates in cognitive ability is explained in large part by differences in gene expression, especially during the critical periods when young brains are being formed.

Humans share many of their genes with other species, especially chimps. In fact, we share so many genes that it is hard to explain how we can be so different in terms of cognitive ability. If genes make all the difference, how can they explain the differences between chimp and human brains? And how can a mere six million years of human-chimp divergence give us enough time to accumulate enough genetic change?

The answer seems to lie in the relatively rapid evolution of differences in gene expression. In other words, while the genes themselves evolved slowly, the regulation of their expression evolved more rapidly. It’s not just the genes but their expression that’s important. It’s not just the evolution of genes but the evolution of gene expression that drives the rapid divergence between human and chimp brains.

This is especially true in the genes that control the development of the prefrontal cortex of the brain. In other words, there has been relatively rapid evolution in the genetic mechanisms that regulate genes directly responsible for the early-childhood neural development of the critically-important prefrontal cortex, which is involved in abstract thinking, planning, social intelligence, and working memory.

According to the article, “humans display a 3-5 times faster evolutionary rate in divergence in developmental patterns, compared to chimpanzees.” Most important, however, is the way this research identifies specific regulators that have evolved rapidly since human-chimp divergence. These regulators are “micro-RNAs,” some of which are specifically identified in the article, with the claim that “changes in the expression of a few key regulators may have been a major driving force behind rapid evolution of the human brain.”

According to the study’s senior author, Philipp Khaitovich, this finding suggests that "identifying the exact genetic changes that made us think and act like humans might be easier than we previously imagined." Khaitovich was quoted in a press release issued by the journal, PLoS Biology.

The article is entitled "Micro-RNA-Driven Developmental Remodeling in the Brain Distinguishes Humans from Other Primates" and appears in the December 6 issue of PLoS Biology, where it is freely available to the public.

Saturday, December 3, 2011

The Secret Lives of Cells Revealed

Life at the cellular level is chaotic and complex, beautiful and yet deadly.

Even though we are made up of trillions of cells, most of us give our individual cells about as much thought as a piece of sandstone thinks about individual grains of sand.

Enter the new technologies of imaging, which open new worlds. As never before, we can see the very small and the very distant.

On December 3, the American Society for Cell Biology announced the winners of the Celldance 2011 Film and Image Contest.

Take a look. Unless you’re a cell biologist, it will change the way you see the world. It will re-define your relationship to your own body. It will open new vistas on the much quoted “fearfully and wonderfully made.” If only the psalmist could have seen this!

My favorite is the first place winner, “Cancer Dance.” I say “favorite” with a great deal of qualification. It’s hard to look at this film. If you know someone who has faced cancer—and who doesn’t—what you see in this film will shock and anger you. And then you have to think: cancer is happening inside all of us pretty much all the time. Fortunately, it doesn’t get the upper hand…unless it does.

When I teach the introduction to theology, I talk about God, creation, pain in nature, and human suffering at the hands of nature. Cancer is the main example. Describing this disease theologically is a real challenge. Quite simply, cancer uses the mechanisms of life to destroy lives. It turns everything good bad.

I once asked an oncologist friend who is a Christian: “When you look at a cancer cell, theologically, what do you see?” He was so astounded by the question that he couldn’t answer.

Now, thanks to this video, you can ask yourself that question. Theologically, what is going on here? What the bleep is going on here? Why would God design such a system?

So from now on, when I teach theology, I’ll run the video. I won’t have answers. I will hope my students will learn that their standard answers might not be so useful after all.

Finally let me add that I am looking forward to the publication of a book called Chance, Necessity, Love: An Evolutionary Theology of Cancer. It’s the work of Leonard M. Hummel, who teaches Pastoral Theology and Care at the Lutheran Theological Seminary at Gettysburg, and Steve James, Associate Professor of Molecular Biology at Gettysburg College. I’ll update when the book is available.

Here again is the LINK to the videos. Each one is a winner. We nonscientists owe a great debt to the hardworking young researchers who spent hours showing us what we’re made of. For a theologian, it's a revelation.

Wednesday, November 30, 2011

The Great Migration: Tools Mark the Trail

Anatomically modern humans (AMH)—people who looked pretty much like us—migrated out of Africa tens of thousands of years ago and settled across Asia and Europe.

Just who were these people, how long ago did they migrate, and what route did they first take? These are some of the biggest questions in archeology. Now at last researchers seem to be closing in on concrete answers.

In a report published in the November 30 issue of the open-access journal PLoS ONE, an international research team led by Jeffrey Rose presents its analysis of recent work in southern Oman, located on the southeastern corner of the Arabian peninsula.

For years, researchers have debated the earliest migration route. Was it across the Red Sea to the Arabian boot heel (sea levels being much lower then)? Or was it north from Egypt along the Mediterranean?

Rose and his team found evidence suggesting that AMH residents of the Nile valley migrated—with their distinctive tool technology—to present-day Oman. Their analysis of over 100 sites in Oman led researchers to believe that the tool culture was clearly the same in both settings. In other words, one culture spans two continents, clearly supporting the idea of human migration.

Scientists have long known about the Nile valley culture, which they call “Nubian.” The breakthrough reported here is the strong evidence that Nubian toolmakers made their way out of Africa to Arabia, bringing their characteristic stonecutting techniques with them.

The date of migration, according to the report, is at least 106,000 years ago, perhaps earlier.

No human remains were found with the stone tools. This leaves open the possibility that some other humans—“archaic” and not anatomically modern—may be responsible for the stone tools. The researchers dismiss this idea on the grounds that AMH seem to be the only form of humans present in North Africa at the time of the migration.

“After a decade of searching in southern Arabia for some clue that might help us understand early human expansion, at long last we've found the smoking gun of their exit from Africa,” according to Rose, a Research Fellow at the University of Birmingham.

Another surprise contained in the report is that the stone tools were found inland rather than right along the coast. “For a while,” remarks Rose, “South Arabia became a verdant paradise rich in resources – large game, plentiful freshwater, and high-quality flint with which to make stone tools,” according to a press release issued by PLoS One. One possibility is that the “southern route” out of Africa along the southern Arabian peninsula was not so much a coastal expressway to Asia and Europe as it was a settling of the interior of Arabia.

The report, “The Nubian Complex of Dhofar, Oman: An African Middle Stone Age Industry in Southern Arabia,” appears in the November 30, 2011 issue of PLoS ONE.

Tuesday, October 18, 2011

Evolution and the Human Brain

How did the human brain become so complex so quickly? Did old genes learn new tricks? Or did new genes appear, bringing new functions?

A paper appearing today in PLoS Biology suggests that new genes play a bigger role than previously thought in explaining the complex functions of the human brain. Researchers at the University of Chicago Department of Ecology and Evolution reached this conclusion by comparing the age of genes with transcription data from humans and mice. Where are new genes most often expressed? In humans, it’s in the brain. Even more interestingly, it’s in the developing brain of the fetus and the infant.

One of the researchers, Yong E. Zhang, was motivated to ask these questions when he accompanied his pregnant wife to a prenatal ultrasound appointment, according to a press release issued by the University of Chicago Medical Center. According to Zhang, “Newer genes are found in newer parts of the human brain.” The press release also quotes co-author Patrick Long: “What’s really surprising is that the evolutionary newest genes on the block act early….The primate-specific genes act before birth, even when a human embryo doesn’t look very different from a mouse embryo. But the actual differences are laid out early,” Long explained.

In the language of the PLoS Biology paper, the authors “observed an unexpected accelerated origination of new genes which are upregulated in the early developmental stages (fetal and infant) of human brains relative to mouse.” In other words, compared to all the genes in the human genome, younger genes are significantly more involved in those parts of the brain that make us distinctly human. More than that, these genes play a greater than expected role in prenatal and infant development, the very period in which the brains of humans develop so rapidly compared to the brains of other species.

How did these new genes arise? By all the various means by which new genes arise—by various processes of duplication and by de novo origination. Rather remarkably, the authors make this observation: “…young genes created by all major gene origination mechanisms tend to be upregulated in [the] fetal brain. Such generality suggests that a systematic force instead of a mutational bias associate with a specific origination mechanism contributed to the excess of young genes in the fetal brain.”

What “systematic force”? Clearly, the authors are not speculating about anything more than a statistical correlation. But their work will give rise to new questions for research. What role do these young genes actually play in the developing brain? What role did natural selection play in the evolution of these genes? Does this surprising correlation shed any light at all on our rapid rise as a species and the stunning complexity of the human brain?

The paper, "Accelerated Recruitment of New Brain Development Genes into the Human Genome," is published in the October 18 issue of PLoS Biology [10.1371/journal.pbio.1001179].