Thursday, March 7, 2013

What a Smart Mouse Can Tell Us about Evolution


Just a few years ago, we thought that brains were all about neurons.  Sure, we also have glial cells, but the job of the lowly glia is to take care of the neurons, which do all the serious cognitive work. 

But why are the glia of humans and other primates so large and varied in their shape and structure?  Why are they so different from the simpler, smaller glia found in mice and other rodents?  Could the difference play a role in the evolution of human intelligence?

One way to compare a mouse and a human is to create a mouse that is part human.  That’s exactly what researchers at the University of Rochester did.  They implanted human cells into mouse brains.  More precisely, they implanted human glial progenitor cells into newborn mouse pups. 

What they got were chimeras, mice with human cells integrated into their brains.  When the researchers examined the brains of these chimeric mice, they found that the human cells proliferated and were widely present throughout the brain. Although interacting with mouse brain cells, the human cells remained distinctly human in their unusually large size and varied structures.

Photo credit: A 23-week cultured human astrocyte stained for GFAP. From Wikimedia Commons. Date: 24 February 2012. Author: Bruno Pascal.

Most surprising is that the chimeric mice were smarter than unaltered mice born in the same litters. Human glia in a mouse brain seem to make a smarter mouse.

Why?  The answer probably involves one type of glial cell called astrocytes. Compared to other species, human brains have many more astrocytes.  Ours are larger and more varied in their structure, capable of connecting many neurons and coordinating the activity that occurs at many synapses. 

Based on this study, published in the March 7, 2013 issue of Cell Stem Cell, we now know that human astrocytes boost intelligence in chimeric mice as measured by standard testing procedures.  

This is pretty good evidence to suggest that the evolution of the larger, more complex glial cells was a critical aspect of the evolution of higher intelligence. At least that is the conclusion drawn by one of the senior authors of the paper, Steven Goldman. “In a fundamental sense, we are different from lower species,” he said, according to a press release from the University of Rochester. “Our advanced cognitive processing capabilities exist not only because of the size and complexity of our neural networks, but also because of the increase in functional capabilities and coordination afforded by human glia.”

What makes this study intriguing is that it uses stem cell technology to study brain function and to learn something important about evolution. By implanting stem cells to create chimeric mice, researchers learn that glia play a critically important role in intelligence and that evolved changes in glial cells are a key part of the story of the rise of intelligence.

Concerning the role of glial cells in the complex brain, Maiken Nedergaard, another senior author, had this to say: “I have always found the concept that the human brain is more capable because we have more complex neural networks to be a little too simple, because if you put the entire neural network and all of its activity together, all you end up with is a super computer.”

“But human cognition is far more than just processing data, it is also comprised of the coordination of emotion with memory that informs our higher abilities to abstract and learn,” Nedergaard added.

And concerning what chimeric mice have to teach us about evolution, Steven Goldman made this comment: “This study indicates that glia are not only essential to neural transmission; it also suggests that the development of human cognition may reflect the evolution of human-specific glial form and function.”

Or to quote the original paper: “These observations strongly support the notion that the evolution of human neural processing, and hence the species-specific aspects of human cognition, in part may reflect the course of astrocytic evolution.”

The paper does not address the interesting ethical questions raised by making smarter mice.  Over the past decade, ethicists have debated the moral legitimacy of chimeric animals.  One point of concern has been the creation of nonhuman animals with human brain cells.  To defend this practice, it is often said that a mouse brain with human cells is still a mouse brain.  It still has the structure or architecture of a mouse brain.  It may have human cells, but in no way is it a human brain or even a half mouse/half human brain.

This study suggests we should take a closer look at that line of thinking. Maybe it is true that adding human neurons to a mouse brain does not change the mouse brain structure. But this study implies that adding human astrocytes to a mouse brain may bring about small but significant changes in structure and function.

The study is clear about the fact that these chimeric mice are more intelligent than the unmodified mice. Their brains are quite literally faster.

Once again, Goldman: “The bottom line is that these mice demonstrated an increase in plasticity and learning within their existing neural networks, essentially changing their functional capabilities.”

These animals have been cognitively “elevated,” to use a word sometimes found in the debate.  Probably no one will object to the idea of a slightly smarter mouse.  Researchers take special care to make sure these mice do not breed and produce pups of their own.  But even if they did, the added intelligence would not pass to future generations.  They would produce normal lab mice. 

Even so, this study—combining stem cell technology, neuroscience, and evolution in one elegant package—raises intriguing moral questions. Are we smart enough to know how far we should go in creating smarter mice?

The study, entitled “Forebrain engraftment by human glial progenitor cells enhances synaptic plasticity and learning in adult mice,” appears in the March 7, 2013 issue of Cell Stem Cell.

Friday, February 22, 2013

Parthenogenesis and "Virgin Birth"? Rhetoric and Research

Despite roadblocks, the field of stem cell research remains profoundly attractive. The idea of being able to regenerate damaged or diseased cells in the human body is appealing to nearly everyone who cares about human health.

But technical problems remain. Much has been learned in the past decade, but the pathway to medical treatments still faces many challenges. One worry in particular is that implanted stem cells might develop into cancer. Other challenges include getting the cells to multiply, integrate with other cells, function as they should, and avoid rejection by the immune system.

A new solution may be on the horizon, one that addresses many of these problems—moral and technical—all at once. At least that’s the claim made by a team led by Wolfram-Hubertus Zimmermann at Georg-August-Universität Göttingen in Germany. Working with laboratory mice, Zimmermann’s team used mouse eggs to create what are known as parthenotes. Without being fertilized, the mouse eggs were manipulated so that they began to develop as if they had been fertilized, up to a point.

PHOTO: Mouse embryonic stem cells. This image is a work of a National Science Foundation employee, taken or made as part of that person's official duties. As a work of the U.S. federal government, the image is in the public domain. This image was copied from wikipedia:en.

Parthenogenesis exists in nature. It has been observed in some plants, fish, and reptiles. Over the past decade, researchers have learned how to induce parthenogenesis in mice, monkeys, and humans. In every case, however, the resulting parthenotes fail to develop normally, which means they could never be implanted to produce a child. But they do develop for a few days, long enough for the precursors of pluripotent stem cells to develop.

What is new in the research reported on February 22, 2013 is unexpected success in the use of these stem cells derived from mouse parthenotes. These cells—parthenogenetic stem cells or PSCs—were developed and eventually implanted into damaged mouse hearts. Quite simply, they worked in ways that seem to overcome most if not all of the technical hurdles.

The research appears in the Journal of Clinical Investigation, which carried a companion article claiming that the new research “may overcome all…formidable barriers” that currently stand in the way of stem cell medicine. The original article makes this claim: “One of our key observations involved the capacity of PSCs to exhibit essentially normal cardiogenesis in vitro and in vivo.” In other words, both in the dish and in the mouse, implanted cells fully integrate into the beating heart.

Both the research article and the companion piece make another claim: PSCs are ethically acceptable. That’s because parthenotes are not embryos. Taking cells from parthenotes avoids all the moral concerns that surround the use of cells derived from embryos. Here is the claim: Research using human PSCs, derived from human parthenotes, involves “no destruction of viable embryos,” according to the research article. The companion piece simply notes that compared to embryonic stem cells, PSCs “do not have the same ethical implications.”

If only it were that simple. The plain fact is that some who object to the use of human embryos in research are already on record as objecting to the use of human parthenotes.

Their logic is fairly straightforward. If human embryos are off limits and if parthenotes cannot be clearly and definitely distinguished from embryos, then human parthenotes are equally off limits to research.

They are not claiming that parthenotes are little people, nor are they being silly or obstructionist. They are only claiming that we do not have enough scientific clarity and certitude to proceed with moral confidence in the work of creating and destroying parthenotes, regardless of the benefit.

Just to be clear, I personally disagree with this objection. But researchers and regulators should be aware that some, at least, will balk at this new line of research, technically attractive as it may be.

For example, in a statement given to the UK Parliament, the Church of Scotland made this comment:

“We reject the suggestion made by various researchers that hybrid embryos, parthenotes and embryos that have been modified to make them non-viable would be an ethical solution to deriving stem cells from embryos. Whatever the status of such creations, it would be at least as unethical to use methods that would create an ‘embryo’ so deformed that it could not be viable and which therefore inherently denies its potential to develop.”

Politically more important is the response that will come from Catholics. Some Catholic scholars have defended the moral legitimacy of research using human parthenotes. There is simply no way, they argue, to equate the parthenote with the embryo. The parthenote is not a product of conception. In more popular rhetoric: If “life begins at conception,” then the parthenote is not “life.” Nor can it develop normally. It meets no standard definition of an embryo.

Others are not so certain. They translate scientific and theological uncertainty into a moral prohibition. Creating and destroying parthenotes requires that we know for sure that they are not embryos. Such certainty is lacking, at least for now. In the face of uncertainty, they argue, we must not proceed.

On the Catholic website www.ewtn.com, E. Christian Brugger addresses the question: Is the parthenote enough like an embryo to be considered an embryo? His answer:
“The question presently is unsettled.” He adds this: “Although the empirical question of the status of a human parthenote is unsettled, the underlying moral principle is straightforward. Unless we have moral certainty that a dividing parthenogenetically activated human oocyte is not an embryo, we have an obligation to avoid research with human parthenotes.”
And at the end he concludes:
“Having said this, the present evidence on whether parthenotes are ever embryos seems to me inconclusive. Given the evidence to date, at least with which I am familiar, I do not think it can be established with moral certitude that parthenotes are never human embryos.”

Personally, I want to see this research go forward, and so I have some suggestions for researchers and reporters in this field.

First, help religious scholars build the case scientifically, showing in clear terms to the wider public why parthenotes are not functionally like embryos and why a morally robust boundary separates the two. Science itself cannot create that boundary, but it can provide evidence supporting moral and philosophical arguments in favor of such a boundary.

Second, stop using provocative phrases like “virgin birth.” Regrettably, the companion piece in the Journal of Clinical Investigation is published with this title: “Virgin birth: engineered heart muscles from parthenogenetic stem cells.”

Sure, “parthenos” is Greek for virgin, so the etymology supports the use of the term “virgin birth.” But for billions of Christians around the world, this term has a very special religious meaning, one that many associate with the most tender core of their faith.

For scientists to claim they are simulating the “virgin birth” is offensive to anyone who takes the religious meaning of the phrase seriously. It is needlessly provocative, almost the worst thing that could be said if religious support for research is desired.

What’s more, associating parthenogenesis with the “virgin birth” has the bizarre effect of equating the parthenote with the embryo. Christians who hold to the “virgin birth” will claim that in one profoundly non-trivial example (Jesus), what scientists now claim they are creating turned out to be a fully viable embryo. And then they say, “But don’t worry; it’s not a human being”?

The original article, entitled "Parthenogenetic stem cells for tissue engineered heart repair," is published in the February 22, 2013 issue of the Journal of Clinical Investigation, together with the companion piece.

Thursday, February 7, 2013

Brain Renewal? Enhancing Aging Brains

An aging mind may be a fountain of wisdom, but an aging brain is not very good as a source of new neurons. As we age, quite apart from diseases like Alzheimer’s, we lose our ability to remember and to concentrate. It seems that in order to remain sharp, the brain has to regenerate itself by forming new neurons. While neurogenesis continues throughout life, it declines markedly in old age.

Photo credit: published under GNU Free Documentation License, uploaded 23 Sept 2007 by Ccrai008.

Research published today may suggest a way to change that. Scientists at the German Cancer Research Center in Heidelberg report on their work with mice. They identified a molecule called Dickkopf-1, or Dkk1, in the brains of old mice. When they blocked the production of Dkk1, old mouse brains began to create new brain cells.

“We released a brake on neuronal birth, thereby resetting performance in spatial memory tasks back to levels observed in younger animals,” said Ana Martin-Villalba in a press release from Cell Press, which published the results.

It turns out that clinical trials involving antibodies to Dkk1 are already underway. These trials are aimed not at neurogenesis but at the prevention of osteoporosis. What is learned there, however, may bear directly on whether blocking Dkk1 is a feasible, safe, and effective way to counter the effects of declining neurogenesis, which include both memory loss and depression.

The report concludes with these comments: “Our study raises the possibility that neutralization of Dkk1 might be beneficial in counteracting depression-like behavior and improving cognitive decline in the aging population….The contribution of newly generated young neurons to memory and affective behavior opens tantalizing opportunities for the prevention of affective impairments and age-related cognitive decline.”

These words are carefully chosen, first to caution against undue optimism but also to steer away from the idea of “human enhancement.” But unless we think of aging as a disease, what is envisioned here is clearly a form of enhancement. Normally aging human beings may, someday in the future, be treated not because they have a disease such as Alzheimer’s but because their memory is not as sharp as it once was or as retentive as they would like.

But labeling this an “enhancement” is not likely to dampen public interest. On the contrary, the enhancement potential of blocking Dkk1 is the very thing that is most likely to drive public support.

And that suggests we need to consider once again just what it is we say we do not like about enhancement.

The article is entitled "Loss of Dickkopf-1 restores neurogenesis in old age and counteracts cognitive decline" and appears in the February 7, 2013 issue of Cell Stem Cell.

Tuesday, January 22, 2013

Asians, Europeans, and Neandertals

New research suggests that Europeans and Asians diverged at least 40,000 years ago, starting a process leading to the subtle differences that distinguish people to this day.

Working with bones discovered in 2003, researchers at the Max Planck Institute for Evolutionary Anthropology in Leipzig were able to reconstruct portions of DNA from an individual who lived in China about 40,000 years ago. Earlier analysis of the bones suggested that this individual showed “archaic” features, somewhat like Neandertal bones.

Credit: A Photograph of China's Empress Dowager, taken in the 1890s by Xunling, the Imperial Court Photographer. In the public domain.

The Max Planck team, led by Svante Pääbo, is well-known for work in producing the virtually complete Neandertal genome. In addition, using just a tiny fragment of a finger bone, this team produced the genome of a previously unknown form of humanity, called the Denisovans.

In their earlier work, they discovered that Europeans and Asians are descended in part from Neandertals, who disappeared about 30,000 years ago. In addition, some Asians, especially those living on the islands south of Asia, are partly descended from the Denisovans.

One of the reasons why the team was interested in this new sample was to look more deeply into the relationship between Europeans and Asians and to ask what role Neandertal and Denisovan interbreeding might have played.

Comparing the newly reconstructed DNA sequence from the 40,000-year-old bones with the genomes of people living today, they found they were looking at an individual who was also descended from Neandertals, pretty much the way Europeans and Asians are today. And they also learned that this individual showed no evidence of Denisovan interbreeding.

What this means, they suggest, is that 40,000 years ago, an early version of anatomically modern Eurasians lived in China, near Beijing. While this human community was very much like the humans moving into Europe at about the same time, these two lineages were beginning a process of divergence.

On the basis of additional comparisons, the team concluded that the early-modern human community in China about 40,000 years ago was closely related to today’s Native Americans.

The report is also significant because it shows the power of new approaches to DNA extraction and sequencing. In their raw form, the samples extracted from the bones contained mostly DNA from microorganisms. In fact, the human DNA was less than one-tenth of one percent of the total DNA. Even so, researchers were able to establish reliable human sequences, suitable for comparison with other human genomes.
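
To put that contamination figure in perspective, here is a minimal sketch in Python (toy sequences and thresholds of my own invention, not the team’s actual pipeline) of the kind of filtering step such projects depend on: each sequencing read is kept only if enough of its short subsequences match a human reference.

```python
# Minimal sketch of the contamination problem in ancient-DNA work.
# All sequences and numbers are invented for illustration; real projects
# align reads to the full human reference genome with dedicated tools.

def kmers(seq, k=8):
    """Return the set of all k-length substrings of a sequence."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def looks_human(read, ref_kmers, k=8, min_hits=3):
    """Classify a read as endogenous if enough of its k-mers hit the reference."""
    return sum(kmer in ref_kmers for kmer in kmers(read, k)) >= min_hits

reference = "ACGTTGCAATCGGATCCATGCAGTTAGCCGTA" * 4   # toy "human reference"
ref_kmers = kmers(reference)

reads = (
    ["ACGTTGCAATCGGATC", "CCATGCAGTTAGCCGT"]           # human-like reads
    + ["GGGGCCCCAAAATTTT", "TTTTAAAACCCCGGGG"] * 999   # microbial background
)

endogenous = [r for r in reads if looks_human(r, ref_kmers)]
print(f"Endogenous fraction: {len(endogenous) / len(reads):.2%}")  # -> 0.10%
```

Nearly every read must be discarded before any analysis can begin, which is why establishing reliable sequences from such material is a genuine technical feat.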

What does that mean? At the very least, it means that many more discoveries like this lie ahead. The new technology means that old findings take on new significance.

The research appears online January 22, 2013, in the Proceedings of the National Academy of Sciences, as "DNA analysis of an early modern human from Tianyuan Cave, China."

Thursday, January 3, 2013

Past and Future Selves

Are you done changing? Are your values and personality pretty much set for life? Regardless of our age, most of us seem to think so.

According to new research, people generally recognize that they have changed over the past decade. But in the decade ahead? Overwhelmingly, people think their biggest changes are behind them. It’s as if their present state is the defining moment, when values and personality traits are fully realized and fixed forever. The research team, led by Jordi Quoidbach, called this the “End of History Illusion.”

In six studies involving more than 19,000 participants, researchers “found consistent evidence to indicate that people underestimate how much they will change in the future,” according to the study appearing in the 4 January 2013 issue of the journal Science.
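
One way to grasp the logic of such a comparison: ask people of a given age to predict how much they will change over the next decade, and ask people ten years older to report how much they changed over the decade just past. Here is a toy sketch in Python of that comparison (the ages and change scores are invented for illustration, and the paper’s actual measures and statistics are more involved):

```python
# Toy illustration of comparing predicted vs. reported personality change.
# All numbers are invented; they are not data from the Science study.

predicted_change = {20: 0.40, 30: 0.25, 40: 0.15}  # age -> change predicted for the NEXT decade
reported_change = {30: 0.60, 40: 0.45, 50: 0.30}   # age -> change reported for the PAST decade

for age, predicted in sorted(predicted_change.items()):
    reported = reported_change[age + 10]  # the same decade of life, seen in hindsight
    gap = reported - predicted
    print(f"Ages {age}-{age + 10}: predicted {predicted:.2f}, "
          f"reported {reported:.2f}, underestimated by {gap:.2f}")
```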

Like most illusions, this one comes with a big cost. Thinking they won’t change makes it more likely that people will “make decisions that their future selves regret.”

What’s most amazing about this illusion is that it seems to hold true at all ages. In fact, some of the results suggested that young people, even more than their grandparents, think they are done changing.

Caption: Painting, Girl in a Mirror (1632) by Paulus Moreelse, purchased by the Rijksmuseum Amsterdam with support of the Vereniging Rembrandt. In the public domain.

This much, at least, was clear to the researchers: “At every stage of adult life that we could analyze, both teenagers and grandparents seem to believe that the pace of personal change has slowed to a crawl and that they have recently become the people they will remain. History, it seems, is always ending today.”

While the researchers are clearly speaking of the history of the individual, their research raises the question of whether there’s a similar illusion when it comes to human history. For example, do we routinely underestimate the amount of technological change that lies ahead or its cultural and social impact? We acknowledge the profound cultural changes in past decades, but do we underestimate what is coming?

We marvel at the transformations of human evolution, but do we fail to imagine the changes that lie ahead? According to the researchers, "people may confuse the difficulty of imagining personal change with the unlikelihood of change itself." If that is true of the human individual, might it also be true of the human species?

The research appears as “The End of History Illusion” in the 4 January 2013 issue of the journal Science, a publication of the American Association for the Advancement of Science.

Thursday, November 15, 2012

Stone-Tipped Weapons: Older than We Thought

Stone-tipped spears have been around for at least 500,000 years, according to new research. That is about 200,000 years earlier than previously thought.

Why is that important? In part because it suggests that modern humans did not invent this technology. They did not get it from the Neandertals, nor did Neandertals get it from modern humans. Instead, it now seems that Neandertals and modern humans both used stone-tipped spears because both inherited this technology from an earlier form of human life.

It is generally believed that Neandertals and modern humans diverged about 500,000 years ago. The current view is that both came from earlier humans known as Homo heidelbergensis.

"Rather than being invented twice, or by one group learning from the other, stone-tipped spear technology was in place much earlier," according to Benjamin Schoville, who coauthored the study and is affiliated with the Institute of Human Origins at Arizona State University. "Although both Neandertals and humans used stone-tipped spears, this is the first evidence that this technology originated prior to or near the divergence of these two species," Schoville said according to a press release from his university.

Caption: A ~500,000-year-old point from Kathu Pan 1. Multiple lines of evidence indicate that points from Kathu Pan 1 were used as hafted spear tips. Scale bar = 1 cm. Credit: Jayne Wilkins. Usage Restrictions: Image may be used to illustrate coverage of this research only.

"This changes the way we think about early human adaptations and capacities before the origin of our own species," said Jayne Wilkins, a lead author from the University of Toronto. Technological advance—in this case stone-tipped spears—is now seen as more widely shared among the various forms of humanity and not so confined to anatomically modern humans like us. Creating stone-tipped spears requires more forethought and care than simpler stone tools, especially in preparing the tips for mounting to the wooden shaft of the spear. This process is called “hafting,” and the result is that a more efficient hunting weapon is created.

In this study, researchers re-examined stone points discovered more than thirty years ago. By comparing the damage to the spear tips with simulated damage re-created under laboratory conditions, researchers found evidence that strongly supports the view that the original tips were used for spears.
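
The underlying logic is a matching exercise: do the fracture patterns on the ancient points resemble experimentally produced spear-tip damage more than damage from other processes? Here is a toy sketch in Python of that kind of comparison (all measurements are invented for illustration; the study’s actual analysis is far more detailed):

```python
# Toy comparison of fracture-size measurements from three sources.
# All numbers are invented for illustration, not data from the study.
from statistics import mean

archaeological = [4.1, 3.8, 4.5, 4.0, 3.9]   # fracture lengths (mm) on ancient points
candidates = {
    "spear impacts": [4.2, 4.0, 4.4, 3.7, 4.1],  # experimental crossbow damage
    "trampling": [1.2, 1.5, 0.9, 1.3, 1.1],      # non-impact damage
}

def gap(a, b):
    """Absolute difference in mean fracture length between two samples."""
    return abs(mean(a) - mean(b))

best = min(candidates, key=lambda name: gap(archaeological, candidates[name]))
print("Closest match to the archaeological damage:", best)  # -> spear impacts
```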

"When points are used as spear tips, there is a lot of damage that forms at the tip of the point, and large distinctive fractures form. The damage on these ancient stone spear points is remarkably similar to those produced with our calibrated crossbow experiment, and we demonstrate they are not easily created from other processes," said coauthor Kyle Brown, a skilled stone tool replicator from the University of Cape Town.

Brown, along with others who worked on the current paper, also collaborated on a study just released describing further stone weapons refinements that occurred about 70,000 years ago and probably gave modern humans an advantage over Neandertals. For more on that, see Better Technology, Better Weapons.

The most recent findings that push the date of stone-tipped spears back to 500,000 years ago are published as "Evidence for Early Hafted Hunting Technology" in the November 16, 2012 issue of Science.

Wednesday, November 7, 2012

A Living, Breathing Lung-on-a-Chip

Human cells can be grown outside the human body. In a petri dish, they may develop in ways that resemble the cells inside the body. But their function and activity are limited. For example, in a dish, lung cells are just lung cells. They don’t breathe.

Using new technology, however, researchers have put lung cells on a chip. The cells on a chip have suddenly become a lung-on-a-chip, active, moving, and breathing.

In a paper published in the November 7 issue of Science Translational Medicine, researchers report on their use of recently developed organ-on-a-chip technology. They describe how they built and used "a biomimetic microdevice that reconstitutes organ-level lung functions to create a human disease model-on-a-chip."

Caption: Wyss Institute's human breathing lung-on-a-chip. Credit: Wyss Institute, Harvard University. Usage Restrictions: None.

Already the device has led to two discoveries directly applicable to pulmonary edema, a lung disease that is a major concern for some cancer patients. First, development of the disease is accelerated by the physical movement of the lungs. This is "something that clinicians and scientists never suspected before," according to Donald Ingber, senior author of the study.

Second, researchers identified one drug, currently under development, that might help prevent the problem. For Ingber, this is the main attraction of organ-on-a-chip technology. "This on-chip model of human pulmonary edema can be used to identify new potential therapeutic agents in vitro," Ingber says.

This could accelerate the speed of drug development and testing while reducing the cost. The main advantage is that an organ-on-a-chip gives researchers the opportunity to test a wide array of potential drug compounds. Tests can be run not just on nonhuman animals or on cultured human cells but on functioning or working small-scale models of human organs.

Beyond its value in pharmaceutical research, it is not clear where this research may lead, but it is one more way in which the boundary we once drew between the living and the nonliving is being erased, along with the line between the natural and the artificial.

The work was funded by the National Institutes of Health (NIH), the Food and Drug Administration (FDA), the Defense Advanced Research Projects Agency (DARPA), and the Wyss Institute for Biologically Inspired Engineering at Harvard University. The paper is entitled "A Human Disease Model of Drug Toxicity–Induced Pulmonary Edema in a Lung-on-a-Chip Microdevice" and appears in the November 7, 2012 issue of Science Translational Medicine.