Wednesday, October 19, 2011

A New Approach to Enhancement?

Some people object to human germline modification because they do not like the idea of one generation messing with the DNA of future generations. Even worse, they say, is modifying the genes for the sake of…gasp!…enhancement!

But now comes a tantalizing study in tomorrow’s issue of Nature hinting at the possibility that what we do to live longer may change the lifespan of our grandchildren. It’s only a hint—the research reported here involves the faithful nematode, Caenorhabditis elegans. By manipulating just three proteins in one generation of these tiny worms, researchers in Anne Brunet’s lab at Stanford produced worms that live up to 30% longer. The surprising thing is that the enhanced lifespan was passed on for the next 2-3 generations. The really surprising thing is that the lifespan of the C. elegans great-grandchildren was enhanced even though no DNA sequences were modified. In other words: germline enhancement without genetic modification.

How is that possible? Epigenetics. The three proteins changed the way the DNA is structured or packaged without changing the DNA code itself. Such epigenetic changes can alter the way genes are expressed. The effect can be dramatic—in this case, a 30% longer lifespan. What’s more, the epigenetic change can be passed to future generations. Most often, epigenetic changes are reset during reproduction. But in some cases, epigenetic modifications are passed on for the next 2-3 generations. When that happens, the structure and the expression of the DNA are changed even though the DNA sequence remains unchanged. Over time, however, the effect washes out, so that the great-great-grandchildren are back to the starting point.
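The wash-out pattern can be caricatured in a few lines of code. To be clear, this is a made-up illustration, not the paper’s model: the numbers (a 20-day baseline, an all-or-nothing 30% boost, a three-generation persistence) are invented for the sketch.

```python
# Toy model of transgenerational epigenetic inheritance (illustrative only).
# An epigenetic "mark" extends lifespan and survives reproduction for a
# fixed number of generations before being reset to baseline.
BASE_LIFESPAN = 20      # hypothetical baseline lifespan, in days
BOOST = 0.30            # 30% extension while the mark persists
MARK_PERSISTENCE = 3    # generations the mark survives after the exposed one

def lifespan(generations_after_exposure):
    """Expected lifespan N generations after the manipulated worms."""
    if generations_after_exposure <= MARK_PERSISTENCE:
        return BASE_LIFESPAN * (1 + BOOST)
    return BASE_LIFESPAN  # the mark has washed out

for g in range(6):
    print(f"generation +{g}: {lifespan(g):.0f} days")
```

In this toy version, children through great-grandchildren live longer, and the great-great-grandchildren are back at baseline: the DNA sequence never changes, only how long the mark persists.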

Will this epigenetics-to-lifespan relationship be found in human beings? Who knows. Again, it must be repeated: this research involves roundworms. Humans are just a bit more complicated. Already, however, Brunet’s lab is looking for something similar in mice and in African killifish.

Whether or not anything similar will be found in human beings, this research already suggests a truly interesting thought experiment. Suppose this leads someday to a human-application technology. Would it be opposed by those who object to human germline modification? Sure, future human beings would be changed without their consent. But no genes are changed, and the changes are not permanent.

Perhaps the more sobering thought is this. Maybe this research will lead to a startling discovery. Never mind some new technology. Might it turn out that what health-minded human beings normally do—eat their green vegetables, get their exercise—enhances their offspring by generating heritable epigenetic changes that modify the expression of their genes? Could be. If just three proteins can make C. elegans progeny live 30% longer, just imagine how your dinner might change your grandchildren (assuming, of course, that you’re in your reproductive years or younger).

The article, “Transgenerational epigenetic inheritance of longevity in Caenorhabditis elegans,” appears in the October 20 issue of Nature.

Tuesday, October 18, 2011

Evolution and the Human Brain

How did the human brain become so complex so quickly? Did old genes learn new tricks? Or did new genes appear, bringing new functions?

A paper appearing today in PLoS Biology suggests that new genes play a bigger role than previously thought in explaining the complex functions of the human brain. Researchers at the University of Chicago Department of Ecology and Evolution reached this conclusion by comparing the age of genes with transcription data from humans and mice. Where are new genes most often expressed? In humans, it’s in the brain. Even more interestingly, it’s in the developing brain of the fetus and the infant.

One of the researchers, Yong E. Zhang, was motivated to ask these questions when he accompanied his pregnant wife to a prenatal ultrasound appointment, according to a press release issued by the University of Chicago Medical Center. According to Zhang, “Newer genes are found in newer parts of the human brain.” The press release also quotes co-author Patrick Long: “What’s really surprising is that the evolutionary newest genes on the block act early…. The primate-specific genes act before birth, even when a human embryo doesn’t look very different from a mouse embryo. But the actual differences are laid out early,” Long explained.

In the language of the PLoS Biology paper, the authors “observed an unexpected accelerated origination of new genes which are upregulated in the early developmental stages (fetal and infant) of human brains relative to mouse.” In other words, compared to all the genes in the human genome, younger genes are significantly more involved in those parts of the brain that make us distinctly human. More than that, these genes play a greater than expected role in prenatal and infant development, the very period in which the brains of humans develop so rapidly compared to the brains of other species.
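The comparison behind that claim is, in spirit, an enrichment test: is the fraction of young genes among those upregulated in the fetal brain higher than the fraction of young genes genome-wide? Here is a minimal sketch of that logic with invented data—the gene IDs, age classes, and expression flags are hypothetical, and the real analysis involves far more genes and controls.

```python
# Toy enrichment test (illustrative; not the authors' actual pipeline).
# Each record: (gene_id, age_class, upregulated_in_fetal_brain) -- invented.
genes = [
    ("g1", "young", True),  ("g2", "young", True),  ("g3", "young", False),
    ("g4", "old",   True),  ("g5", "old",   False), ("g6", "old",   False),
    ("g7", "old",   False), ("g8", "old",   False),
]

def fraction_young(records):
    """Fraction of records whose age class is 'young'."""
    records = list(records)
    return sum(1 for _, age, _ in records if age == "young") / len(records)

genome_wide = fraction_young(genes)                  # young genes overall
fetal_up = fraction_young(g for g in genes if g[2])  # young among upregulated
enrichment = fetal_up / genome_wide                  # > 1 means over-represented
print(f"enrichment of young genes in fetal brain: {enrichment:.2f}x")
```

In this toy dataset, young genes make up 3/8 of the genome but 2/3 of the fetal-brain-upregulated set, an enrichment of roughly 1.8x—the kind of over-representation the authors report, though they establish it with proper statistics.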

How did these new genes arise? By all the various means by which new genes arise—by various processes of duplication and by de novo origination. Rather remarkably, the authors make this observation: “…young genes created by all major gene origination mechanisms tend to be upregulated in [the] fetal brain. Such generality suggests that a systematic force instead of a mutational bias associate with a specific origination mechanism contributed to the excess of young genes in the fetal brain.”

What “systematic force”? Clearly, the authors are not speculating about anything more than a statistical correlation. But their work will give rise to new questions for research. What role do these young genes actually play in the developing brain? What role did natural selection play in the evolution of these genes? Does this surprising correlation shed any light at all on our rapid rise as a species and the stunning complexity of the human brain?

The paper, "Accelerated Recruitment of New Brain Development Genes into the Human Genome," is published in the October 18 issue of PLoS Biology [10.1371/journal.pbio.1001179].

Thursday, October 13, 2011

100,000 Years of Art

The awakening of human creativity is one of the great mysteries of our species. Even today we marvel at the artistic power of cave art, some of it dating back nearly 35,000 years. Musical instruments—flutes, at least—date back nearly as far. Beads, often made by carefully drilling a hole through shells, date to nearly 100,000 years ago.

And now comes evidence to suggest that painting goes back just as far. At least 100,000 years ago, about 40,000 years earlier than previously thought, human beings made pigments for paint through a process that is surprisingly complex.

In the October 14 edition of the journal Science, Christopher Henshilwood and his team present their analysis of the earliest known “artists’ workshop.” In Blombos Cave, on the southern coast of South Africa about 300 kilometers east of Cape Town, they discovered a 100,000-year-old ochre-processing site. In two places in the cave, ochre was ground into fine powder, mixed with crushed quartz and other materials including charcoal and bone, and blended into a pigment mixture that was stored in two abalone shells. The pigment may have been used for painting, body decoration, or coloring of clothing.

"The recovery of these toolkits adds evidence for early technological and behavioural developments associated with humans and documents their deliberate planning, production and curation of pigmented compound and the use of containers. It also demonstrates that humans had an elementary knowledge of chemistry and the ability for long-term planning 100,000 years ago," concludes Henshilwood in a press release issued by the University of the Witwatersrand in Johannesburg. The article, "A 100,000-Year-Old Ochre-Processing Workshop at Blombos Cave, South Africa," appears in the October 14 issue of Science.

What is fascinating is how early all this occurred and just how complex the process was. It involved careful planning over time. It included surprisingly sophisticated technology (one is tempted to say “chemical engineering”). Why? What was stirring then, and how are we still inventing new ways to release the human imagination?

Wednesday, October 12, 2011

Still wondering..."What Defines Us?"

Last February the editors of Science marked the 10th anniversary of the publication of the complete human genome. They invited a dozen or so people to contribute short reflections on the meaning of this milestone.

I was given 250 words for my article, entitled “What Defines Us?” In that small space I tried to suggest two things. First I noted a simple irony: As we gain more scientific information about our genome and our evolution, the philosophical and religious concepts of humanity become blurred or defocused. Second, I suggested that this defocusing need not be a cause for discomfort. Precisely because we are humans—whatever exactly that means anymore—we switch almost immediately from discomfort to wonder and excitement. The quick switch is what makes us human. I ended by asking: “Who are we, and where will we go next?”

It is as if we are making ourselves up as we go along. Recent discoveries in human evolution intimate just such a view. Anatomically modern humans are now believed to have interbred with Neandertals and with the more recently discovered Denisovans. Many of us carry the DNA of these extinct forms of humanity in our own genome. In that sense they are not extinct at all but live on in every cell of our bodies. More recently, it has been suggested that similar interbreeding occurred in sub-Saharan Africa.

Interestingly, it has also been suggested that interbreeding enhanced us. In August, a research article in Science suggested that our immune systems are more resilient than they might have been. Why? Because our ancestors interbred with Neandertals and Denisovans.

The evolutionary tree of humanity is beginning to look less like a tree and more like a tangled vine. And now we are led to wonder whether it is the tangle that enhances us. What makes us less clearly Homo sapiens seems, paradoxically, to make us more extraordinary as a species. Well…we’re running far ahead of the science here, but (as I suggested in Science), wondering is what makes us human. It’s not just the DNA; it’s what we dare to do with it.

I have been thinking about this recently because I will be speaking on November 10 on this very topic at the University of South Florida Saint Petersburg. They have planned an exciting lecture series, Festival of the Genome: Celebrating the 10th Anniversary of the Sequencing of the Human Genome, showcasing various perspectives on human genome research.

Monday, October 18, 2010

Transhumanism and "Super-human"

I am working on a book on human enhancement through technology. It covers various technologies, including drugs for cognitive enhancement and strategies to extend the human lifespan. The core question for me is theological: what do these technologies mean in light of the classic Christian hope that our lives are to be transformed in Christ?

Part of the book deals with transhumanism, which is a movement that promotes the use of these technologies to enhance human capacities. Today I was looking especially at the antecedents of transhumanism. The word "transhuman"--considering all its cognates in Latin--seems to originate in Dante's Divine Comedy, Paradiso, I.70 (more on that in a later post). What's really interesting is that if we include the Greek equivalent--hyperanthropos--the earliest uses go all the way back to about 150 CE.

I ran across an interesting article by Paul Monaghan, who writes about aesthetics and theater. Here's an extended quote:

"I want to introduce another term here, the hyperanthropos or ‘more-than-man’. The first known use of the word itself is in a comic dialogue called Kataplous (chapter 16) by Lucian of Samosata in the second century A.D., in connection with Prometheus, the Titan god who taught mankind how to live independently of the gods by giving them fire, and who is strongly associated with Ananke and suffering in human life. But I am using the term hyperanthropos as a useful shorthand for a concept that had appeared in one form or the other from Homer and Hesiod onwards, was dominant in Greek culture, and has continued to play an important role in Western metaphysics. The hyperanthropos was, and is, either part man, part god (for example, the Homeric Heroes), or a man who is raised well above ordinary men by reason of his intellect (philosophers), physical abilities (athletes), or the great benefits he provides mankind (such as Prometheus). The protagonist in Greek tragedy was a hyperanthropos who had been ‘separated out’ from the Chorus, and whose actions had enormous ramifications for that community of ordinary (often very ordinary) men and women. Socrates himself was seen as a protagonist and hyperanthropos by Plato, and Plato has been regarded as such throughout the centuries, along with the ancient Greeks in general."

But what Monaghan doesn't say is that around the same time, a charismatic Christian named Montanus, later regarded as a heretic for (among other things) recognizing the leadership of women, used the same word--hyperanthropos--to describe ordinary believers.

Transhumanists themselves tend to credit Julian Huxley with creating the word "transhumanism." The fact that it goes back not just to Dante but to Lucian of Samosata and to the early Christian Montanus is more than just a correction. It shows that the hope for human enhancement is intertwined with--even rooted in--the longing for a more profound transformation. Transhumanists like to associate their vision of human transformation with Prometheus but not with Christ.

Wednesday, March 11, 2009

Embryonic Stem Cell Research Policy

Rob Stein’s article in yesterday’s Washington Post was one of the few reports to recognize that in announcing his support for federal funding for embryonic stem cell research, President Obama left several big questions unanswered.

According to Stein, NIH insiders were caught off guard. They had expected Obama to say that cell lines derived at any date from donated embryos would be eligible for research dollars. That step removes the barrier set by Pres. Bush, who made funds available only for cells derived from donated embryos before August 2001.

But Obama’s statement raised new questions:

First, will federal funding be available for research involving embryonic stem cells taken from embryos that were created specifically for research? Second, will funding be available for cells taken from cloned embryos?

The first of these questions is of course the more urgent of the two. A majority of Americans appear to support the view that research using cells taken from donated embryos is morally acceptable. These embryos already exist by the hundreds of thousands in storage in the nation’s fertility clinics. Most seem to think that it is better to donate them for research that could lead to medical breakthroughs than to destroy them.

But is it right for scientists to create embryos just for research that will not help the embryo but will instead destroy it? Here’s where many draw a line. It will be interesting, to put it mildly, to see if NIH agrees. What NIH will need to do, of course, is to assess carefully the impact of funding research on cells from donated embryos but not on cells from embryos created specifically for research. Will such a line really hold back science?

My own view is that, within strict limits and only if there is a need that cannot be met another way, it is permissible to create embryos for research. I believe that position can be argued on the basis of Christian theology and ethics. I also know that I am in a minority, and I am willing to recognize that policy in this field must be respectful of the deeply-held moral views of the majority. Recognizing that this area of research is contentious, I think it might be wise for NIH not to press too far in changing the guidelines. Unless it can be shown that science is seriously compromised by limiting federal funds to cells from donated embryos, it would be wise to put the limit right there.

The second question—whether federal funds will be available for research on cells from cloned embryos—is a bit more speculative. One of the great hopes for cloned embryos is that researchers could develop patient-specific stem cell lines, first to test drugs on human cell cultures and perhaps eventually to implant in patients without provoking the immune system. Now these hopes are largely addressed with the 2006-2007 breakthroughs in induced pluripotency. Whether cloning still provides a significant scientific advantage over induced pluripotency is something that will have to be argued.

If the argument for the need for cloned stem cells is compelling, then perhaps NIH should recommend that these cells be eligible for use in federally funded studies. But if not, then it would be best, I think, for the NIH to exclude this source of cells.

In his statement yesterday, President Obama was clear that he did not wish to open the door to reproductive cloning. Reproductive cloning is, of course, a far different thing from using cloning (more precisely, nuclear transfer) to create a cloned embryo for research. Even so, many in the general public link the two and object equally to both. Perhaps the most convincing way to close the door to reproductive cloning is to exclude funding for work on cells from cloned embryos.

On top of everything else, NIH will need to consider rules for informed consent, first of all for couples who donate embryos for research, but also for anyone who donates cells that are induced to become pluripotent stem cells. If that’s not enough, NIH and other agencies need to move forward to get ready to oversee clinical trials in this field.

All this is to say that the NIH will be doing two things at once. It will be closing the first stem cell debate (the Bush-era debate over the embryo as the source) and opening a whole new era of stem cell ethics, centered on the pluripotency of cells and on what they might become—in the laboratory, in the bodies of patients, and in the future of regenerative medicine.

Monday, March 9, 2009

Stem Cell Research and Christian Ethics

President Obama has opened a new era for stem cell research in America. He has asked the National Institutes of Health to draft new funding guidelines that will make federal research dollars available to US stem cell researchers without imposing unnecessary restrictions.

The restrictions were imposed on August 9, 2001, by Pres. George W. Bush. He approved the use of federal funds for embryonic stem cell research (something that outraged the religious right at the time), provided the cells were derived before the moment he gave the speech. Since then, the number of qualifying cell lines has shrunk while the number of new, unfunded lines has expanded.

Very few of us ever could grasp the moral difference between an embryo destroyed before August 2001 and one destroyed afterward. And most Americans favor embryonic stem cell research if it uses cells from embryos already created for fertility clinics but unused and ready to be destroyed anyway.

What's not so well known is that several religious groups support the Obama position. Jewish scholars are very clear in their support, as are experts in Islamic law. Christians are of course divided. The Vatican clearly opposes any use of embryos, but not all individual Catholics agree. Some Protestant denominations--the Presbyterian Church (USA), for example, or my own United Church of Christ--have gone on record supporting this research.

In the next few months, it will be interesting to see whether the NIH draft provides for funding for stem cell research on cell lines derived from embryos that were created especially for research. Here, more Americans are opposed, and NIH might be wise to draw a line: Offer funding for lines from donated embryos but not from embryos created expressly for research. Drawing the line at that point would also rule out cloning or nuclear transfer, since any cloned embryo is by definition created for research.

In the meantime, congratulations to Pres. Obama for recognizing the promise of this field of research, the moral complexities that lie ahead, and the need to set aside the artificial limits of the past while working toward consensus on the moral vision that guides the future.