After his conviction for the murder of a Sacramento store clerk in 1989, Charles Gaston faced execution. Pleading for mercy, his lawyer argued that Gaston wasn't fully responsible for his misdeeds, because he was a victim of fetal alcohol syndrome. Though Gaston was 29 at the time of the killing, his experiences in the womb, the lawyer claimed, bore part of the blame for his crime. Witnesses testified that Gaston's birth mother (he was subsequently adopted by another family) had been drunk throughout her pregnancy, and medical experts described the dire impact of fetal alcohol syndrome on thought, emotional stability and self-control.
As the science writer Annie Murphy Paul recounts in Origins: How the Nine Months Before Birth Shape the Rest of Our Lives, Gaston’s judge accepted the argument, sparing Gaston the death penalty. The case became an early example of the kinds of challenges that research on fetal experience poses for settled ideas about responsibility. Now, 20 years later, "fetal origins" — the theory that our pre-birth experiences in the womb determine much about our personalities, tastes, capabilities and vulnerabilities — has grown from being an exotic idea into a burgeoning medical research field. "People are starting to believe that what happens in the womb can affect us for a long time after," Matthew Gillman, a researcher at the Harvard Medical School and author of a paper titled "The Fetal Origin of Adult Disease: From Skeptic to Convert," tells Murphy Paul.
The data from the new discipline, Murphy Paul argues, demand new thinking about responsibility — not just in courtrooms, but also in schools, public-health agencies, disaster-relief efforts and day-to-day medical care. "Society's stake in healthy pregnancies is far larger than we knew," she writes. In nine clear and engaging chapters (one for each of the months of her own pregnancy as she wrote the book), she makes a good case.
The central idea of fetal-origins research is that a developing embryo is built to receive cues about the kind of world it will live in. Those cues might tell one baby-to-be to prepare itself for a world of abundance and safety, and another to be ready for hunger and fear. Those cues derive from the mother’s experiences during her pregnancy.
This idea, Murphy Paul points out, scrambles the familiar lines of any nature-nurture model of responsibility. For example, Americans are used to the message that lack of exercise and a high-fat, high-carbohydrate diet increase one's risk of developing diabetes. And some populations, like the Pima Indians of Arizona, have fatalistically accepted that they have "bad genes" that make this modern diet exceptionally dangerous. (More than half of Pima adults older than 35 have diabetes.) But long-range studies have found that the greatest risk factor for the disease among the Pima is neither DNA nor lifestyle, but simply whether, as fetuses, they had a diabetic mother.
Data like these converted a once-skeptical economist named Douglas Almond, who studied the cohort of children born during the 1918 influenza pandemic. Murphy Paul describes how Almond found that children born during the height of that infection did worse on many measures throughout their lives, compared with people who'd been born right before or right after 1918. For instance, male children born during the pandemic grew to be markedly shorter than young men who were only a few months older or younger (a fact made clear by military records kept when, as grown-ups, they registered to fight in World War Two). The influenza cohort, Almond found, was 15 percent more likely than its near-peers to drop out of high school; its members earned lower wages throughout life and as older adults were 20 percent more likely to be disabled. Far from being a well-protected, inevitable expression of its genes, a developing baby is highly sensitive to the hurts and jolts of its mother's world. The stress of a mother's illness, Murphy Paul writes, marks a person well into old age.
So does the stress of hunger: In studies of the effects of fasting on fetal outcome, Almond found that Muslim babies born nine months after the Ramadan fast were 22 percent more likely to be disabled as adults, and had higher rates of trouble with vision, hearing and learning. (Ramadan is a useful example because it cycles through the Western calendar year; the effects of 30 days of all-day fasting can thus be distinguished from the effects of winter or a bad harvest.) Similarly, the 40,000 babies gestated during Holland's "Hunger Winter" of 1944-1945 grew up to have more obesity, more diabetes and more heart trouble in their ranks than their compatriots who developed free of war-induced starvation.
"These individuals' prenatal experience of starvation seems to have changed their bodies in myriad ways," Murphy Paul writes. "They have higher blood pressure, poorer cholesterol profiles, and reduced glucose tolerance, a precursor of diabetes."
Other researchers have noted the impacts of fetal experience in such varied aspects of adult life as the propensity to asthma (asthma rates are lower in babies whose mothers took more vitamin D while pregnant), homosexuality, IQ test scores, reading difficulties and even the taste for licorice. (French babies whose mothers ate anise while pregnant, one study found, were happy to eat food with that flavor, but pregnant women who didn't touch the stuff bore babies who disliked it).
All this information could be interpreted to mean that modern mothers ought to consider themselves even more responsible for their babies' lives than they already do. But this is an intellectual dead end, Murphy Paul argues. For one thing, emphasizing the isolated mother's responsibility leads to a kind of infinite regress: Your child's fate is set by how you behaved in pregnancy, but that was set by how your mother behaved while she was carrying you, and hers by her mother, and so on until the buck passes back to Eve.
In fact, Murphy Paul argues, "fetal origins" cannot displace the importance of taking responsibility for one's own life. Gaston, for instance, could have decided not to wrestle for the store clerk's gun; he was spared the gas chamber, but he still got life in prison. In fact, knowledge of fetal-development effects may make personal responsibility even more important: If you know your gestation predisposed you to heart trouble, that's all the more reason to eat right and exercise.
In the near future, Murphy Paul suggests, our options might also include drugs developed specifically to help people whose mothers lived through a natural disaster while pregnant. Experiments on rats and rabbits, Murphy Paul writes, hint that someday adults who had "a less than ideal fetal life" could get a pharmaceutical "do-over."
But a purely individual account of personal responsibility isn’t broad enough to fit the data, Murphy Paul adds. After all, a lot of maternal experience is beyond any individual's control. Dutch mothers in late 1944 certainly didn't choose to go hungry, and very few pregnant women today refuse prenatal care. If they lack it, it's because they have no access to it.
So the biggest impact of fetal-development research should be on our thinking about collective responsibility, not individual blame, Murphy Paul believes. The field's rapidly growing data suggest that political and economic decisions have bigger repercussions on future generations than we once imagined. Hence a number of Murphy Paul's suggestions center on society's responsibility to pregnant women: Improve access to prenatal care for all mothers-to-be. Make sure healthy food is available to all. Rework disaster-relief plans to make the protection and sustenance of pregnant women a high priority. Provide the means for pregnant women to protect themselves from stress, smoking and chemicals that endanger the fetus.
All of that may sound a bit pie-in-the-sky during a severe economic downturn, but Murphy Paul offers a strong argument that, as she puts it, the health and well-being of fetuses are "a matter of concern for our entire society, not simply women who happen to be expecting a child." Perhaps, then, society has an unacknowledged responsibility to make the womb a healthier environment than it is now. After all, she writes, "such investments in the well-being of pregnant women and fetuses could result in a great upward leap in the population's health, akin to jumps seen in earlier eras following widespread improvements in nutrition or sanitation."
One reason for the "French paradox" of rich foods enjoyed by people with low levels of heart disease, the British researcher David Barker has concluded, is probably the systematic attention that France has paid over the last century to maternal health and nutrition. As Murphy Paul notes, this tradition doesn't date from a time of sentimentality. The country's pro-natal policies were put in place after the Franco-Prussian War, under the theory that well-tended mothers would produce more and better potential soldiers. A new concept of collective responsibility to fetuses, in other words, might even be seen as a matter of national security.
David Berreby blogs about behavior at Bigthink.com and has written about science for The New Yorker, The New York Times Magazine, and many other publications. He is the author of Us and Them: The Science of Identity, published by Little, Brown.