How the Black Death Spawned the Minimum Wage
by Stephen Mihm / Sep 5, 2013
Fast-food joints, long inhospitable to any kind of labor activism, are suddenly beset by a surge in strikes. Over the past few months, workers at chains such as McDonald’s Corp. have walked off the job in more than 60 cities, demanding a “living wage” of $15 an hour. Regardless of whether the strikes lead to better pay, they have rekindled debate over what constitutes a living wage. That debate, however, has stranger, older and more curious origins than either proponents or detractors of the living wage might imagine.
The story begins in medieval England in the 14th century. Life, never particularly easy at this time in history, had become especially nasty, brutish and short. Not long before, the “Great Pestilence,” better known as the Black Death, had arrived in continental Europe. The pandemic, one contemporary noted, “began in India and, raging through the whole of infidel Syria and Egypt,” reached England in 1349, “where the same mortality destroyed more than a third of the men, women and children.” Once the dead had been buried, feudal society was shaken to its core by a startling realization. As this same chronicler complained, “there was such a shortage of servants, craftsmen, and workmen, and of agricultural workers and labourers, that a great many lords and people … were yet without all service and attendance.” Survivors could now command much higher compensation for their work, and they weren’t shy about asking for it: “The humble turned up their noses at employment, and could scarcely be persuaded to serve the eminent unless for triple wages.” In response, King Edward III — a wealthy landowner who was as dependent on serfs as were his many lords — issued the “Ordinance of Labourers,” which put a ceiling on how much workers could charge for their labor, setting wages at pre-plague levels. Subsequent amendments to the law — for example, the Statute of Labourers in 1351 — amplified the penalties for paying above set rates.
These laws effectively set what we would call a maximum wage. But the measures reflected something a bit more complicated than an attempt to stick it to the serfs. They embodied a distinctly medieval belief that one’s earnings should be commensurate with one’s station in life. The Catholic theologian Thomas Aquinas wrote that a man’s “external riches” must be sufficient “for him to live in keeping with his condition in life.” Anything less was cruel; anything more was an enticement to sin and a threat to the social order. As the historian Kevin Blackburn has convincingly argued, while laws governing wages initially set a ceiling on compensation, they were ultimately used to set a living wage, arguably as early as 1389, when an amendment to the Statute of Labourers effectively pegged wages to the price of food.
As the centuries passed, the justices of the peace charged with setting maximum wages appear to have begun setting formal minimum wages as well, though the evidence is fragmentary. Nonetheless, the practice eventually gained statutory recognition with the passage of an “Act Fixing a Minimum Wage,” issued in 1604 during the reign of James I and aimed at workers in the textile industry. The idea of encumbering wages with either an upper or lower limit would soon fall victim to the liberalizing tendencies of an increasingly capitalistic England. By the early 19th century, the Statutes of Labourers had been repealed. But the argument over wages didn’t disappear. As labor unrest swept many industrial nations in the 19th century, the concept of the minimum wage or living wage resurfaced in unexpected places.
The first was the Vatican. In 1891, Pope Leo XIII offered a distinctly medieval take on the labor question. In his Rerum Novarum, the pontiff called for the passage of laws to remove “the causes which lead to conflicts between employers and [the] employed.” Foremost among those causes, he averred, was the insufficiency of wages. “To defraud any one of wages that are his due is a great crime which cries to the avenging anger of Heaven,” he declared. But there was an easier way to solve the problem than involving the Almighty. Instead, the pope counseled the revival of the medieval living wage, arguing that the compensation of a wage earner should be sufficient “to support a frugal and well-behaved wage-earner.” The encyclical resonated in nations that had high numbers of both Catholics and aggrieved workers. Among these was Australia, which along with New Zealand would become a cradle of the modern minimum wage movement. In the 1890s, Australian Catholics began agitating for the implementation of a living wage. The year of the encyclical, Australian Cardinal Patrick Francis Moran called for wages sufficient to “yield a competence … for the frugal support of [a worker’s] wife and family.”
The first genuine minimum wage laws were established in the states of Victoria (1894) and New South Wales (1895). They dictated that unskilled workers employed by the government be paid a living wage of seven shillings a day. As one legislator declared in 1894, “the workers should have a rate of payment which would enable them to maintain themselves and their families in decent comfort.” In the succeeding years, support for minimum wage legislation grew. Catholic reformers continued to revive the medieval idea of a living wage. Foremost among these figures was Henry Bournes Higgins, the presiding judge in the Commonwealth Court of Conciliation and Arbitration.
In 1907, Higgins heard a case involving the Sunshine Harvester Works, the largest manufacturer of farming implements in Australia. Under a newly passed law, the company would have to pay a significant tax unless it could prove that it paid its workers “fair and reasonable” wages. The law didn’t set those wages; it was up to the court to decide whether Harvester met that threshold. Higgins rejected the company’s claims that it paid reasonable wages. More important, Higgins declared that the court had the right to set a national minimum wage in the private sector, and he did: seven shillings a day for those working at unskilled labor. Higgins declared that a living wage must be sufficient to provide a “reasonable and frugal comfort.” As Blackburn observed, Higgins effectively “secularized the living wage,” reviving a medieval concept for modern times.
Though Harvester managed to get the decision reversed by a higher court, the opinion quickly became iconic. Higgins and his judicial allies managed to secure widespread acceptance of the idea of a national minimum wage through other opinions. The minimum wage was here to stay. Australia soon became a kind of Mecca for reformers elsewhere, who made the pilgrimage to study these and other innovations firsthand. When reformers in the U.S. proposed a minimum wage to drive wages up, they immediately went Down Under.
Why a medieval peasant got more vacation time than you
by Lynn Parramore / August 29, 2013
Life for the medieval peasant was certainly no picnic. His life was shadowed by fear of famine, disease and bursts of warfare. His diet and personal hygiene left much to be desired. But despite his reputation as a miserable wretch, you might envy him one thing: his vacations. Plowing and harvesting were backbreaking toil, but the peasant enjoyed anywhere from eight weeks to half the year off. The Church, mindful of how to keep a population from rebelling, enforced frequent mandatory holidays. Weddings, wakes and births might mean a week off quaffing ale to celebrate, and when wandering jugglers or sporting events came to town, the peasant expected time off for entertainment. There were labor-free Sundays, and when the plowing and harvesting seasons were over, the peasant got time to rest, too. In fact, economist Juliet Schor found that during periods of particularly high wages, such as 14th-century England, peasants might put in no more than 150 days a year.
As for the modern American worker? After a year on the job, she gets an average of eight vacation days annually. It wasn’t supposed to turn out this way: John Maynard Keynes, one of the founders of modern economics, made a famous prediction that by 2030, advanced societies would be wealthy enough that leisure time, rather than work, would characterize national lifestyles. So far, that forecast is not looking good. What happened? Some cite the victory of the modern eight-hour day, 40-hour workweek over the punishing 70 or 80 hours a 19th-century worker spent toiling as proof that we’re moving in the right direction. But Americans have long since kissed the 40-hour workweek goodbye, and Schor’s examination of work patterns reveals that the 19th century was an aberration in the history of human labor. When workers fought for the eight-hour workday, they weren’t trying to get something radical and new, but rather to restore what their ancestors had enjoyed before industrial capitalists and the electric lightbulb came on the scene. Go back 200, 300 or 400 years and you find that most people did not work very long hours at all. In addition to relaxing during long holidays, the medieval peasant took his sweet time eating meals, and the day often included time for an afternoon snooze. “The tempo of life was slow, even leisurely; the pace of work relaxed,” notes Schor. “Our ancestors may not have been rich, but they had an abundance of leisure.”
Fast-forward to the 21st century, and the U.S. is the only advanced country with no national vacation policy whatsoever. Many American workers must keep on working through public holidays, and vacation days often go unused. Even when we finally carve out a holiday, many of us answer emails and “check in” whether we’re camping with the kids or trying to kick back on the beach. Some blame the American worker for not taking what is her due. But in a period of consistently high unemployment, job insecurity and weak labor unions, employees may feel no choice but to accept the conditions set by the culture and the individual employer. In a world of “at will” employment, where the work contract can be terminated at any time, it’s not easy to raise objections.
It’s true that the New Deal brought back some of the conditions that farm workers and artisans from the Middle Ages took for granted, but since the 1980s things have gone steadily downhill. With secure long-term employment slipping away, people jump from job to job, so seniority no longer offers the benefits of additional days off. The rising trend of hourly and part-time work, stoked by the Great Recession, means that for many, the idea of a guaranteed vacation is a dim memory. Ironically, this cult of endless toil doesn’t really help the bottom line. Study after study shows that overworking reduces productivity. On the other hand, performance increases after a vacation, and workers come back with restored energy and focus. The longer the vacation, the more relaxed and energized people feel upon returning to the office.
Economic crises give austerity-minded politicians excuses to talk of decreasing time off, increasing the retirement age and cutting into social insurance programs and safety nets that were supposed to allow us a fate better than working until we drop. In Europe, where workers average 25 to 30 days off per year, politicians like French President François Hollande and Greek Prime Minister Antonis Samaras are sending signals that the culture of longer vacations is coming to an end. But the belief that shorter vacations bring economic gains doesn’t appear to add up. According to the Organisation for Economic Co-operation and Development (OECD), the Greeks, who face a horrible economy, work more hours than any other Europeans. In Germany, an economic powerhouse, workers rank second to last in number of hours worked. Despite more time off, German workers are the eighth most productive in Europe, while the long-toiling Greeks rank 24th out of 25 in productivity.
Beyond burnout, vanishing vacations make our relationships with families and friends suffer. Our health is deteriorating: depression and higher risk of death are among the outcomes for our no-vacation nation. Some forward-thinking people have tried to reverse this trend, like progressive economist Robert Reich, who has argued in favor of a mandatory three weeks off for all American workers. Congressman Alan Grayson proposed the Paid Vacation Act of 2009, but alas, the bill didn’t even make it to the floor of Congress. Speaking of Congress, its members seem to be the only people in America getting as much down time as the medieval peasant. They get 239 days off this year.
How Poverty Taxes the Brain
by Emily Badger / Aug 29, 2013
Human mental bandwidth is finite. You’ve probably experienced this before (though maybe not in those terms): When you’re lost in concentration trying to solve a problem like a broken computer, you’re more likely to neglect other tasks, things like remembering to take the dog for a walk, or picking your kid up from school. This is why people who use cell phones behind the wheel actually perform worse as drivers. It’s why air traffic controllers focused on averting a mid-air collision are less likely to pay attention to other planes in the sky. We only have so much cognitive capacity to spread around. It’s a scarce resource. This understanding of the brain’s bandwidth could fundamentally change the way we think about poverty. Researchers publishing some groundbreaking findings today in the journal Science have concluded that poverty imposes such a massive cognitive load on the poor that they have little bandwidth left over to do many of the things that might lift them out of poverty – like go to night school, or search for a new job, or even remember to pay bills on time.
In a series of experiments run by researchers at Princeton, Harvard, and the University of Warwick, low-income people who were primed to think about financial problems performed poorly on a series of cognition tests, saddled with a mental load that was the equivalent of losing an entire night’s sleep. Put another way, the condition of poverty imposed a mental burden akin to losing 13 IQ points, or comparable to the cognitive difference that’s been observed between chronic alcoholics and normal adults. The finding further undercuts the theory that poor people, through inherent weakness, are responsible for their own poverty – or that they ought to be able to lift themselves out of it with enough effort. This research suggests that the reality of poverty actually makes it harder to execute fundamental life skills. Being poor means, as the authors write, “coping with not just a shortfall of money, but also with a concurrent shortfall of cognitive resources.” This explains, for example, why poor people who aren’t good with money might also struggle to be good parents. The two problems aren’t unconnected. “It’s the same bandwidth,” says Princeton’s Eldar Shafir, one of the authors of the study alongside Anandi Mani, Sendhil Mullainathan, and Jiaying Zhao. Poor people live in a constant state of scarcity (in this case, scarce mental bandwidth), a debilitating environment that Shafir and Mullainathan describe in a book to be published next week, Scarcity: Why Having Too Little Means So Much. What Shafir and his colleagues have identified is not exactly stress. Rather, poverty imposes something else on people that impedes them even when biological markers of stress (like elevated heart rates and blood pressure) aren’t present. Stress can also positively affect us in small quantities. An athlete under stress, for example, may actually perform better. Stress follows a kind of classic curve: a little bit can help, but beyond a certain point, too much of it will harm us.
This picture of cognitive bandwidth looks different. To study it, the researchers performed two sets of experiments. In the first, about 400 randomly chosen people in a New Jersey mall were asked how they would respond to a scenario where their car required either $150 or $1,500 in repairs. Would they pay for the work in full, take out a loan, or put off the repair? How would they make that decision? The subjects varied in annual income from $20,000 to $70,000. Before responding, the subjects were given a series of common tests (identifying sequences of shapes and numbers, for example) measuring cognitive function and fluid intelligence. In the easier scenario, where the hypothetical repair cost only $150, subjects classified as “poor” and “rich” performed equally well on these tests. But the “poor” subjects performed noticeably worse in the $1,500 scenario. Simply asking these people to think about financial problems taxed their mental bandwidth. “And these are not people in abject poverty,” Shafir says. “These are regular folks going to the mall that day.”
The “rich” subjects in the study experienced no such difficulty. In the second experiment, the researchers found similar results when working with a group of farmers in India who experience a natural annual cycle of poverty and plenty. These farmers receive 60 percent of their annual income in one lump sum after the sugarcane harvest. Beforehand, they are essentially poor. Afterward (briefly), they’re not. In the state of pre-harvest poverty, however, they exhibited the same shortage of cognitive bandwidth seen in the American subjects in a New Jersey mall. The design of these experiments wasn’t particularly groundbreaking, which makes it all the more astounding that we’ve never previously understood this connection between cognition and poverty. “This project, there’s nothing new in it, there’s no new technology, this could have been done years ago,” Shafir says. But the work is the product of the relatively new field of behavioral economics. Previously, cognitive psychologists seldom studied the differences between different socio-economic populations (“a brain is a brain, a head is a head,” Shafir says). Meanwhile, other psychology and economics fields were studying different populations but not cognition. Now that all of these perspectives have come together, the implications for how we think about poverty – and design programs for people impacted by it – are enormous. Solutions that make financial life easier for poor people don’t simply change their financial prospects. When a poor person receives a regular direct-deposited paycheck every Friday, that does more than simply relieve the worry over when money will come in next. “When we do that, we liberate some bandwidth,” Shafir says. Policymakers tend to evaluate the success of financial programs aimed at the poor by measuring how they do financially. 
“The interesting thing about this perspective is that it says if I make your financial life easier, if I give you more bandwidth, what I really ought to look at is how you’re doing in your life. You might be doing better parenting. You might be adhering to your medication better.”
The limited bandwidth created by poverty directly impacts the cognitive control and fluid intelligence that we need for all kinds of everyday tasks. “When your bandwidth is loaded, in the case of the poor,” Shafir says, “you’re just more likely to not notice things, you’re more likely to not resist things you ought to resist, you’re more likely to forget things, you’re going to have less patience, less attention to devote to your children when they come back from school.” At the macro level, this means we lost an enormous amount of cognitive ability during the recession. Millions of people had less bandwidth to give to their children, or to remember to take their medication. Conversely, going forward, this also means that anti-poverty programs could have a huge benefit that we’ve never recognized before: Help people become more financially stable, and you also free up their cognitive resources to succeed in all kinds of other ways as well. For all the value in this finding, it’s easy to imagine how proponents of hackneyed arguments about poverty might twist the fundamental relationship between cause and effect here. If living in poverty is the equivalent of losing 13 points in IQ, doesn’t that mean people with lower IQs wind up in poverty? “We’ve definitely worried about that,” Shafir says. Science, though, is coalescing around the opposite explanation. “All the data shows it isn’t about poor people, it’s about people who happen to be in poverty. All the data suggests it is not the person, it’s the context they’re inhabiting.”
Nathan Yau’s data visualization maps the food deserts in the United States.
Violent Behavior Linked to Nutritional Deficiencies / Sep 3, 2013
Deficiencies of vitamins A, D, K, B1, B3, B6, B12 and folate, and of minerals iodine, potassium, iron, magnesium, zinc, chromium and manganese can all contribute to mental instability and violent behavior, according to a report published in the Spring 2013 issue of Wise Traditions, the journal of the Weston A. Price Foundation. The article, “Violent Behavior: A Solution in Plain Sight,” by Sylvia Onusic, PhD, CNS, LDN, seeks reasons for the increase in violent behavior in America, especially among teenagers. “We can blame violence on the media and on the breakdown of the home,” says Onusic, “but the fact is that a large number of Americans, living mostly on devitalized processed food, are suffering from malnutrition. In many cases, this means their brains are starving.”
In fact, doctors are seeing a return of nutritional deficiency diseases such as scurvy and pellagra, which were declared eradicated long ago by public health officials. Many of these conditions cause brain injuries as well. Symptoms of pellagra, a disease caused by deficiency of vitamin B3, include anxiety, hyperactivity, depression, fatigue, headache, insomnia and hallucinations. Zinc deficiency is linked with angry, aggressive, and hostile behaviors that result in violence. The best dietary sources of zinc are red meat and shellfish. Leaky gut and gluten sensitivities may exacerbate nutrient deficiencies. Gluten intolerance is strongly linked with schizophrenia. “Making things worse are excitotoxins so prevalent in the food supply, such as MSG and Aspartame,” says Onusic. “People who live on processed food and who drink diet sodas are exposed to these mind-altering chemicals at very high levels.” In an effort to curb child obesity, the dairy industry recently petitioned the FDA to include aspartame and other artificial sweeteners in dairy beverages featured in school lunches, without appropriate labeling. Recent research has established that aspartame actually leads to weight gain because of its effect on insulin. Other ingredients in the food supply linked to violent behavior include sugar, artificial colors and flavorings, caffeine, alcohol and soy foods. The toxic environmental burden includes mercury, arsenic, lead, fire retardants, pesticides, heavy metals and Teflon. Adding psychiatric drugs to this mix puts everyone at risk. “The only solution to the mounting levels of violence is a return to real, nutrient-dense food,” says Sally Fallon Morell, president of the Weston A. Price Foundation. “We must create a culture in which eating processed food is seen as uncool, and in which home cooking is embraced as a life-enhancing skill.”
The Weston A. Price Foundation has pointed out the poor nutritional quality of school lunches and the flaws in the USDA dietary guidelines, which schools receiving federal funding are required to follow. At a press conference in January 2010, the Foundation proposed guidelines that include eggs, organ meats and healthy animal fats. “Our brains need cholesterol to function properly,” said Fallon Morell, “and our children need cholesterol-rich food for optimal mental and emotional development.” Studies have shown that depressed individuals, offenders who show the most violent behavior, and the most violent suicides have low cholesterol levels.
Grandma’s Experiences Leave a Mark on Your Genes
Your ancestors’ lousy childhoods or excellent adventures might change your personality, bequeathing anxiety or resilience by altering the epigenetic expressions of genes in the brain.
by Dan Hurley / June 11, 2013
Darwin and Freud walk into a bar. Two alcoholic mice — a mother and her son — sit on two bar stools, lapping gin from two thimbles. The mother mouse looks up and says, “Hey, geniuses, tell me how my son got into this sorry state.” “Bad inheritance,” says Darwin. “Bad mothering,” says Freud. For over a hundred years, those two views — nature or nurture, biology or psychology — offered opposing explanations for how behaviors develop and persist, not only within a single individual but across generations. And then, in 1992, two young scientists following in Freud’s and Darwin’s footsteps actually did walk into a bar. And by the time they walked out, a few beers later, they had begun to forge a revolutionary new synthesis of how life experiences could directly affect your genes — and not only your own life experiences, but those of your mother, your grandmother and beyond. The bar was in Madrid, where the Cajal Institute, Spain’s oldest academic center for the study of neurobiology, was holding an international meeting. Moshe Szyf, a molecular biologist and geneticist at McGill University in Montreal, had never studied psychology or neurology, but he had been talked into attending by a colleague who thought his work might have some application. Likewise, Michael Meaney, a McGill neurobiologist, had been talked into attending by the same colleague, who thought Meaney’s research into animal models of maternal neglect might benefit from Szyf’s perspective. “I can still visualize the place — it was a corner bar that specialized in pizza,” Meaney says. “Moshe, being kosher, was interested in kosher calories. Beer is kosher. Moshe can drink beer anywhere. And I’m Irish. So it was perfect.” The two engaged in animated conversation about a hot new line of research in genetics.
Since the 1970s, researchers had known that the tightly wound spools of DNA inside each cell’s nucleus require something extra to tell them exactly which genes to transcribe, whether for a heart cell, a liver cell or a brain cell.
One such extra element is the methyl group, a common structural component of organic molecules. The methyl group works like a placeholder in a cookbook, attaching to the DNA within each cell to select only those recipes — er, genes — necessary for that particular cell’s proteins. Because methyl groups are attached to the genes, residing beside but separate from the double-helix DNA code, the field was dubbed epigenetics, from the prefix epi (Greek for over, outer, above). Originally these epigenetic changes were believed to occur only during fetal development. But pioneering studies showed that molecular bric-a-brac could be added to DNA in adulthood, setting off a cascade of cellular changes resulting in cancer. Sometimes methyl groups attached to DNA thanks to changes in diet; other times, exposure to certain chemicals appeared to be the cause. Szyf showed that correcting epigenetic changes with drugs could cure certain cancers in animals. Geneticists were especially surprised to find that epigenetic change could be passed down from parent to child, one generation after the next. A study from Randy Jirtle of Duke University showed that when female mice are fed a diet rich in methyl groups, the fur pigment of subsequent offspring is permanently altered. Without any change to DNA at all, methyl groups could be added or subtracted, and the changes were inherited much like a mutation in a gene. Now, at the bar in Madrid, Szyf and Meaney considered a hypothesis as improbable as it was profound: If diet and chemicals can cause epigenetic changes, could certain experiences — child neglect, drug abuse or other severe stresses — also set off epigenetic changes to the DNA inside the neurons of a person’s brain? That question turned out to be the basis of a new field, behavioral epigenetics, now so vibrant it has spawned dozens of studies and suggested profound new treatments to heal the brain. 
According to the new insights of behavioral epigenetics, traumatic experiences in our past, or in our recent ancestors’ past, leave molecular scars adhering to our DNA. Jews whose great-grandparents were chased from their Russian shtetls; Chinese whose grandparents lived through the ravages of the Cultural Revolution; young immigrants from Africa whose parents survived massacres; adults of every ethnicity who grew up with alcoholic or abusive parents — all carry with them more than just memories. Like silt deposited on the cogs of a finely tuned machine after the seawater of a tsunami recedes, our experiences, and those of our forebears, are never gone, even if they have been forgotten. They become a part of us, a molecular residue holding fast to our genetic scaffolding. The DNA remains the same, but psychological and behavioral tendencies are inherited. You might have inherited not just your grandmother’s knobby knees, but also her predisposition toward depression caused by the neglect she suffered as a newborn. Or not. If your grandmother was adopted by nurturing parents, you might be enjoying the boost she received thanks to their love and support. The mechanisms of behavioral epigenetics underlie not only deficits and weaknesses but strengths and resiliencies, too. And for those unlucky enough to descend from miserable or withholding grandparents, emerging drug treatments could reset not just mood, but the epigenetic changes themselves. Like grandmother’s vintage dress, you could wear it or have it altered. The genome has long been known as the blueprint of life, but the epigenome is life’s Etch A Sketch: Shake it hard enough, and you can wipe clean the family curse.
Twenty years after helping to set off a revolution, Meaney sits behind a wide walnut table that serves as his desk. A January storm has deposited half a foot of snow outside the picture windows lining his fourth-floor corner office at the Douglas Institute, a mental health affiliate of McGill. He has the rugged good looks and tousled salt-and-pepper hair of someone found on a ski slope — precisely where he plans to go this weekend. On the floor lies an arrangement of helium balloons in various stages of deflation. “Happy 60th!” one announces. “I’ve always been interested in what makes people different from each other,” he says. “The way we act, the way we behave — some people are optimistic, some are pessimistic. What produces that variation? Evolution selects the variance that is most successful, but what produces the grist for the mill?” Meaney pursued the question of individual differences by studying how the rearing habits of mother rats caused lifelong changes in their offspring. Research dating back to the 1950s had shown that rats handled by humans for as little as five to 15 minutes per day during their first three weeks of life grew up to be calmer and less reactive to stressful environments compared with their non-handled littermates. Seeking to tease out the mechanism behind such an enduring effect, Meaney and others established that the benefit was not actually conveyed by the human handling. Rather, the handling simply provoked the rats’ mothers to lick and groom their pups more, and to engage more often in a behavior called arched-back nursing, in which the mother gives the pups extra room to suckle against her underside. “It’s all about the tactile stimulation,” Meaney says. In a landmark 1997 paper in Science, he showed that natural variations in the amount of licking and grooming received during infancy had a direct effect on how stress hormones, including corticosterone, were expressed in adulthood.
The more licking as babies, the lower the stress hormones as grown-ups. It was almost as if the mother rats were licking away at a genetic dimmer switch. What the paper didn’t explain was how such a thing could be possible. “What we had done up to that point in time was to identify maternal care and its influence on specific genes,” Meaney says. “But epigenetics wasn’t a topic I knew very much about.” And then he met Szyf.
“I was going to be a dentist,” Szyf says with a laugh. Slight, pale and balding, he sits in a small office at the back of his bustling laboratory — a room so Spartan, it contains just a single picture, a photograph of two embryos in a womb. Needing to write a thesis in the late 1970s for his doctorate in dentistry at Hebrew University of Jerusalem, Szyf approached a young biochemistry professor named Aharon Razin, who had recently made a splash by publishing his first few studies in some of the world’s top scientific journals. The studies were the first to show that the action of genes could be modulated by structures called methyl groups, a subject about which Szyf knew precisely nothing. But he needed a thesis adviser, and Razin was there. Szyf found himself swept up to the forefront of the hot new field of epigenetics and never looked back. Until researchers like Razin came along, the basic story line on how genes get transcribed in a cell was neat and simple. DNA is the master code, residing inside the nucleus of every cell; RNA transcribes the code to build whatever proteins the cell needs. Then some of Razin’s colleagues showed that methyl groups could attach to cytosine, one of the chemical bases in DNA and RNA. It was Razin, working with fellow biochemist Howard Cedar, who showed these attachments weren’t just brief, meaningless affairs. The methyl groups could become married permanently to the DNA, getting replicated right along with it through a hundred generations. As in any good marriage, moreover, the attachment of the methyl groups significantly altered the behavior of whichever gene they wed, inhibiting its transcription, much like a jealous spouse. It did so, Razin and Cedar showed, by tightening the thread of DNA as it wrapped around a molecular spool, called a histone, inside the nucleus. The tighter it is wrapped, the harder it is to produce proteins from the gene.
Consider what that means: Without a mutation to the DNA code itself, the attached methyl groups cause long-term, heritable change in gene function. Other molecules, called acetyl groups, were found to play the opposite role, unwinding DNA around the histone spool, and so making it easier for RNA to transcribe a given gene. By the time Szyf arrived at McGill in the late 1980s, he had become an expert in the mechanics of epigenetic change. But until meeting Meaney, he had never heard anyone suggest that such changes could occur in the brain, simply due to maternal care. “It sounded like voodoo at first,” Szyf admits. “For a molecular biologist, anything that didn’t have a clear molecular pathway was not serious science. But the longer we talked, the more I realized that maternal care just might be capable of causing changes in DNA methylation, as crazy as that sounded. So Michael and I decided we’d have to do the experiment to find out.”
Actually, they ended up doing a series of elaborate experiments. With the assistance of postdoctoral researchers, they began by selecting mother rats who were either highly attentive or highly inattentive. Once a pup had grown up into adulthood, the team examined its hippocampus, a brain region essential for regulating the stress response. In the pups of inattentive mothers, they found that genes governing the production of glucocorticoid receptors, which regulate sensitivity to stress hormones, were highly methylated; in the pups of conscientious moms, the genes for the glucocorticoid receptors were rarely methylated. Methylation just gums up the works: the less of it, the more readily the affected gene can be transcribed. In this case, methylation associated with miserable mothering prevented the normal number of glucocorticoid receptors from being transcribed in the baby’s hippocampus. And so for want of sufficient glucocorticoid receptors, the rats grew up to be nervous wrecks. To demonstrate that the effects were purely due to the mother’s behavior and not her genes, Meaney and colleagues performed a second experiment. They took rat pups born to inattentive mothers and gave them to attentive ones, and vice versa. As they predicted, the rats born to attentive mothers but raised by inattentive ones grew up to have low levels of glucocorticoid receptors in their hippocampus and behaved skittishly. Likewise, those born to bad mothers but raised by good ones grew up to be calm and brave and had high levels of glucocorticoid receptors.
Before publishing their findings, Meaney and Szyf conducted a third crucial experiment, hoping to overwhelm the inevitable skeptics who would rise up to question their results. After all, it could be argued, what if the epigenetic changes observed in the rats’ brains were not directly causing the behavioral changes in the adults, but were merely co-occurring? Freud certainly knew the enduring power of bad mothers to screw up people’s lives. Maybe the emotional effects were unrelated to the epigenetic change. To test that possibility, Meaney and Szyf took yet another litter of rats raised by rotten mothers. This time, after the usual damage had been done, they infused their brains with trichostatin A, a drug that can remove methyl groups. These animals showed none of the behavioral deficits usually seen in such offspring, and their brains showed none of the epigenetic changes. “It was crazy to think that injecting it straight into the brain would work,” says Szyf. “But it did. It was like rebooting a computer.” Despite such seemingly overwhelming evidence, when the pair wrote it all up in a paper, one of the reviewers at a top science journal refused to believe it, stating he had never before seen evidence that a mother’s behavior could cause epigenetic change. “Of course he hadn’t,” Szyf says. “We wouldn’t have bothered to report the study if it had already been proved.” In the end, their landmark paper, “Epigenetic programming by maternal behavior,” was published in June 2004 in the journal Nature Neuroscience. Meaney and Szyf had proved something incredible. Call it postnatal inheritance: With no changes to their genetic code, the baby rats nonetheless gained genetic attachments due solely to their upbringing — epigenetic additions of methyl groups sticking like umbrellas out the elevator doors of their histones, gumming up the works and altering the function of the brain.
The Beat Goes On
Together, Meaney and Szyf have gone on to publish some two dozen papers, finding evidence along the way of epigenetic changes to many other genes active in the brain. Perhaps most significantly, in a study led by Frances Champagne — then a graduate student in Meaney’s lab, now an associate professor with her own lab at Columbia University in New York — they found that inattentive mothering in rodents causes methylation of the genes for estrogen receptors in the brain. When those babies grow up, the resulting decrease of estrogen receptors makes them less attentive to their babies. And so the beat goes on. As animal experiments continue apace, Szyf and Meaney have entered into the next great step in the study of behavioral epigenetics: human studies. In a 2008 paper, they compared the brains of people who had committed suicide with the brains of people who had died suddenly of causes other than suicide. They found excess methylation of genes in the suicide brains’ hippocampus, a region critical to memory acquisition and stress response. If the suicide victims had been abused as children, they found, their brains were more methylated. Why can’t your friend “just get over” her upbringing by an angry, distant mother? Why can’t she “just snap out of it”? The reason may well be due to methyl groups that were added in childhood to genes in her brain, thereby handcuffing her mood to feelings of fear and despair. Of course, it is generally not possible to sample the brains of living people. But examining blood samples in humans is routine, and Szyf has gone searching there for markers of epigenetic methylation. Sure enough, in 2011 he reported on a genome-wide analysis of blood samples taken from 40 men who participated in a British study of people born in England in 1958. All the men had been at a socioeconomic extreme, either very rich or very poor, at some point in their lives ranging from early childhood to mid-adulthood.
In all, Szyf analyzed the methylation state of about 20,000 genes. Of these, 6,176 genes varied significantly based on poverty or wealth. Most striking, however, was the finding that genes were more than twice as likely to show methylation changes based on family income during early childhood versus economic status as adults.
Timing, in other words, matters. Your parents winning the lottery or going bankrupt when you’re 2 years old will likely affect the epigenome of your brain, and your resulting emotional tendencies, far more strongly than whatever fortune finds you in middle age. Last year, Szyf and researchers from Yale University published another study of human blood samples, comparing 14 children raised in Russian orphanages with 14 other Russian children raised by their biological parents. They found far more methylation in the orphans’ genes, including many that play an important role in neural communication and brain development and function. “Our study shows that the early stress of separation from a biological parent impacts long-term programming of genome function; this might explain why adopted children may be particularly vulnerable to harsh parenting in terms of their physical and mental health,” said Szyf’s co-author, psychologist Elena Grigorenko of the Child Study Center at Yale. “Parenting adopted children might require much more nurturing care to reverse these changes in genome regulation.” A case study in the epigenetic effects of upbringing in humans can be seen in the life of Szyf’s and Meaney’s onetime collaborator, Frances Champagne. “My mom studied prolactin, a hormone involved in maternal behavior. She was a driving force in encouraging me to go into science,” she recalls. Now a leading figure in the study of maternal influence, Champagne just had her first child, a daughter. And epigenetic research has taught her something not found in the What to Expect books or even her mother’s former lab. “The thing I’ve gained from the work I do is that stress is a big suppressor of maternal behavior,” she says. “We see it in the animal studies, and it’s true in humans. So the best thing you can do is not to worry all the time about whether you’re doing the right thing. Keeping the stress level down is the most important thing. 
And tactile interaction — that’s certainly what the good mother rats are doing with their babies. That sensory input, the touching, is so important for the developing brain.”
The Mark Of Cain
The message that a mother’s love can make all the difference in a child’s life is nothing new. But the ability of epigenetic change to persist across generations remains the subject of debate. Is methylation transmitted directly through the fertilized egg, or is each infant born pure, a methylation virgin, with the attachments of methyl groups slathered on solely by parents after birth? Neuroscientist Eric Nestler of the Icahn School of Medicine at Mount Sinai in New York has been seeking an answer for years. In one study, he exposed male mice to 10 days of bullying by larger, more aggressive mice. At the end of the experiment, the bullied mice were socially withdrawn. To test whether such effects could be transmitted to the next generation, Nestler took another group of bullied mice and bred them with females, but kept them from ever meeting their offspring. Despite having no contact with their depressed fathers, the offspring grew up to be hypersensitive to stress. “It was not a subtle effect; the offspring were dramatically more susceptible to developing signs of depression,” he says. In further testing, Nestler took sperm from defeated males and impregnated females through in vitro fertilization. The offspring did not show most of the behavioral abnormalities, suggesting that epigenetic transmission may not be at the root. Instead, Nestler proposes, “the female might know she had sex with a loser. She knows it’s a tainted male she had sex with, so she cares for her pups differently,” accounting for the results. Despite his findings, no consensus has yet emerged. The latest evidence, published in the Jan. 25 issue of the journal Science, suggests that epigenetic changes in mice are usually erased, but not always. The erasure is imperfect, and sometimes the affected genes may make it through to the next generation, setting the stage for transmission of the altered traits in descendants as well.
The studies keep piling on. One line of research traces memory loss in old age to epigenetic alterations in brain neurons. Another connects post-traumatic stress disorder to methylation of the gene coding for neurotrophic factor, a protein that regulates the growth of neurons in the brain. If it is true that epigenetic changes to genes active in certain regions of the brain underlie our emotional and intellectual lives — our tendency to be calm or fearful, our ability to learn or to forget — then the question arises: Why can’t we just take a drug to rinse away the unwanted methyl groups like a bar of epigenetic Irish Spring? The hunt is on. Giant pharmaceutical and smaller biotech firms are searching for epigenetic compounds to boost learning and memory. It has been lost on no one that epigenetic medications might succeed in treating depression, anxiety and post-traumatic stress disorder where today’s psychiatric drugs have failed. But it is going to be a leap. How could we be sure that epigenetic drugs would scrub clean only the dangerous marks, leaving beneficial — perhaps essential — methyl groups intact? And what if we could create a pill potent enough to wipe clean the epigenetic slate of all that history wrote? If such a pill could free the genes within your brain of the epigenetic detritus left by all the wars, the rapes, the abandonments and cheated childhoods of your ancestors, would you take it?