ADOLESCENCE WAS BORN a little more than a century ago, in Worcester, Massachusetts. The term has a long history — “adolescens” signified a young man or woman in Latin, and the word first appeared in English in the fifteenth century — but the notion that adolescence is a stage of life all its own is more recent. The father of the idea was Granville Stanley Hall, a scion of the Mayflower, an amateur magician, and the first man to receive a PhD in psychology in the United States.
In 1904, Hall published Adolescence: Its Psychology and Its Relation to Physiology, Anthropology, Sociology, Sex, Crime, Religion and Education. The two volumes ran to more than a thousand pages, and contained more than a small amount of crank. Hall argued that each of us reenacts the stages of human evolution, that primitive people are arrested in childhood, and that women are deficient because they have higher rates of suicide. One contemporary critic wrote, “Chock full of errors, masturbation and Jesus. He is a mad man.”
As Jill Lepore, the Harvard historian and New Yorker staff writer, explains in her latest book, The Mansion of Happiness: A History of Life and Death, Hall’s book nevertheless struck a chord. He placed a great deal of hope in adolescents (Anglo-Saxon ones, at least), and pictured their plastic nature as the crucible of future generations. “Despite our lessening fecundity, our over-schooling, ‘city-fication,’ and spoiling, the affectations we instill and the repressions we practice,” Hall wrote, “they are still the light and hope of the world.” Among well-to-do parents, many of whom shared Hall’s anxieties about shrinking birth rates, urban development and delinquency (thanks to compulsory schooling and child-labor laws, more teenagers than ever before remained dependent on their parents), Hall’s notion that Johnny’s troubles were due to the recapitulation of a primitive state, not to a wicked streak, was welcome. As the New York Times wrote, “Many a puzzled and despairing parent will be glad to learn from this volume that the reason why ‘that boy is so bad’ is not necessarily because he has started a downward road to wickedness and sin. Probably it is only because he has reached the age when it is necessary for him to live through the cave-man epoch of the race.” Evidently, the racialist vein in Hall’s thinking didn’t pose much of a problem to critics. The book sold over twenty-five thousand copies. Adolescence was here to stay.
Aristotle gave us youth, maturity and old age, medievalists threw in childhood, and we’ve since included adolescence: the ages of man have multiplied from three to five. We may soon add another. Two years ago, the New York Times Magazine ran a cover story on “emerging adulthood,” a phase Jeffrey Arnett, a professor of psychology at Clark University, where Hall once taught, wants formally recognized. Two decades ago, at the University of Missouri, Arnett was studying college-age people. He asked them questions like “Do you feel you have reached adulthood?” Their responses — one imagines a slow, Lebowski-like “Nah” — persuaded him that the prevailing model of adult development needed to be revised. As Robin Marantz Henig, the reporter for the Times Magazine, noted, most psychologists divided adulthood into three parts: young adulthood from 20 to 45, middle adulthood from 45 to 65, and late adulthood thereafter. That meant that Arnett fell into the same category as his subjects, and that didn’t make sense. “I was in my early to mid-thirties myself,” Arnett told Henig. “And I remember thinking, They’re not a thing like me.” Arnett was married, with a graduate degree and a career. Many of his subjects were single, still taking classes, and figuring out what they wanted to do. “Young adulthood” was too broad.
Arnett and his subjects were also, as Henig notes, physiologically different. At about the same time that Arnett began to interview young people, the National Institute of Mental Health started a multi-year study of five thousand children ages 3 to 16. What the researchers found surprised them. Unlike the children’s bodies, their brains did not stop growing at 16. The NIMH extended the study to scan the children again at 18, 20 and 22. The scans revealed new developments each time, chiefly in the prefrontal cortex and the cerebellum, areas associated with rationality and emotional control. As Jay Giedd, the study’s director, told Henig, this might explain why 20-somethings seem at loose ends. “The prefrontal part is the part that allows you to control your impulses, come up with a long-range strategy, answer the question ‘What am I going to do with my life?’”
If Arnett succeeds in securing “emerging adulthood” a place in the psychological lexicon, it will raise as many questions as it answers. The way we define the road from birth to death, although we rarely discuss it as such, is enormously important. It dictates how much education we consider necessary, how long we expect parents to support their children, what reproductive rights we recognize, how we care for the elderly — in short, a significant number of our concerns. If emerging adulthood is ratified, we will have changed not just how we live our lives, but how we conceive of life itself. Should that frighten us?
The history of life and death, though its putative subjects haven’t really changed and likely never will, is, surprisingly, a history of many shifts. “Life used to be a circle,” Jill Lepore says at the beginning of The Mansion of Happiness. “Then life lengthened, and the stages of life multiplied.” In addition to demographic changes — fertility rates have fallen in America even as life expectancy has doubled — a handful of unusual characters, like Stanley Hall, set off seismic shifts in what we know about reproduction, what we teach children about sex, how domestic life works, what goals we ascribe to marriage, and what we choose to do with our bodies after death. That a new stage of life, with all the attendant policy changes (the recognition of adolescence is what created middle school), can arise from unique circumstances that affect a limited number of people may surprise us, but, as Lepore reveals, this is the way life has changed, more or less, for centuries.
Lepore modeled her book after life itself, beginning with birth and ending with death, but also after a type of board game. Before the 1960s and the Game of Life, with its plastic cars, blue baby boys and phony stock certificates, there were centuries of board games that simulated life but had more to do with living and dying than accumulating wealth. Jñána Chaupár, a game from ancient India, sends players who fall on a virtue to Vishnu, and those who land on a vice into the mouth of a snake. The history of these games is intriguing — it turns out that Chutes and Ladders descended from Jñána Chaupár, only with less karma; that history is also helpful for interpreting Lepore’s book. Most of the chapters began as essays in the New Yorker, and they’re a bit disjointed if you expect an expansive narrative history. If singular figures like Stanley Hall had such an effect on our thinking, though, what better way to present them than as disruptive rolls in a game of chance?
This isn’t to say that Lepore overlooks context. She scrupulously connects the ideas she discusses to their time and place, and often notes that unheralded thoughts received powerful help from unlooked-for sources. Frederick Taylor, whom some blame for the regimented pace of modern life, became famous not because he created management consulting, but because the prominent public advocate Louis Brandeis admired him. In 1910, Brandeis argued that railroads shouldn’t increase their fees, but instead hire a consultant to root out waste. He claimed Taylor’s methods could save a million dollars a day. “Suddenly,” Lepore writes, “those theretofore obscure [Interstate Commerce Commission] hearings made national news. Brandeis won the case, and Taylor became a household word.”
Something similar apparently happened to Stanley Hall. He arrived at Harvard in 1876, overlapping neatly with William James, the pioneering psychologist, with whom he studied. More importantly, perhaps, he came of age at the end of the Second Great Awakening, a period of enormous religious revival, during which the status of American youths came into special focus. Sylvester Graham, an early nineteenth century reformer best known as the father of the Graham cracker, was also, as Lepore explains, deeply concerned about young Americans. In short, he thought they weren’t eating right, and probably masturbating too much. Graham’s lectures proved enormously popular. Lepore thinks it likely that Hall, born seven years after Graham died, read those lectures in childhood. (Hall’s father, she adds, was a temperance lecturer, and may have even known Graham.) In addition to witnessing profound changes to the way children grew up, Hall inherited an audience primed to worry about adolescents.
Still, it’s hard to deny that a handful of crackpots spurred many of the changes Lepore describes. Frederick Taylor inflated figures and lied to clients, which led to a Congressional investigation, but his management theories profoundly affected the rhythm of daily life, speeding up the transformation of a circular world into a linear one. One of Taylor’s devotees, a woman named Lillian Gilbreth, carried efficiency to the kitchen in the form of home economics. An authority on homemaking with a joint position in Purdue’s Schools of Home Economics and Management, Gilbreth had never, despite bearing the dozen children immortalized in Cheaper by the Dozen, been a homemaker. She had an Irish housekeeper who did all the cooking and cleaning, and she knew how to make one meal: creamed chipped beef. Lepore writes, “Her children called it DVOT: Dog’s Vomit on Toast.”
Or consider Lewis Terman and Paul Popenoe, to whom Lepore turns in her chapter on marriage and its ugly cousin, eugenics. Terman created the Stanford-Binet IQ test; Popenoe, marriage counseling. Those inventions led to all sorts of things, from standardized testing to “The Jerry Springer Show,” some of which we condemn, but many of which we like. The inventors were complicit in some of the worst; Terman transformed a test devised to locate students in need of academic help into a tool for weeding out “defectives.” Popenoe went into counseling to ensure that white couples who scored highly on intelligence tests stayed together long enough to have plenty of children. As Lepore puts it, “Eugenics relied on a colossal misunderstanding of science and a savage misreading of history.” Yet the relics of those savage and colossal fools confront us every day.
Arnett’s proposal hardly seems forced compared to what Terman and Popenoe preached. Cordoning off a corner of adulthood, giving additional resources like education, job training, and health care to people just starting out — that sounds pretty reasonable. Arnett doesn’t list specifics in the Times Magazine article, but Henig, the reporter, names a couple of plans. In 2008, Hillary Clinton floated the idea of “baby bonds” to fund a year of self-discovery for twenty-one-year-olds. Henig points to the Amish tradition of rumspringa, in which young people are allowed to experiment with new modes of living. She also suggests expanding programs like the Fulbright, the Peace Corps, and City Year.
Arnett’s emerging adulthood will probably never “fit” classical stage theory, in which everyone, without exception, passes through every stage. But how many youths around the world bypass adolescence to enter factories? Both phases are largely the product of luxury. As Lepore tells it, “An ordinary life used to look like this: born into a growing family, you help raise your siblings, have the first of your own half dozen or even dozen children soon after you’re grown, and die before your youngest has left home.” Americans have doubled their lifespans over the last two hundred years, and the space between those stages has lengthened accordingly. As baby boomers retire, the ranks of the elderly will grow. Even if psychologists ultimately don’t accept Arnett’s terminology, it’s not a bad time to reassess our diagram of life’s stages.
If any constant emerges from Lepore’s stories, it’s that humans have thought all kinds of things about birth, death and what falls between them. “Most of all,” she writes, “I have come to believe that what people make of the relationship between life and death has got a good deal to do with how they think about the present and the past.” Ignorance of the past can lead someone, like Missouri Representative Todd Akin, to articulate the once widespread but now extreme idea that the uterus can defend itself against “legitimate” rape. (Lepore explains that it was once commonly believed that women had to orgasm to become pregnant; ipso facto, a pregnant woman couldn’t have been raped.) But a careful reading of the past, like the one Lepore presents, provides a measure of equanimity.
At the end of the book, Lepore hints that The Mansion of Happiness arose from a personal tragedy. Some years ago, just as she was about to give birth, a woman who wanted to meet the baby died. (It seems likely this was Lepore’s mother, but Lepore doesn’t say.) Hoping that it might preserve some sense of her, Lepore archived the woman’s papers in a box. A decade later, she opened the box and cried, because the woman she loved wasn’t there. “All that shouted out of that box was my grief,” she writes. Soon after, Lepore flew to Michigan, where she profiled Robert Ettinger, the leader of the cryonics movement, for the New Yorker. There she met a man who, for being the head of such a colorful movement, was surprisingly colorless. Cryonics promises a gleaming future with neither death nor disease — the ultimate fantasy of a linear world — but all Lepore saw was a splotchy old man presiding over a warehouse of bodies frozen in sleeping bags from Walmart.
When Lepore returned home, she and her son took out the trash. In the recycling bin, they placed one of her son’s Halloween accoutrements: a papier-mâché head of Ted Williams, the famous left fielder for the Boston Red Sox, whose head was removed and cryonically frozen. “He was pulped and bleached and made, I suppose, into a newspaper,” Lepore writes, “ashes to ashes—or, at least, paper to paper.” The act carries a whiff of symbolism: learn enough about the history of life and death, and you become accustomed to its endless revolutions.