Wired for Madness? A History

By Michele Pridmore-Brown
July 26, 2015

Madness in Civilization by Andrew Scull

MENTAL DEGENERACY, OR MADNESS, was on the national agenda in America in the early decades of the 20th century. It still is, albeit less dramatically. The common understanding at the time was that mental illness could be weeded out. New theories of germs and genes had created an image and a model of transmission: madness was “a blight,” much like syphilis. Contagious and hereditary, it threatened to bring down the country. President Teddy Roosevelt offered a solution in his 1903 address to the nation: he urged the mentally fit to breed. This, he suggested, would in part swamp out the mentally unfit. He gave medals to women of his type who had five or more children; he himself fathered six. In the ensuing decades, many states went further, passing laws to sterilize those deemed unfit: the “epileptic,” the “feebleminded,” the degenerate insane, not to mention the “masturbatory” and indecorous. Justice Oliver Wendell Holmes Jr. resoundingly applauded in 1927, writing one of the more famous sentences in legal history: “Three generations of imbeciles are enough.” He was referring to the Buck family, whose youngest mother had been sterilized at age 17 because her own mother was “in an asylum” and the family, according to court records, was part of Virginia’s “worthless class of anti-social whites.” In short, eugenics — to the tune of 65,000 sterilizations — was alive and well in America. California was especially zealous: it accounted for 20,000 of them.

When at the end of 1941 America entered World War II, madness mattered for a different reason. Extirpating madness “within” was seen as vital to defeating the enemy “over there.” In World War I, many recruits had succumbed to “combat neurosis” and “shell shock,” what we now call PTSD. Instead of concluding that the horrors of modern warfare were the problem, the military decided that the shell-shocked themselves were the problem: they had been unfit from the outset because they were “inherently” degenerate. Screening recruits, it was thought, would take care of the problem, and so 1.75 million recruits were duly weeded out. Only the certifiably sane were sent to the battlefields of World War II. The result, however, was telling: the screened recruits were no more likely to hang onto their sanity than the unscreened recruits of World War I. Screening for madness had been an abject failure.


Madness, or so-called “degeneracy,” can be interpreted in a variety of ways depending on the time period and who’s in charge of categories; it can include disorders of mood, sense-making, or sociality. It alienates those deemed ill from shared reality — from what we think of as common sense. Usually, it covers mania and depression, delusions and hallucinations, sometimes paranoia, and sometimes, depending on the era, epilepsy, neurosis, addiction, even indecorum. But whatever the definition, the practical problem with trying to rid populations of madness is that humans seem wired for madness in a general sort of way. “Screening” doesn’t shore up sanity. Even the American military had to concede at the end of World War II that every man has a breaking point. For some it’s more likely to occur in the context of, say, bullying; for others in the context of neglect, war, or other trauma. Another complication with defining or screening for madness is that some people hover on its edges — they’d be unlikely to pass a screening test, and yet they also prosper in spectacular fashion. They may even make history.

A handful of evolutionists have argued that a gene called “Neuregulin 1” is the reason why, in any human population, some fraction will go insane — with about 1 percent, for instance, developing outright schizophrenia. Five to seven million years ago, the theory goes, this particular gene, implicated in both creativity and psychosis, accounted for our species’ divergence from other apes. To be sure, many genes and gene-environment interactions are undoubtedly involved, but the point is that some forms of human madness must have evolved in tandem with our intelligence — for instance, with our ability to create umpteen uses for a brick or a stick, or to ask “what if” (e.g., what if I break the rules) and ruminate endlessly on “if only” (if only I could break the laws of science or sociality or decorum). In her book on the late John Nash, the Nobel Laureate mathematician whose equilibrium concept reshaped game theory, Sylvia Nasar plumbed the notion that a mind on the edge of madness is a “beautiful mind.” Mad minds can be deviantly generative minds. They can, sometimes, lay down paths others follow. It may be that they see the crystalline lines of the forest where the rest of us see only trees. Or they experience the trees with a more delicate sensorium. Or they see alter-forests. Or hear them! The latest argument — rooted in genetic analysis and neuroimaging as well as old-style observation — is that great achievement and madness tend to occur in the same families. Oliver Sacks, James Joyce, Einstein, and, yes, even Teddy Roosevelt — all had a sibling or child who was schizophrenic or manic-depressive.

This way of seeing madness as visionary is why some tend to fetishize it. The American writer Jack Kerouac for instance thought he’d invented anew the very old idea that to be truly alive — indeed, to be at all interesting to oneself and one’s peers — is to flirt with madness: it is to teeter in full view, to be manically, incandescently “on the road.” We know our brains evolved as social organs; we also know that, to develop normally, they require immersion in a soupy mix made up of other minds. They require being seen by other minds. But the catch is that, while our attunement to others may have us burning incandescently, it also does the opposite. In short: “hell is other people,” because we need them so much. The social gets “under our skins” — and this is why, according to a recent batch of social scientists, discrimination or immigrant status or being very poor increases the risk of madness. Whereas some of us are amplified by the presence of others, becoming seers and leaders, others are dimmed out and a few the outright casualties — the “canaries in the coal mine” of human sociality.

Andrew Scull is a sociologist of science at UC San Diego and the preeminent living historian of madness. In his long career, he has written about madness in over a dozen books. His newest, a candidate for the Pulitzer, is surely his tour de force. It lives up to its capacious title, Madness in Civilization: A Cultural History of Insanity from the Bible to Freud, from the Madhouse to Modern Medicine. He covers two millennia of history while developing three strands of argument: the relationship between madness and cultural production in the arts (the generative fetishized version discussed above); the grounding stories told through the ages about madness, whether under the guise, broadly put, of “religion” or “science”; and the practices and institutions used to contain or treat the mad. His brilliance lies in seamlessly weaving these strands together and making the whole somehow cohere. He is also a pithily witty raconteur with a sharp knife; with surgical precision, for instance, he takes down the likes of Michel Foucault and Erving Goffman for getting historical details and chronology wrong.

Before getting to this chronology and the takedowns though, it’s worth noting his 120 color images, some capturing the other side: what it’s like to be mad or deemed mad. They depict Don Quixote charging a flock of sheep in an 1855 painting by Honoré Daumier. They depict the mad in chains, treated like cattle, electrocuted, lobotomized against their will, their faces distorted by fear and alienation, loneliness and suffering. There’s a Van Gogh painting of his asylum doctor, who looks so haunted that we can be sure he will soon be joining the inmates. Amongst such images, however, one photograph is startling for an entirely different reason. Other reviewers have noted it as well — it pops off the page because it depicts what looks like civilized wholesomeness (that is, if one ignores a few uniforms!). One can’t help wondering what it is doing here amongst these other images. In it, a group of mostly casually dressed men and women are smiling and enjoying each other’s company, and a middle-aged man in the foreground with twinkly good-natured eyes is playing the accordion. Their expressions are genial and healthy. One then reads Scull’s caption: “The staff at Hadamar, c. 1940-42, a psychiatric hospital used in the T-4 euthanasia programme, relaxed and happy after a hard day at work disposing of those the Nazis considered ‘unworthy of living.’” Like Justice Holmes in the United States, they’re drawing lines, rolling up their sleeves to cut their own unfit — 70,000 in a year and a half — from the gene pool. The image is one of Scull’s comments on collective madness; we are of course meant to recognize ourselves in what’s been called “that dark mirror.” We’re meant to ask about the actual location of madness. Clearly, there are no tidy answers.

Indeed, Scull mocks the sanity line. His mockery, sometimes subtle, sometimes explicit, is a leitmotif in his book. Madness, he shows us, is deep within civilization — deep within us, not “over there.” When madness is collective, it’s usually not termed madness. When it’s individual, it alienates us from our peers’ reality. As for our civilized compassion, his book tacitly mocks it as well. Take our current attitudes from this side of the sanity line. The woman on the bus gibbering to herself horrifies us — we sit elsewhere and pretend she doesn’t exist. We also determinedly overlook the fact that the largest concentration of the mentally ill is now housed in jails, subject to electronic stun guns and chemical dousing. And, we concertedly avoid looking at “the disappeared” who fall out of classrooms or workplaces never to return. But, as Parisian neurologist and impresario Jean-Martin Charcot understood over 100 years ago, scantily-dressed young women having hysteric fits on a public stage are mesmerizing: the “demi-fous,” as Scull points out, are an enormous money-maker. They still are. Indeed, throughout recorded history, staged madness has been regarded as chock-full of dramatic possibilities. What could be more dramatic, writes Scull, than the “slaughter, violence and depravity” of a world unhinged by madness, “of moral codes dissolved” — of Tamora, for instance, one of Shakespeare’s characters in Titus Andronicus, unwittingly feasting on her sons. Scull, an opera fan, suggests opera too requires madness for its life’s blood. At a more bathetic and contemporary level, Hollywood and our tabloid or now BuzzFeed culture clearly do as well. As for the pretensions and gullibility of the cultivated, Scull has a field day here; there’s nothing like the hyper-educated to fall for stage-tricks, especially when “science” is invoked.
In the Age of Reason, they fell for the impresario Franz Anton Mesmer, whose “touch” could restore “hysterics” through the new science of “animal magnetism.” They fell for the 19th-century phrenologists (“bumpologists” who diagnosed their patients by reading the bumps on their heads). More dangerously, in the early 20th century, the securely-pedigreed like Holmes fell for the eugenicists. And more comically, circa 1970, they fell for the psychoanalyst Jacques Lacan’s “paroles” (he could, Scull tells us, get away with whispering a single word to a patient in his waiting room, charging for the hour). Since then they, or rather we, have been falling for other diagnoses and fixes for mental illness, dressed up as “science” — but it takes Scull’s romp through history to understand those fixes and our gullibility.


The Ancient World: Madness, Demons, and Humors

Scull targets and has a nose for folly. But with Madness in Civilization, he’s mostly the careful historian in command of his narrative arc: how madness and civilization have played out for generations. Starting with the Ancient world, he tells us that it “bequeathed” to subsequent eras both “natural and supernatural accounts of madness.” These shape our present. First, the supernatural accounts: madness could easily be seen as “sacred” — the result of otherworldly “possession.” Think of epilepsy, for instance, which as Scull points out is no longer regarded as madness, but certainly was then. It was once viewed as punishment for sinning, for offending a god or gods. The Hebrew King Saul is punished in the Bible for failing to carry out God’s instructions to the letter with respect to killing every last “Amalekite.” Cursed by a vengeful God for his disobedience, he froths, raves, and all the rest. The Babylonian King Nebuchadnezzar is cursed for boasting — and reduced to frothing on all fours.

But, and here we’re still in the Bible but could be in any number of times and places, “prophets who exhibited some of the attributes [of madness] might [also] be divinely inspired.” Now things get tricky. “To behave like a prophet” is “to rave.” Presumably it is to rave with compelling charisma, in which case hallucinations become revelations, an avenue to knowledge: a different way of “seeing.” Scull has much to say about this strand, but he’s also circumspect. Clearly, the individual ecstasies and raptures of Christian visionaries in later ages fit into this conceptual framework. So, clearly, do raves of other sorts.

But madness could also represent not punishment or ecstatic perception, but outright demonic possession. Figuring out how to “cast out demons” could then be tantamount to restoring the mad to their senses. This became the backbone of Christian faith, or at least of its healing strand. Jesus heals the sick by casting out demons and the news travels far and wide. He restores the mad to their senses, and a line of apostles and saints does the same in later ages. Healing the mad was then about channeling the divine (the Quran, by contrast, is silent on questions of health and disease). The “relics” of saints, housed in shrines, would go on to generate a whole culture in medieval Europe, not to mention destination sites — a story Scull plumbs as well.

Such supernatural accounts have parallels in secular stories, it turns out, which are equally important for Scull’s history. Eventually, priestly types had to vie for clientele with descendants of Hippocrates and Galen, who diagnosed flows and blockages of earthy sorts, and then prescribed vomits, purges, and dietary tweaks rather than prayers, raves, and shrine visitations. Health in this scenario hinged on the internal “balance” of humors. Balance was key across many traditions in fact, and Scull alludes to them all, but the main takeaway is that madness was in a word “imbalance” — in the Greek account, amongst four well-defined humors. An excess of black bile tipped the body into “melancholia.” An excess of yellow bile tipped it into anger. And so on.


Early Modernity: Madhouses and the Rule of Reason

Once Greek texts, including those describing the humoral model, were rediscovered in Europe, “modern” ruminatory discontent found its grammar; it was understood as an overexpression of black bile, as “Melancholia” in short. Not to be missed amongst Scull’s images: Albrecht Dürer’s famous engraving “Melencolia I” (1514), depicting the artist himself (a creative genius, presumably) “in the grip of melancholic madness.” Scull bravely takes on Robert Burton’s daunting 1,500-page tome, The Anatomy of Melancholy (1621), on “melancholizing.” He dissects Don Quixote (1605) madly but meaningfully tilting at windmills, reminding us that the novel was “a genre founded in madness.” Don Quixote is the existential hero whose hallucinations are “mad” rather than, dare one say, revelatory, because they’re unshared. He dies when he himself stops believing in them. Already a century earlier, in The Praise of Folly (1511), Erasmus had opined that piety is a species of folly. Folly is madness, but piety is therapeutic and “not to believe” is madness too. Scull sorts out the ironies. As for Shakespeare, he naturalized madness as an indelible part of the human condition. Hamlet is the quintessential melancholizing “canary.” King Lear is made mad by grief, and Othello by jealous paranoia, and so on. In other words, Shakespeare understood that it was the very nature of the social soup, not supernatural forces, that drove them insane. Scull spends dozens of pages on textual analysis and on melancholic madness in particular — as existential (part of being human) and as an emerging form of cultural capital, too: lite-madness as the price of genius or high sensibility.

But what about the more abject depredations of madness among the hoi polloi? Melancholic geniuses aside, what did a self-consciously emerging “modern” world do with those who could not take care of themselves? This is equally Scull’s forte. The Islamic world had created small charity hospitals; the early modern European world was trying to catch up in this as in so much else. Scull makes an intriguing argument here that life imitated art in England in particular: the literary imagination infused the actual geography of madness. The raving “Bedlam mad” was, he shows us, “a set piece” in plays of the early 17th century, even though the original Bedlam hospital, or “Bethlehem hospital” as it was called, was initially no more than a tiny charity hospital. Founded in 1247 on the outskirts of London, by 1500 it had acquired a reputation for caring for the mad — but in actual fact it took in all manner of the sick, including just a handful of the mad. For the most part, Scull tells us, the mad remained in their families or were left to roam much as they’d always been left, part of local color; or they were confined in jails if violent. The point is that the idea of “Bedlam” — of the mad grouped together — pricked the imagination before it existed in the form of institutions dedicated to the mad.

To be sure, much later, just before the turn of the 18th century, Bedlam Hospital was rebuilt as a vast, opulent edifice designed, as Scull tells us, to show off London’s charity toward the mad, and “the rule of reason” after the turmoil of the English Revolution. But a great deal happened in the interim. The imagination, leavened by 17th-century mercantile values, gradually turned Bedlam into an actual madhouse and built other ones across Europe. In one chapter, Scull describes how a new set of pragmatic values inspired a more “skeptical” attitude toward those who could not labor — and then over time, as rules of polite society evolved, toward those who could not follow social norms. Self-regarding nations — the industrious Dutch were first — taxed the peasantry in order “to sweep” the poor and mad from the streets.

Scull chides Foucault for being misled by an image — Hieronymus Bosch’s circa-1500 painting of the Ship of Fools, which Foucault actually thought (and here Scull is chortling) represented something real at the time! Nope, the mad collectively in search of their lost reason was a conceit, not a historical reality. But he especially attacks Foucault for overstating the degree to which the later “Great Confinement” of the 19th century represented the actual state of affairs in earlier centuries, something that becomes patently obvious, as he puts it (with biting sarcasm this time), when one “looks beyond the French capital.” To be sure, in an age pretending to decorum, those with the means were willing and eager to pay to put mad relations out of sight. And then madhouses, dramatic buildings with their high walls and barred windows, viscerally reminded passersby of their ominous custodial function, further titillating their imaginations.

In short, the practice of locking up one’s troublesome relations migrated back and forth between reality and literary representation, so the historian has to be careful about sorting out the direction of causality. The Marquis de Sade’s mother-in-law had him locked up — thanks to a King’s order, or lettre de cachet (no trial needed!) — but, in post-revolutionary literary representation, what sold was “virtuous maidens” confined amongst lunatics by scheming relations. By the 19th century, a few husbands put away troublesome wives but, in the known cases, wives exacted revenge by going public in the newspapers!


The Discovery of Nerves and the Invention of Asylums

There’s yet another important story here — even more basic or fundamental than the madhouses, or the melancholic geniuses and the literary imagination, and in fact impacting all of these: the story of “science,” and how mad and sane bodies came to be understood in the Age of Reason. “Humors” lasted until the early 19th century in some places. In the late 17th century, an English physician named Thomas Willis took it upon himself to dissect bodies; he discovered nerves and brains at a moment when the news could travel. “Physick,” as it was called, changed as a result. Neurology emerged, and it sought to “un-jangle” nerves rather than “balance” humors. Partly as a result, shocking or stupefying the madman or madwoman to unjangle jangled nerves came into vogue as a scientifically validated practice rather than just an expedient one.

Perhaps nothing illustrates this better than the fact that, when King George III, the king who famously lost both America and his reason, went mad in 1788 — initially shaking hands with a tree he thought was Frederick the Great, and then raving in more unruly fashion — he was beaten, starved, put in a strait waistcoat and menaced in what was considered the most enlightened scientific manner. He went mad from the top of the social pile, but this did not exempt him from such treatment. Benjamin Rush would develop “the tranquilizer” based on the same scientific principles, and Erasmus Darwin added a swinging chair to the mix. Scull’s graphically gory descriptions are hard to read but they do ease us into the follies of the 20th century.

We’re not there yet, though. The science of nerves and their redress was not straightforward. When the matter of George III’s reason still hung in the balance, he desperately wanted to be “nervous,” and there’s a long story there that Scull plumbs as well. Outright madness was one thing, nervousness quite another. One was socially catastrophic, the other potentially a badge of honor in the manner of Melancholia if played right. Scull recounts how, a few decades after Willis’s discovery, a Scottish diet doctor named George Cheyne coined the term “English Malady” for nervousness in 1733, and declared it the malady of superior sensibility and, as such, peculiar to the English. Forthwith, he became famous and rich. As Scull points out, the social and geographic location of this new malady, or lite-madness, was all-important. But, just as important, the malady didn’t stand a chance unless harsh treatment — being whipped and tortured — was remade into a more “gentle Physick” suitable to delicate nerves. Cheyne and others provided that as well, and so the Melancholic of the previous era became safely nervous (the lite version of madness) by virtue of being hyper-civilized. Even the philosopher and critic David Hume fell for the conceit (as did virtually all the English luminaries of the Age of Reason, including satirists like Jonathan Swift). Here’s Scull quoting Hume: “‘We Hypochondriacks may,’ boasted Hume, swallowing Cheyne’s bait whole, ‘console ourselves in the hour of gloomy distress, by thinking that our suffering marks our superiority.’” Hypochondria was nervousness, and various nationalities developed their own flavors.

The Americans reached this pinnacle of civilization only in the 19th century, calling their version neurasthenia. Scull points out that diagnoses “creep” or leap from body to body at that moment when they become profitable to the doctor and flattering to the patient. In a rapidly industrializing nation, “the price of being civilized” was thus couched in self-congratulatory metaphors like “over-running one’s battery,” or “wear and tear.” Sanatoriums proliferated for the fashionably “depleted” — for the civilized- or lite-mad (not the Bedlam mad). The Kellogg brothers of cereal fame would later profit from the craze with the Battle Creek Sanitarium, where affluent patients could address flows and blockages in newly chic language: they could “recharge their batteries.”

But, what about the unseemly Bedlam mad? Scull explains that national sensibilities would go on to affect them as well: “an elite with nerves and sensibilities needed to display its sensitivities” toward the Bedlam mad “in new ways.” “Madhouses” morphed into “asylums” in what is now an oft-told tale. Where the earlier period had been steeped in mercantile values, 19th-century “reformers” were motivated by “moral” ones. The English passed legislation in 1845 for the construction of asylums at public expense, which, as Scull tells us, was what truly marked the beginning of Foucault’s “Great Confinement” across much of the world. Queen Victoria’s physician would call this confinement “the most blessed manifestation of true civilization that the world can present” — in other words, a symbol of what we might as well call manifest virtue. In short, nerves, once rebranded as sites of civilization, demanded the transformation of violent cures across the board, not just for the nervous elite. Unruly inmates, previously subject to violent physick, were now “patients” who could, in theory, be gently domesticated. Madness became “mental illness.” A new breed of physicians took madness seriously — and took themselves seriously; they wrote articles in journals, they met, they created sub-categories of madness, and they formed what we now think of as the beginning of the psychiatric profession (the word was coined in Germany in 1808). Foucault famously dismissed “moral treatment” as a form of gigantic moral imprisonment, but Scull demurs: there’s a grain of truth, he concedes, but a patient might well prefer enforced four o’clock tea and cake (i.e., “domestication”) to being broken in like a horse.

Alas, though, moral treatment appears to have been no more curative than the whiplashes of the madhouse era — and that is essentially what lands us in the 20th century.


Blame the Patient! Medical and Mental Therapeutics Circa 1900

The treatment of the mad may well be the most direct channel into civilization’s heart of darkness. A profession had been created in the 19th century to heal the mad, but the fact that cures dramatically failed to materialize set the stage for eugenics and much else. In all sorts of ways of course, civilization foundered, and Scull describes it all. The mad in their asylums stayed obdurately mad. Psychiatrists lost credibility. Eventually, morale dropped and so did public funding, and so, it would seem, did public interest in performing manifest virtue. In short, “moral treatment” was an intermezzo. In the end, the solution was all-too-obvious: blame the patient!

The theory of degeneration — based on new notions of heredity, genes, and germs, and tendentious interpretations of Darwin and Lamarck — provided a serendipitous explanation for why the Bedlam mad could not be cured. They were mad by virtue of heredity. They were the “degenerate” mad. The sins of the fathers were visited on ensuing generations in the form of mental illness. The problem was thus not the impotence of the profession but the nature of madness itself. This made for a whole slew of new literary tropes, but that’s another matter.

I’ve left aside the historical zigzags and complications of various national histories, including the rich post-revolutionary French one (the asylum work of the wise and humane Philippe Pinel, for instance). All of this Scull is careful to include without ever making his 450-page book seem long. But the overarching point holds — namely, that it makes grim historical sense that asylums began to burst their bounds around 1900. Buildings sprouted across the globe and swelled to house as many as 12,000 inmates. They became “total” institutions whose therapeutic, asylum-like function was increasingly secondary to their custodial one. Families had ever more reason to fear the social stigma of harboring blight within their family tree. The self-regarding carefully hid the existence of mad relations, made up stories of travels in faraway places. Self-contained institutions away from home were essential to keeping up appearances. This is, apparently, still the case in Japan, according to Scull.

Because of its therapeutic impotence, the profession, in Germany initially, turned to systematic categorization and classification: Alzheimer’s (1906) and end-stage syphilis were identified through brain dissection. The dementias and schizophrenia were named. This would all prove consequential: the nervous and sensitive were, as per usual, put safely among the hyper-civilized. The Bedlam or degenerate mad were put in another bin: they became “tainted persons,” “lepers,” “moral refuse” — the sorts of people Presidents Roosevelt and Wilson, as well as Churchill and Justice Holmes, rhetorically and through writ tried to sterilize, and the Nazis to euthanize. They were “incurables.”

Scull insists we bear witness to the horrors of this period — the return of violent treatment with the imprimatur, again, of “science.” He insists we fully inhabit the depravities of men on this side of the fitness line: those who zealously rooted out “germs” from deep within the body’s cavities (as often as not killing the patient in the process), who touted insulin shock treatments and malarial “therapies” that brought patients to death’s door, or who performed lobotomies, which took the old science of phrenology to newly absurd heights. All of these therapies put us well into the 20th century. Materialism went grimly nuts. With a kind of scathing precision, Scull describes neurologist Walter Freeman’s “evangelical zeal” in developing “precision lobotomy,” which was performed with an ice pick introduced under the eyelid, and a mallet to break through the eye socket. In Scull’s words, Freeman “later boasted that in 20 minutes he could teach any damn fool to perform a lobotomy, even a psychiatrist.” Emboldened by their new technologies, doctors became ever more sure of themselves.

Here, in the 20th century and on the doorstep of our present, Scull slows down. He manages to make America’s love affair with Freud seem fresh in a chapter entitled “A Meaningful Interlude,” where he shows how the newly theorized lethality of one’s intimates — and of attachment in general — offered endless dramatic possibilities (on the screen, on the stage, in child-rearing manuals, and everywhere else), and a new business model (the lucrative office-based practice for the lite-mad). But, perhaps the most important Freudian insight for Scull’s purposes is that, while the robust leaders of civilization might well be trying to draw lines, and the materialist brand of doctors might well be creating categories and pretending to excise blight from brains and bodily cavities, Freud insisted that shoring up sanity was as futile as tilting at windmills. Madness was within civilization. We’re all on the spectrum — we’re all degenerates, the id perched like a gargoyle on our individual and collective shoulders. Freud had his flaws and excesses, and his own cloying vanity, but Scull is partial to him for getting this much right.


The Latest Turn of the Bio-Cultural Screw: The Death of the Asylum and the Triumph of Pills!

Our tales, the ones we tell about ourselves and about “balance” (now conceived in terms of neurotransmitters rather than humors or jangled nerves), are for Scull symbolically rooted in a pill — the first real psychiatric medication, discovered in 1951 and marketed as Thorazine in America in 1954. That pill, he argues, revived the psychiatric profession after the wars — and ushered in the bio-psychiatric paradigm in which we find ourselves today.

Scull makes perhaps his most critically important argument here at the end of his book. This was also the era — the late 1950s and ’60s — when asylums were emptied, starting in America and spreading elsewhere. Scholars like to say the entry of pills and the emptying of institutions were correlated: that pills cured the insane just enough to eject them from their prisons. “Not so fast,” Scull says. They were already emptying. The politically tidy story — one that maintains a façade of manifest virtue, as it were — hinges on the idea that Thorazine provided a cure for the ravings of mania and schizophrenia. To be sure, it stupefied the patient into lethargy. Cure the patient it most certainly did not. What pills did, says Scull, was change the cultural climate and revolutionize the practice of psychiatry. In point of fact, he argues, social policy designed to empty institutions was already under way — and this had everything to do with the bottom line, and nothing to do with performing virtue. For individual states, it was a question of “phase out before going bankrupt.” As a result, a 100-year experiment that had begun in the mid-19th century came to an end. In the United States, the passage of federal public assistance programs allowed states to transfer their charges off their own books. Scull argues that gullible scholars, including his fellow sociologists (he’s looking at the late Erving Goffman here), celebrated the release of the institutionalized into the community as a great step forward. Others rather conveniently and even more stupidly stepped in to argue that the insane were in fact the super-sane: a nice conceit in the 1960s and after, but hardly consonant with these patients’ inability to care for themselves. The problem was that no alternatives were set up for handling the seriously mentally ill. The reform was, Scull alleges, not remotely beneficent.

The most “unwanted” segment of society was, in Scull’s phrasing, left “to decompose” — many in jails, with the largest concentration of the mentally ill currently housed in the Los Angeles County Jail. As he puts it, we’ve come full circle: 19th-century reformers were shocked by the confinement of the mad in what were essentially jails: madhouses. There they are again.

But the indignant Scull has other targets as well in the last part of his book: the pills themselves, their culture, and the profession at large, in bed with pharmaceutical companies. The marketers of Thorazine espied a market beyond schizophrenics: they even dubbed it “mother’s little helper.” In the words of the Rolling Stones song, the little yellow pill “helped her cope with her busy dying day” as a bored housewife. Need one say more? Valium followed. Valium morphed into Prozac in the 1990s in another oft-told tale. Psycho-pharmaceuticals, as Peter Kramer of Listening to Prozac fame put it, have become “a whole climate of opinion under which we conduct our different lives.” They replaced Freud. Enter Scull’s other target: psychiatrists who scrambled to come up with coherent psychiatric diagnoses, and then scrambled to stabilize their labels, and scrambled some more to be seen as credible and give coherence to madness. The invented categories of the late 19th century became something real. The absurdities of the Diagnostic and Statistical Manual of Mental Disorders (DSM), the handbook of our era for diagnosing mental illness, are legendary — from its “tick the boxes” approach in its third edition (if you tick off a totally arbitrary but set number of boxes, then you are officially depressed or compulsive or whatever) to its pretense of describing real neuro-states in the latest edition. The DSM, Scull argues, is about nothing more than politics and profits, “horse-trading,” and of course, professional vanity. He has written about it for the Los Angeles Review of Books. Medical insurance now requires DSM diagnoses, so voilà, the DSM is now a bible, although built on nothing more than constructs that say far more about our cultural preoccupations than about madness itself.

Scull’s point: what he terms bio-babble is as un-scientific as psycho-babble. “Your dopamine made you do x,” “your serotonin made you think y,” isn’t much better than thinking about bodies by means of the four “humours.” But now “the chronicity” and incurability of mental illness make for the heyday of pharmaceutical companies. Evidence for inefficacy is, Scull argues, routinely suppressed. Meanwhile, anti-psychotics and anti-depressants are the most profitable drugs on the planet. A drug called “Abilify” (an antipsychotic marketed as an add-on anti-depressant) is the latest drug of choice; one might add that it is fittingly titled for our “lean in” moment, just as “The English Malady” or “American neurasthenia” were fittingly titled for their eras. One might also add that ADD (attention deficit disorder) diagnoses are endemic in the places where “leaning in” is a cultural virtue.


Like “poor folks waiting for Godot,” Scull thinks we are still waiting for those “long-rumored neuro-pathological causes of mental illness to surface.” In other words, biology is far too entangled with the social soup and how we’re inserted in systems of meaning to yield a univocal causal grammar. The social gets under our skins, and the social is complex. Most likely, poverty will remain the biggest culprit in degenerative madness. But so will the accouterments of “civilization” (e.g., too-clean guts, the sperm of the aged, the couplings of engineers with each other, stressed wombs, domesticated cat feces, to name a ragbag of recently scientized culprits). In the end, the biggest obstacle to divesting ourselves of madness may be that, if we take it away, then we’ll be merely apes or robots, and so rather boring; we’ll lose the generative deviational bits and bytes of ourselves that make us mostly wilt, but occasionally soar.




Michele Pridmore-Brown is a scholar in History of Science at UC Berkeley and the Science Editor at the Los Angeles Review of Books.


