Phillips’s vast Hall of Shame was assembled pre-coronavirus, but now the pandemic provides especially easy pickings, observable in real time. The most obvious example is perhaps our leader’s promotion of the unapproved drug hydroxychloroquine (or bleach, or sunlight) as a cure for COVID-19, possibly endangering tens of thousands of lives, based on, um, his gut feeling. Then there are the thousands, perhaps millions, of his followers who choose to call this pandemic, which has claimed 100,000 lives as of this writing, a “hoax” in order to, um, own the libs. Meanwhile, just as many are fighting for the right to go unmasked at a time when that may mean death for people in their communities, if not for themselves and their loved ones, and some of these people are attacking others who wear masks because … masks are politically correct? Symbols of a deep-state conspiracy? Who knows?! Add in assorted QAnon disciples and anti-vaxxers, and we seem to be living through a new Golden Age of partisan stupidity.
But while cartoonish science-deniers are an easy target, the danger from ostensibly rational parties may be more insidious — they have their own tendency to underestimate or misunderstand the power of natural events. As Phillips puts it, in what may be the perfect summary of his study, “Ecosystems are ridiculously complex things, and you mess with them at your peril.” Whether through arrogance or ignorance, people, many scientists included, tend not to grasp the profound intricacy and interconnectedness of the natural world, and the risks of messing with parts of it, especially when armed only with incomplete knowledge. Or to quote the original study behind the Dunning-Kruger effect: “Difficulties in recognizing one’s own incompetence lead to inflated self-assessments.” Take exponential growth: if you don’t grasp how it works, you’re not going to appreciate that coronavirus cases can go from 15 to 100,000 quite quickly if you don’t take tangible action to stop them. Take Donald Trump: suppressing medical studies and withholding testing equipment won’t do much to stop a highly infectious virus that has no known cure, even if it provides the illusion of keeping the numbers down.
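The arithmetic behind that jump from 15 cases to 100,000 is worth making concrete. The short sketch below assumes a purely hypothetical three-day doubling time (the real rate varied by place and time; this is an illustration, not an epidemiological claim):

```python
# A minimal sketch of exponential growth. The three-day doubling time
# is a hypothetical figure chosen for illustration only.
DOUBLING_TIME = 3  # days per doubling (assumed)

cases = 15
days = 0
while cases < 100_000:
    cases *= 2
    days += DOUBLING_TIME

print(f"Starting from 15 cases, the count passes 100,000 in about {days} days.")
```

Under that assumption, the trip from 15 cases to more than 100,000 takes under six weeks — which is exactly the intuition that people who shrug at small early numbers are missing.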
From the book’s goofy “censored” title to the Darwin Awards–style sidebar lists like “6 Scientists Who Were Killed by Their Own Science,” you might get the impression that Humans is a jokey, bathroom-style compendium, but that would be a mistake. Phillips is hilarious and breezily readable, but he knows his sh*t, and his book is backed by mountains of research. His parade of screw-ups can be grouped into several categories. Among them is our tendency to spot patterns in nature that don’t actually exist — one I’m particularly fond of is Thomas Friedman’s “theory” that two countries with McDonald’s will never go to war. Another type, alluded to earlier, involves our eagerness to believe that we possess complete information on natural processes that are infinitely more complex than we know — as Phillips puts it, “Hey, what’s the worst that could happen if we divert this river?” There’s our tendency to be irrationally swayed by charismatic personalities or to engage in “groupthink” (the term comes from Irving Janis’s study of the Bay of Pigs fiasco, which was largely the result of government advisors being afraid to speak up about an obviously stupid plan). And there’s the phenomenon of being “fooled by randomness,” a concept first elucidated by the risk scholar Nassim Nicholas Taleb. He described how we often ascribe deeper meaning to what are in fact merely chance events — say, declaring some pundit a visionary because they lucked out and named the winners in three consecutive elections. And we can miss real connections due to time-lags between cause and effect, which can serve to protect malefactors — for instance, the fallout from disastrous environmental or military policies is often not apparent until many years later, when the parties responsible are gone or comfortably retired and the trail hopelessly muddied.
And then there’s the monumental danger of half-baked ideas wielded by the very powerful, a now-familiar phenomenon that has plenty of precedents. For example, among Mao Zedong’s achievements was a campaign to rid China of sparrows because of their tendency to eat grain. There was just one minor problem: with those annoying predators out of the way, the appetite of China’s locusts was unleashed. As Phillips recounts, “Unlike sparrows — who’d eat a bit of grain here and there — the locusts tore through the crops of China in vast, relentless devouring clouds,” a contributing factor to the great famine that decimated the country in the late 1950s, killing more than 15 million people. It’s worth noting that in this case Mao’s intentions were at least partly benevolent, but he was no environmental scientist. And, by virtue of his position, his crackpot idea was transformed into an order, with countrywide poster campaigns and “[j]ubilant sparrow-hating crowds” taking to the streets armed with flyswatters and rifles. This kind of Pied Piper event, in which authoritarians drunk on power incite masses of followers to take reckless actions in the name of patriotism or the collective good, recurs throughout history. And it raises the question: Toward what kind of “sparrow” will our own president push his riled-up base as the election approaches? What will come after “masks”-as-target?
Phillips also cites the example of a wealthy late 19th-century New York executive named Eugene Schieffelin, responsible for a purely well-intentioned intervention that proved to be an outright ecological fiasco. Schieffelin happened to love Shakespeare, and he decided to show that love by introducing into the United States every species of bird mentioned in the Bard’s plays. This included starlings, an alien species that soon spread across the country, becoming one of the most noxious birds in North America. “They destroy crops on a vast scale, tearing through wheat fields and potato fields alike and obliterating grain stores. They are aggressive, chasing native bird species out of their nests.”
And let’s not forget the chemist and inventor Thomas Midgley — an actual scientist — who has been described as a “one-man environmental disaster.” In addition to inventing chlorofluorocarbons during an innocent time when little was known about the ozone layer (oops!), he pioneered the use of lead in gasoline to prevent engine “knocking.” Unfortunately, lead builds up in the air and soil, and in the bodies of animals and humans. It is especially harmful to children, 70 million of whom in the United States were estimated to have toxic levels in their blood between the 1920s and 1970s. Not only were the effects of lead well known at the time, but cheaper, if less profitable, alternatives were readily available. The surgeon general was persuaded to look the other way, thanks to the strong-arm tactics of the fuel and automobile industries, some of them operating via a respectable-sounding front group called the Ethyl Gasoline Corporation.
A class of systemic screw-ups bedevils economic prediction, where the experts seem to get things wrong far more often than they get them right. Eight days before the 1929 market crash, in one of history’s most ill-timed predictions, the celebrated Yale economist Irving Fisher forecast that “stock prices have reached what looks like a permanently high plateau.” And just before the 2008 crash, whose devastation lingered for years, financial analyst Larry Kudlow wrote, “There’s no recession coming. The pessimists were wrong. It’s not going to happen.” The fact that Kudlow is now director of the National Economic Council is surely evidence that power and perceived expertise confer immunity. As a case in point, consider the repeated “predictions” of supply-siders like Kudlow that trillion-dollar tax cuts for the rich are a gift to the poor; each time they’re treated seriously despite being transparently false even to non–social scientists. (If Kudlow had already earned his place in history, he cemented it this year with his staggeringly incorrect predictions about the coronavirus. As of this writing, he still has his job and his status as a sober TV commentator.)
The financial markets may be Ground Zero for the misuse of science, whether due to ignorance or corruption, or both. Many modern risk models, to cite Taleb’s famous example, are the equivalent of inferring that turkeys have nothing to worry about by looking at the 364 days before Thanksgiving. Contemporary high-speed trading systems can be so complex and opaque — maybe intentionally so — that regulators are unable to accurately understand them, much less figure out how to monitor them. Who knows what will happen when these systems inevitably fail? (The widespread failure to foresee the cascading derivative disasters of the 2008 crash does not inspire confidence.) And there’s little evidence that the insanely technical charts and formulas used to back market forecasts today produce better results than Fisher did in 1929. (It’s probably safe to assume that the generals who declared that the 2003 Iraq invasion would be a “cakewalk” had equally impressive-looking charts to support their view.)
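Taleb’s turkey can be rendered in a few lines. What follows is a toy, not any real risk model: a naive forecaster that estimates tomorrow’s danger from the observed frequency of past bad days.

```python
# Toy version of Taleb's turkey problem: 364 observed days, all safe.
# 1 means "fed and safe"; 0 would mean "catastrophe" -- none observed so far.
history = [1] * 364

# The naive model estimates tomorrow's risk as the historical frequency
# of bad days, which is exactly zero right up until Thanksgiving.
estimated_risk = 1 - sum(history) / len(history)

print(f"Estimated risk on day 365: {estimated_risk:.0%}")
```

The model reports zero risk at precisely the moment risk is at its maximum — the point of the parable being that the data were complete and accurate, and the inference was still fatal.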
None of this is to impugn science, and that’s not Phillips’s intention in Humans — quite the contrary. His goal is simply to encourage awareness of our propensity to screw up, even or especially when we’re quite certain we’re right. This tendency may be harmless enough in day-to-day life (when we might, say, have one too many drinks or forget to put our gloves on when it’s snowing) but can be apocalyptic on a larger stage (when we might, say, become careless with the nuclear codes), or in cases where the variables are far too numerous to make a sound decision (when we allow the widespread use of pesticides whose long-term domino effects can’t be foreseen, or give the all-clear sign to “reopen” because coronavirus deaths have dropped to “only” 2,000 or so a day).
It’s important to keep in mind that we generally don’t have to worry about a random nut in a local coffee shop diverting a river, sending fighter planes into Iran, or convincing the government to build policy around economic pseudo-science. Power bestows the ability to co-opt both the language of science and often science itself on a large scale, and the worst catastrophes described in Phillips’s book might have been avoided if they didn’t have power behind them. But at the same time, if you’ve been granted the kind of power that modern governments have, it’s a terrible dereliction of duty to shirk responsibility when natural calamities are looming. A sensible, if un-sexy, rule in these cases is the precautionary principle, which dictates using the utmost caution when the stakes are extremely high and there is the slightest risk of an unmitigated disaster. In the predicament we find ourselves in now, this attitude seems more important than ever. Indeed, in the face of a deadly virus that we still don’t know enough about, and considering how often we’ve gotten things wrong in the past, it hardly seems wise to haphazardly roll the dice while keeping our fingers crossed that things will work out.
Dave Mandl writes about music, technology, and finance. He is also a bass guitarist, host of a radio show at WFMU, and a professional Wall Street computer geek.