Thinking, Fast and Slow by Daniel Kahneman
WHAT TO DO? It seems as if the human race is ready to throw itself over a cliff. Or, rather, to be thrown by the technology it has itself created. We run the risk of going extinct, and the irony is that we will have done it to ourselves. The ‘smarty pants’ brain that created advanced weapons, complex global economics, vaccines and antibiotics that save lives, universities, YouTube, and public parks (not to mention Bach and baseball) is routinely bossed around by the brain that shoots from the hip, makes often terrible decisions, and reacts more to fear and greed than to reason.
Climate change, nuclear weapons, deadly viruses, crashing markets, poverty and intolerance (to name a few) do not lend themselves to the kinds of solutions that this bossy, ancient brain, which runs most of our lives, can even understand. And the faster our lives go (and the speedier our technologies), the worse things get, because the smarty pants brain is also the slow brain; often, things just happen too fast for the slow brain to get a word in edgewise.
No one in their right mind would deliberately create the means of their own extinction, but that’s what we seem to be doing. The only conclusion is that we’re not in our right minds — which appears to be true. The two books considered in this review may not have an obvious relationship. Fred Guterl’s The Fate of the Species: Why the Human Race May Cause Its Own Extinction and How We Can Stop It tells a compelling if disturbing narrative of what went wrong, with great stories, clear explanations and just enough optimism to think we might make it after all. But his book, by design I think, doesn’t deal with the biggest danger of all: the very nature of human thought. Daniel Kahneman’s Thinking, Fast and Slow deals with the very nature of thought, and it just may be the most important book I've read in many years. Kahneman, a Nobel Laureate in Economics, offers potential solutions that actually might work. In tandem, the books provide a useful map of where the dragons lie (almost everywhere, alas) and also potential paths to safety.
So just how might the human race be rushing toward extinction? Guterl, the executive editor of Scientific American, doesn’t deal much with the human mind. He also doesn’t focus on the tens of thousands of nuclear weapons on hair-trigger alert or other non-cyber military threats. He argues that commercial interests may be harder to counter than armies, because they’re so much a part of our everyday lives — and oh so deeply seductive. High on commerce, we create and consume and regurgitate so much stuff that we’ve altered the atmosphere, the oceans, and everything in them; we’re causing mass extinctions of species at an alarming rate — species we depend on for survival. (If it’s any consolation, the rest of life wouldn’t miss us much at all; if humans disappeared from the biosphere, it’s unlikely, according to biologist E.O. Wilson, that any insect species at all would go extinct — with the exception of a few forms of lice.)
Our bloated population and overcrowding, combined with agricultural practices and good old evolution (we didn’t invent that, at least), have resulted in a situation where viruses, of all things, may be the single biggest threat to our survival as a species. To take one example, Bird Flu — an innocuous-sounding bug that became supercharged in crowded Asian poultry farms — is now in the human population, though so far just barely. In birds, the flu can have a mortality rate approaching 100 percent, turning organs into “blackish, gelatinous goo,” rather like Ebola. In our current interconnected world, a serious flu pandemic would make the Black Death seem mild by comparison.
Guterl often shares his own awe at the ingenuity of both the people who chase down these bugs, and the bugs themselves (we are using “bugs” loosely, of course). The virus that causes smallpox, he writes, “may be the most awe-inspiring human pathogen,” a “beast” so well protected by a “fearsome shell of protein” that it “effectively has no expiration date.” We can create vaccines, but they become obsolete before they’re needed because the viruses evolve too fast for the drugs to catch up. Stockpiling of vaccines reminds Guterl of children who stack stuffed animals on their beds “to keep the ghosts away.” It may make us feel better, but it doesn’t do much. And not only is it relatively easy to turn smallpox into an even deadlier disease (and disseminate it fast and widely), it’s getting alarmingly easier to design killer viruses from scratch.
Then there’s climate change, which may already be at a point of no return. As areas of the globe whip-saw between flood and drought (and thus wildfire), the extremes can reinforce each other, tipping the whole system into chaos. Of course, the Earth has experienced extreme climate changes before, including the one that occurred some 2.4 billion years ago, when mutant algae started producing a “poison” that killed much of the life on the planet. That “poison gas” was oxygen, and it gave rise, eventually, to us. Extinction is common on this Earth of ours.
If the bugs and heat don’t do us in, we have our marvelous machines — computers that open up the world, and also leave us vulnerable to mischief (and accidents) on unthinkable scales. Take the infamous computer virus Stuxnet, probably created by the CIA or Israel’s Mossad. Aimed at Iran’s uranium centrifuges, it “had the force of a military airstrike.”
No one really understands the sudden crashes that have melted down markets, and the fragility of the financial system is only the start. Take down the power grid, for example, and pretty soon we’ve got no water, gas, food, medicine, or communication. No bombs required. Simply ones and zeroes strung together by ingenious human beings. And no one seems to know what to do about this.
Kahneman’s book has no such sense of urgency, but his insights about how we think probably offer the only real way out of the mess we’ve created. It’s also a great deal of fun, if you don’t mind learning how badly your mind misleads you. Kahneman offers real solutions, including new ways for us to talk to each other, which are so desperately needed in a time when rant and deception seem to be the major coin of public discourse.
Described by Steven Pinker as one of the most influential psychologists in history, Kahneman is also a founder, along with his late collaborator Amos Tversky, of behavioral economics. In his book, he covers decades of research, much of it his own, in a charming, kindly and mostly accessible manner. In one sense, there’s nothing new here: psychologists, economists and neuroscientists have long known that we don’t behave like the rational actors we believe ourselves to be. Most of the time, our conscious mind doesn’t have a clue as to why we do things — though it’s very good at making up stories after the fact that explain (and excuse) almost any behavior.
What’s new is the growing scientific understanding of the oft-tortured relationship between the older parts of our brains and the complex societies created by the newer parts. Scientists used to think that it was emotion that clouded rational thinking. Now we know that emotion is essential to thinking. Our errors are based not on irrationality but on the very “design of the machinery of cognition,” he writes.
Essentially, we have two different thought systems that work very differently, and Kahneman refers to them throughout the book as characters he calls System 1 and System 2. System 1 is a marvel honed by millions of years of evolution that runs on automatic (and can’t be turned off). It’s a virtuoso at jumping to (usually correct) conclusions on the basis of very little information. It’s a master at coming up with shortcuts (heuristics) that usually work; we couldn’t get through a minute of our day without it. As Kahneman points out, most of what we know about System 1 would have “seemed like science fiction” 30 or 40 years ago. Unfortunately, System 1 can’t be reflective. It can’t know what it doesn’t know, but it always knows that it’s right. And because it works so much faster and more smoothly than System 2, it almost always overrules our more rational selves.
System 2 is generally clueless about System 1’s flaws. It’s too slow and inefficient to handle immediate matters; it consumes huge amounts of energy, takes effort and time, and requires a great deal of self-control. Since “laziness is built deep into our nature,” we mostly glide along on System 1. System 2 is supposed to be the overseer, the skeptic, the doubter, but it’s often busy and tired and defers to System 1, which is gullible and biased. In fact, System 2 is often an apologist for System 1. “Its search for information and arguments is mostly constrained to information that is consistent with existing beliefs,” Kahneman explains.
So if we believe something is true, we also believe just about any argument that could support it, no matter how fragile. This makes us supremely overconfident. In fact, people who hear only one side of an argument are even more confident than those who hear two. Hearing just one side makes it easier to create a sensible story on little information, because there are fewer pieces to the puzzle. Experts can be the worst, because people with the most knowledge often develop unrealistic illusions about both their own skills and the predictability of our world.
System 1 is almost always in charge, often up to mischief, and entirely beneath our “rational” radar. Repeat any lie often enough, and people will believe it. Words such as “war” and “crime,” because they represent immediate threats, attract more attention — and do it faster — than words like “peace” and “love.” Mention words relating to old people to young students, and they start walking more slowly (the Florida effect). Think about money and you become more selfish and reluctant to get involved with others.
Many of these glitches in rationality have serious consequences. For example: when we let loud, authoritative voices, even when they’re wrong, dominate discussions. Or when we make heroes of people who may have simply been lucky and are in fact stupid — what Kahneman calls the “halo effect.” Or “outcome bias,” when we blame people for good decisions that turned out badly due to unforeseen circumstances.
Kahneman demonstrates ways to blunt System 1’s antics by adopting a new vocabulary, and using it in everyday conversations — say, over the “water cooler,” as he puts it. Some examples: “All she is going by is the halo effect from a good presentation.” Or: “Let’s not fall for outcome bias. This was a stupid decision even though it worked out well.” Or: “This was a pure System 1 response. She reacted to the threat before she recognized it.” “Slow down and let your System 2 take control.”
Left to its own devices, System 1 thinking can make “fair” solutions to problems impossible to see, because each side is motivated more by loss than gain, and each feels the other side’s concessions as less painful than its own. These proclivities keep us from seeing the enormous role luck plays in both success and failure, and mess with our notions of cause and effect. Praise someone for good work, and they’ll probably do worse the next time. Criticize them for bad work, and they’ll probably do better. Does this mean that criticism works better than praise? No, probably not. Despite what System 1 tries to tell you, it’s simply “regression to the mean.” In other words: a predictable return to what's more or less normal.
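For readers who like to see the statistics at work, Kahneman’s regression-to-the-mean point can be checked with a toy simulation (the setup and all numbers here are hypothetical, not from either book): give each performer a fixed “skill,” make each performance equal skill plus luck, then pick out the worst and best performers on one trial and measure them again. The extremes drift back toward the middle with no praise or criticism involved.

```python
import random

random.seed(0)

# Hypothetical setup: each performer has a fixed skill,
# and each performance is skill plus random luck.
N = 10_000
skills = [random.gauss(0, 1) for _ in range(N)]

def perform(skill):
    # Luck is as large as the skill differences themselves.
    return skill + random.gauss(0, 1)

first = [perform(s) for s in skills]
second = [perform(s) for s in skills]

# The worst 10% on the first trial are the ones who'd get "criticized";
# the best 10% are the ones who'd get "praised."
order = sorted(range(N), key=lambda i: first[i])
worst, best = order[: N // 10], order[-N // 10:]

def avg(group, scores):
    return sum(scores[i] for i in group) / len(group)

# With no feedback at all, the worst group improves and the best
# group declines on the second trial: pure regression to the mean.
print("worst:", avg(worst, first), "->", avg(worst, second))
print("best: ", avg(best, first), "->", avg(best, second))
```

The flight instructors in Kahneman’s famous anecdote saw exactly this pattern and credited their shouting; the simulation shows the pattern appears even when nothing is done between trials.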
Both Guterl and Kahneman are ultimately optimistic, albeit with significant caveats. One big obstacle is simply that no one wants to admit how bad things are: “Most of us view the world as more benign than it really is, our own attributes as more favorable than they truly are, and the goals we adopt as more achievable than they are likely to be,” Kahneman writes. The good news is that our optimistic bias helps us persevere. With enough effort, we can overcome System 1’s irrationality. We should concentrate on situations where the stakes are high.
Guterl sees optimism as our only choice, and reminds us that the same creative (and rational) thinking that made the world a mess will be essential for setting it right. There’s no going back to some pre-scientific world. Yes, synthetic biology can make evil bugs, but it can also create bacteria that produce fuel, clean up dirty water, and develop cheaper and safer drugs. Science is simply knowledge; it’s science in the service of commerce that creates a lot of our problems. In the end, he sums it up nicely: “Humanity is a bold assertion, a derisive snort at nature. We’ve beaten the odds so far. To continue beating the odds will take every good idea.”
The good ideas we need are out there, but System 1 stops most of them from even entering our conscious awareness, because System 1 likes what it already knows. It essentially short-circuits our thinking.
Of course, we can circumvent System 1. For example: cut multitasking to a bare minimum. Spend more time staring into space. Walk, bike, or hike without headphones, so you can hear yourself think. Disallow news crawls during presidential debates. Cut way back on homework so students have time to actually reflect, and create assignments that demand every ounce of creativity. Ban timed tests. Ban high frequency stock trades. Reward failure. Dance. Hug. Put down the cell phone and look the bank teller, the grocery clerk, your dinner companion, in the eye. Reclaim our essential humanity.
In other words, put cognitive speed bumps everywhere we can to give System 2 a chance to get a word in; a chance to take crazy ideas seriously.
The late physicist Frank Oppenheimer, who spent most of his life trying to figure out how to contain the spread of the very nuclear weapons he helped create, once got a letter from a child suggesting we dump mountains of Jell-O on bad guys to stop them in their tracks. It might be just the thing for shutting down Iran’s nuclear reactors.
Okay, that sounds nutty. But maybe it’s not nutty enough.
At least, it’s something to think about. Slowly.