When Algorithms Are Cheaper than Humans: The Hidden Curse of Automation

By Clive Thompson | January 23, 2015

The Glass Cage: Automation and Us by Nicholas Carr

I AM A FEARFUL FLYER. The motion of planes leaves me so panicked and white-knuckled that my doctor long ago prescribed powerful drugs to manage my anxiety. They help a bit. But I also apply reason. When the flight gets rocky, I remind myself that flying is safer than it's ever been. The odds of dying in a plane crash are 5 million to 1. Planes are tightly regulated, pilots well trained, so hey, it's crazy to worry about flying, right?


It was thus a terrible idea to crack open Nicholas Carr's new book during a bumpy cross-country flight.


In The Glass Cage, Carr argues that the modern age of computerized automation is eroding our human skills. As digital tools have become more capable of doing the type of sensing, reasoning, and decision-making that was once the province of humans, they are taking over far too many of our daily tasks. This can be practical in the short term, because machines are more precise and less prone to error. But in the long run humanity suffers, because we stop practicing those same daily tasks. We become “deskilled,” with occasionally ruinous results.


One alarming example is air travel. In the early days of flight, planes had only manual controls, and crashes due to pilot error were common. High-tech autopilots soon emerged — the first ones as early as 1914 — and, being tireless and undistractible, these robot systems made plane travel much safer. Indeed, before long, they were omnipresent in commercial flights: by the late ’80s, Airbus had created the “glass cockpit,” a flight deck so heavily computerized that it had six screens. Today planes can fly and land almost entirely by themselves. Pilots mostly sit back and monitor the screens while the machine does the work.


Which, as Carr notes, is precisely the problem. Pilots who don’t actively fly planes get out of practice. They lose the deeper connection with the plane that comes from flying it manually, on an even keel, or landing it in rough weather. And deskilled pilots are dangerous, not least because they must take control during the most challenging, devilish situations — which is, of course, precisely the worst moment for an out-of-practice pilot to take over. When a commuter plane crashed near Buffalo in 2009, it appeared to be because the pilots had so lost their kinesthetic sense of what the plane was doing that they pulled it up into a stall. Months later, the pilots of an Air France Airbus appeared to make the same mistake. In each case, everyone aboard died. The glass cockpit becomes a glass cage.


This is the hidden curse of automation. Sure, it frees us from hard work. But hard work is how we develop, and retain, expertise, and the skill that comes with it. “What we should be asking ourselves,” Carr writes, “is, How far from the world do we want to retreat?” Or as the technology historian George Dyson once asked, “What if the cost of machines that think is people who don’t?”


In The Glass Cage, Carr dives deeply into aviation, a field where the side effects of automation have been more carefully studied than anywhere else. (Indeed, federal aviation officials are so spooked by the evidence that they recommend pilots fly manually more often.) But deskilling appears to be happening in other domains, too. In the financial world, studies have found that auditors who rely on automated decision-support systems have less nuanced understandings of risk than those who don’t use such systems. People who play the “missionaries and cannibals” game using only their own smarts develop a much better ability to solve the puzzle than those who use a computerized system that offers hints. Even Google’s search engine seems to have deskilled us: we used to type longer queries, but as the search engine has improved, we now type shorter ones.


Worse, automation seems to hit experts particularly hard. Since the 1990s, radiologists have used systems that auto-identify cancerous areas in mammograms. Studies find that these systems improve the cancer-detecting ability of less-capable doctors. But when expert doctors tackle really tricky diagnoses, such systems seem to degrade their ability. The experts get lazy and trust the machines too much, even when the machines are inferior. This, Carr suggests, is a curious and unsettling effect of automation. It helps the “dilettante,” the person who isn’t serious about acquiring skills. But it erodes the skills of the serious expert.


On one level, The Glass Cage is a simple argument about technical skills: if we don’t use them, we lose them. But it’s also, in a deeper and broader way, a cultural manifesto. What really worries Carr is that by making tasks easier, automation can diminish our very humanity. We become less dimensional as people.


GPS particularly unsettles him. Carr cites research showing that relying on GPS has eroded Inuit skills in navigating the Arctic, and that participants in a Japanese study who used paper maps to navigate a city formed more detailed memories of the areas than those who’d used GPS. Carr derides the egocentric style of GPS, which places you eternally at the center of the map, in a “miniature parody of the pre-Copernican universe.” This framing, he argues, severs your connection to your surroundings and leaves you with a weaker mental model of the world than the richer sense-making that comes from poring over a paper map. “Getting to know a place takes effort, but it ends in fulfillment and in knowledge,” he writes. “It provides a sense of personal confidence and autonomy, and it also provides a sense of belonging, a feeling of being at home in a place rather than passing through it.” In a profound sense, “the world recedes.”


This is a dramatic claim. It’s also a cultural one, and thus awfully hard to prove. Is it actually true that GPS so unbolts us from the framework of the world that we are unable to feel “at home in a place”? Keep in mind, Carr here is talking not merely about the occasional machine-led error many people experience, where a wonky GPS leads you down a wrong road. He’s making a deeply existential and even spiritual claim. This is where he loses me. While I hesitate to rely on “anecdata,” I simply don’t know anyone who seems so gormlessly disconnected from the geography around them. It is a poetic enough concern, but it doesn’t feel real.


It may be that we cannot so easily extrapolate from high-stakes skills to the more quotidian actions of everyday life. When it comes to flying a plane, it’s easy to see — quite objectively — the problems caused when automation deskills the pilot. Planes crash. But what about a more prosaic example? How about when everyday photography becomes easier — as with the profusion of digital cameras in smartphones, complete with software for automating previously onerous filtering techniques? Is that good or bad? Now we’re having a cultural debate, with no crashing plane, no objectively awful outcome. Partisans of analog film, one of whom Carr quotes, speak eloquently of the focus and attention that chemical photography demands; they describe how it makes them deeply attentive in the moment of capturing an image. Yet it’s also apparent that digital photography has produced a renaissance in everyday snapshotting: cameras are more easily accessible, so more photos are taken, and the surplus of editing tools encourages often-fascinating aesthetic experiments. I am routinely agog at the lovely or weird pictures on display in venues like Instagram, Facebook, or Flickr. Dilettantes, it seems, can also do delightful things.


At his deepest level, Carr is worried that automation will give us “fewer opportunities to demonstrate our own resourcefulness and ingenuity, to display the self-reliance that was once considered the mainstay of character.” A fair point, but it raises the question: What skills of self-reliance ought an everyday person to have, specifically? Or rather, which ones shouldn’t you lose, lest you begin to suck at the task of being human? This is very hard to assess, and Carr straightforwardly acknowledges it. Consider my perspective as an example. I’m a former Canadian Boy Scout, so I find it baffling that my urban friends don’t know how to light a fire in the driving rain, or how to quickly disinfect large quantities of water. For their part, many of my friends are foodies who find my indifference to cuisine deeply suspicious: Isn’t the ability to cook a sumptuous, four-course meal a central mark of a civilized and self-reliant individual? Aren’t I surrendering myself medically (to eating unhealthier, prepared foods), politically (to the industrialized, monocropped food empire), and aesthetically (to some awfully drab fare plopped out of a can)? Come the zombie apocalypse, of course, my neighbors and I might discover the pleasures not of self-reliance but of relying on each other.


Carr is not, as he often reminds us, opposed to all technology. Quite the opposite: he’s a romantic. “Technology is in our nature,” he notes. “Through our tools we give our dreams form. We bring them into the world. The practicality of technology may distinguish it from art, but both spring from a similar, distinctly human yearning.” What he advocates, though, are devices designed to enhance our humanity — to keep us in the loop, instead of passively watching the machine do the work for us.


We certainly could craft machines that more regularly require us to use our judgment, to tax and hone our skills. We could automate less. Boeing has already done this, designing force-feedback cockpit controls so pilots can more viscerally sense the behavior of the plane. And video games, as Carr adds, are often brilliant at striking this perfect midpoint. They automate many rote tasks (managing your ammunition or magic spells, for example) in the service of requiring us to master very challenging ones (establishing complicated situational awareness in a battle). The best machines are ones that confront us with a useful amount of friction and inefficiency. They give us problems to solve, reskilling us.


Will the firms that design our technologies — our phones, our cars, our office systems — heed Carr’s sensible message here? He worries they won’t, and I sadly agree. Silicon Valley types (and nearly all consumer industrialists) boast of how they make things “easy” and “frictionless” for us. The logic of capitalism, after all, is to stamp out inefficiency, certainly when it comes to labor. Algorithms are cheaper than humans. Airlines aren’t likely to shift back en masse to manual flying, because autopilots allow planes to fly closer together, saving money, as Carr grimly notes. Our glass cage was built by an invisible hand.


¤


Clive Thompson is a longtime contributing writer for The New York Times Magazine and a columnist for Wired.
