The Case for Kicking the Stone

Philip Ball finds Nicholas Carr’s “Superbloom: How Technologies of Connection Tear Us Apart” disturbingly compelling.

Superbloom: How Technologies of Connection Tear Us Apart by Nicholas Carr. W. W. Norton & Company, 2025. 272 pages.


HOW MUCH BETTER is the world since the arrival of what Nicholas Carr calls our modern “technologies of connection”—cell phones, personal computers, the internet, social media, artificial intelligence? As we watch these systems undermine democracy, flood our lives with misinformation and deepfakes, transform our children into screen-obsessed zombies, and threaten to eradicate us entirely, we might be tempted to respond with a hollow laugh.


But to what extent are these bleak scenarios real? What about being able to navigate by GPS almost anywhere in the world, or call for help when stranded in the middle of nowhere, or stream any song we desire? The cost-benefit calculation is complicated and nuanced, requiring us to steer a course between apocalyptic visions of civilizational decline and the naive utopianism of Silicon Valley.


Carr established himself as an astute commentator on information technology in his 2010 book The Shallows: What the Internet Is Doing to Our Brains, in which he argued that Google and the internet are, in the words of his formative essay in The Atlantic, “making us stupid.” In his new book Superbloom: How Technologies of Connection Tear Us Apart, he expands that analysis to encompass social media and digital communication technologies, asking how they are changing us individually and collectively. As the book’s subtitle implies, the diagnosis is not promising. But if these systems are indeed tearing us apart, the reasons are neither obvious nor simple. Carr suggests that this isn’t really about the evil behavior of our tech overlords but about how we have “been telling ourselves lies about communication—and about ourselves.”


Carr takes his title from an event in 2019, when weather conditions produced an unusually abundant bloom of poppies in a canyon near Los Angeles. When an Instagram and YouTube influencer posted pictures of herself among the flowers, the hashtag #superbloom went viral, and before long, the place was overrun, a traffic officer was injured, online discourse curdled, and the media began speaking of “Flowergeddon.”


Carr argues that the #superbloom event exemplifies the problems with these new media. “We spend our days sharing information, connected as never before, but the more we communicate, the worse things seem to get.” Isn’t that just the opposite of what was supposed to happen?


To understand what went wrong, Carr takes the long view. Telecommunication in the literal sense—near-instantaneous communication over long distances—is an ancient dream. The German abbot Johannes Trithemius of Sponheim claimed in 1499 that he knew how to convey thoughts over vast distances—“by fire” with the assistance of angels. We shouldn’t scoff. Telecommunication has always been accompanied by notions of what we would now call the supernatural. Sending radio transmissions through the ether excited speculations about telepathy and unseen spirit worlds even among some scientists of the Victorian Age; the first televisions inspired ghost stories, and the internet has its own spooky mythology. “Every new medium,” wrote social theorist John Durham Peters, “is a machine for the production of ghosts.”


Communications technologies have psychic as well as practical and social consequences. Invisible influences arouse age-old fears and failings. The way internet anonymity unleashes the worst in some users has been dubbed the Gyges effect, after the tale told by Plato in The Republic about a shepherd who found a ring of invisibility and used it to become a tyrant. Plato’s point—that no person can be trusted to resist corruption if they cannot be held accountable for their actions—seems more relevant than ever, whether we’re talking of lonely trolls in bedrooms or Elon Musk.


The central issue, Carr implies, is that we have tended to suppose that new technologies for communication are either neutral media for making what was once laborious and expensive cheaper and easier, or positive developments that, by putting us ever more in touch with people and information, lubricate social discourse and make us more rational. This was the message pushed by advocates since the early days of radio, if not earlier: more information, and easier access to it, would lead to greater democratization. It was a nice story, and many tech entrepreneurs probably believed it—but only because they were ignorant of what social and political scientists had long been saying.


In his 1922 book Public Opinion, Walter Lippmann pointed out that because the real world is too big and complex to grasp, we navigate it using crude models of reality constructed from stereotypes. As a result, according to Christopher H. Achen and Larry M. Bartels in their 2016 book Democracy for Realists: Why Elections Do Not Produce Responsive Government, “the political ‘belief systems’ of ordinary citizens are generally thin, disorganized, and ideologically incoherent.” Though dismissed as an antidemocratic elitist, Lippmann wasn’t suggesting that people are too stupid to handle all that information, merely that they are too finite. “Well before the net came along,” says Carr, “[the] evidence was telling us that flooding the public square with more information from more sources was not going to open people’s minds or engender more thoughtful discussions. It wasn’t even going to make people better informed.”


By the age of modern information technologies, there was abundant evidence to support that view. But the inventors of these technologies didn’t care, or, more probably, didn’t know. (“One of the curiosities of the early 21st century is the way so much power over social relations came into the hands of young men with more interest in numbers than in people,” Carr writes—but in fact, their interest in most things, besides profit, tends to be extremely shallow.) The prevailing attitude was the one advanced by the philosopher and educator John Dewey in his 1927 book The Public and Its Problems: he imagined a “Great Community” united by a free and open system of communication that would usher in a utopia of harmony. “We live in its ruins,” says Carr, “overwhelmed by the information that was meant to enlighten us, imprisoned by the data that describe us.” How and why did this happen?


The term “social media” was coined by a reclusive sociologist named Charles Horton Cooley in an 1897 essay in which he pointed out that communication media do much more than channel information. They establish social norms and beliefs, mediate praise and blame, and entrench power hierarchies. As a result, says Carr, “every communication medium is political.” It determines what we see and know, and shapes it in the process. Why, in the age of Donald Trump, would we even need telling that? It was obvious as far back as Kennedy, Nixon, and Reagan. Yet this plain truth collides with the rhetoric of the moguls who control these channels today, whether it’s the pathologically callow Mark Zuckerberg or the manipulative hypocrite Elon Musk: information will set us free. Even if social media really were nonpartisan, they would still threaten rather than promote democratization.


At root, we’re the problem. Our minds don’t simply distill useful knowledge from a mass of raw data. They use shortcuts, rules of thumb, heuristic hacks—which is how we were able to think fast enough to survive on the savage savanna. We pay heed, for example, to what we experience most often. “Repetition is, in the human mind, a proxy for facticity,” says Carr. “What’s true is what comes out of the machine most often.”


Famously, psychologist Daniel Kahneman argued that, besides this “fast thinking” module, we also have a “slow thinking” one that is more deliberative and weighs the evidence more carefully. But social media are designed to undermine the slow mode by eliciting “fast” responses, sending us chasing one dopamine hit after the next. Take the retweet button, invented by a Twitter team led by developer Chris Wetherell. By making it so easy to amplify posts to audiences of thousands or millions at once, without any need to ask whether the posts are true (compared with, say, passing them on to friends in a phone call), the function is a megaphone for misinformation. To his credit, Wetherell eventually rued his invention, recognizing that implementing the button was like “hand[ing] a 4-year-old a loaded weapon.” But it was good for business.


The central problem, however, is that an onslaught of information—of everything, all at once—flattens all sense of proportion. When Zuckerberg said to his staff that “a squirrel dying in front of your house may be more relevant to your interests right now than people dying in Africa,” it’s not that his tone-deaf observation was untrue but that, as Carr says, he was making a category error, equating two things that cannot be compared. Yet “social media renders category errors obsolete because it renders categories obsolete. All information belongs to a single category—it’s all ‘content.’” And very often, the content that matters is decided in the currency of commerce: content is “bad” when it harms profits.


This flattening reconfigures our world. Electronic media “destroy the specialness of place and time,” wrote professor of communication Joshua Meyrowitz in 1985. “[W]hat is happening almost anywhere can be happening wherever we are.” Cultural philosopher Jean Baudrillard called a world that had dissolved into information and communication “hyperreality.” Nothing is “real” any more that is not a part of that world. We enter what he called “the ecstasy of communication,” where there is only a “smooth and functional surface” with nothing beneath it. There was no superbloom, only #superbloom. Reality is remodeled to fit with what hyperreality offers. Deepfakes and viral conspiracy theories obscure not just what is real but also what can be real; “strangeness itself becomes a criterion of truth,” says Carr.


Are these gloomy forecasts about the modern information ecosystem just reiterations of old fears? Plato lamented that writing would erode our minds; the printing press was denounced as a diabolical device. Newspapers were accused of peddling filth and debasing public morality; television was going to rot our minds.


But Carr makes a persuasive case that this time is different. With older media, the friction of the interface provided some space for reflection and for ranking significance. What was on the front pages or what led the news bulletins was what we heeded most. Music had to be sought out and didn’t come with an infinite stream of more. Digitization has become a “universal solvent” for all information, fed to the same device on the same platform with a convenience and ease that becomes a curse. We have evolved to seek, says Carr, but with the internet, there is no natural curb to that desire, and never any sense of satiation. Reality can’t compete with the internet’s steady diet of novelty and shallow, ephemeral rewards. The ease of the user interface, congenial even to babies, creates no opportunity for what writer Antón Barba-Kay calls “disciplined acculturation.”


Not only are these technologies designed to leverage our foibles, but we are also changed by them, as Carr points out: “We adapt to technology’s contours as we adapt to the land’s and the climate’s.” As a result, by designing technology, we redesign ourselves. “In engineering what we pay attention to, [social media] engineers […] how we talk, how we see other people, how we experience the world,” Carr writes. We become dislocated, abstracted: the self must itself be curated in memeable form. “Looking at screens made me think in screens,” writes poet Annelyse Gelman. “Looking at pixels made me think in pixels.”


Carr is no conservative or Luddite: he sees, for example, how Gen Z social-media-speak is witty, inventive, fun. But it, too, is flattening, inviting fast, thoughtless interaction. It is hard to imagine some of the infantile communications of UK government officials revealed by the country’s COVID-19 inquiry happening by letter or written memo rather than on WhatsApp. Only after such an erosion of standards in public discourse could the misinformation propagated about the official response to Hurricane Milton—lies, embraced by the target audience, about lives and livelihoods lost—become possible. “[W]hen did that become okay?” Barack Obama furiously demanded. It became okay once social media allowed us to choose which reality we preferred.


Let’s put a date on it: 2016, the year of Trump and Brexit, when politicians realized that misinformation only has to last long enough to get them into office. Of course, communication technologies have been agents of power ever since Martin Luther took advantage of the printing press, and even more so with mass media. “It would not have been possible for us to take power or to use it in the ways we have without the radio,” said the Nazis’ chief propagandist Joseph Goebbels in 1933, and it is no longer hyperbolic to substitute Musk for Goebbels and social media for radio. As Stanford University professor of communication Fred Turner has written, “The technologies of decentralized communication […] can in fact be made cornerstones of [authoritarian] power.” And that merely assumes that would-be dictators know how to exploit such media for themselves; imagine what they could do if the media’s owners actively supported the process. Except that now we don’t have to imagine this scenario, do we?


The temptation to blame every current sociopolitical failing on communications technologies should be resisted, though, and just occasionally Carr’s argument goes beyond the evidence. He accepts the contested claim that false stories spread faster than true ones and cites social psychologist Jonathan Haidt’s disputed (if plausible) argument that social media can be identified as a cause and not just a correlate of depression and anxiety in young people. Carr’s claim that “open-ended, contemplative ways of thinking—the philosophical, the ruminative, the introspective—have been marginalized” warrants further interrogation. And when he quotes media scholar Ian Bogost as saying that social media offer only a “sociopathic rendition of human sociality,” one has to ask: Is that really all they offer? Do we not initiate and cultivate friendships this way? Didn’t communication technologies help relieve the isolation of pandemic lockdown? Additionally, Superbloom is heavily predicated on the American experience. For many people globally, hyperreality offers no escape from hardship, drudgery, peril, and war.


All the same, the case Carr makes is compelling. Is there an antidote? He does not believe we can simply reshape and constrain the technologies. It is too late for that—it would be like laying a path across a park that no one wants to follow. That’s not to say that we can’t have better laws and regulations, checks and balances. One suggestion is to restore some friction to these systems. One might, for instance, make it harder to unreflectively spread lies by imposing small transaction costs, as has been proposed to ease the pathologies of automated market trading. An option Carr doesn’t mention is to require companies to perform safety studies on their products, as we demand of pharmaceutical companies. Such measures have already been proposed for AI.


But Carr doubts that increasing friction will make much difference. And placing more controls on social media platforms raises free speech concerns. Whose values decide the rules? One need only consider the effect of state restrictions in China, which show that facts can be suppressed, history distorted, and political historian Timothy Snyder’s notion of “anticipatory obedience” inculcated.


We can’t change or constrain the tech, says Carr, but we can change ourselves. We can choose to reject the hyperreal for the material. We can follow Samuel Johnson’s refutation of immaterialism by “kicking the stone,” reminding ourselves of what is real. It is an inspiring rallying call, and Superbloom shows us what is at stake—but with market forces, peer pressure, and our own instincts ranged against us, this might be easier said than done. As Richard Powers wrote in his 1998 novel Gain: “People want everything. That’s their problem.”

LARB Contributor

Philip Ball is a science writer and author. His latest book is How Life Works: A User’s Guide to the New Biology (2023).
