Can the Internet Be Saved?

After troubleshooting Tim Berners-Lee’s memoir, it becomes clear that the internet’s flaws were there from the start.

This Is for Everyone: The Unfinished Story of the World Wide Web by Tim Berners-Lee. Farrar, Straus and Giroux, 2025. 400 pages.


THE INTERNET IS getting old. Nearly four decades have passed since an unassuming British programmer named Tim Berners-Lee invented the World Wide Web in his office at the CERN accelerator complex near Geneva. By transforming the net, until then an arcane research hub for scientists and engineers, into a buzzing, hyperlinked, multimedia information system, Berners-Lee unwittingly sparked the public’s mass migration online. He also, as he writes in his memoir, This Is for Everyone: The Unfinished Story of the World Wide Web (2025), opened the gates for those who “might corrupt what [he] was trying to build.”


Born to two mathematicians in 1955—the same year, he notes, as Steve Jobs and Bill Gates—Berners-Lee was an unusual kid. Growing up in the quiet London neighborhood of East Sheen, not far from the Thames, he didn’t listen to rock music or follow sports teams or watch television, and he remained oblivious to the excitements and upheavals of the 1960s. “I missed out on pop culture more or less entirely,” he writes. Fascinated by the workings of electronic circuitry, he spent his teens building model railroads and cobbling together gadgets, including a rudimentary computer, from scavenged parts. An outstanding student, he entered the Queen’s College at Oxford on a scholarship and went on to earn a first in physics. The door to an academic career was open, but he chose instead to pursue his passion for tinkering by becoming a writer of software code.


Berners-Lee took on a series of short-term coding assignments before landing a more permanent job at CERN in 1984. As a member of the Data and Documents division, he wrote programs for capturing detailed digital images of atoms colliding. He was well liked by his colleagues, though they often found it difficult to follow his rapid-fire, abstract style of speech. They would hold up signs that read “Tim, slow down” when he took the floor at meetings. When, early in 1989, he began talking up his idea for the World Wide Web—he originally called it “Mesh” but decided that sounded too much like mess—he described it as a way for researchers using incompatible computers to share experimental results and other data via documents uploaded to the internet and connected with hyperlinks. It was a tool, he explained, for “intercreativity.” His pitches were met with bafflement and indifference. “[F]ew of us if any could understand what he was talking about,” his boss at the time recalled. His written proposals, illustrated with elaborate diagrams, went unread.


His break came in October 1990 when CERN decided to test one of the powerful new “cube” computers produced by NeXT, the company Steve Jobs formed after being ousted from Apple in 1985. Berners-Lee was chosen to “kick the tyres” of the machine, with the suggestion that he use it to work on the “hypertext project” he couldn’t stop talking about. In an intense, one-man hackathon over the ensuing two months, he wrote the foundational code that governs the design and functioning of websites and browsers. Just before Christmas, he posted the world’s first web page on a CERN server. A year passed before people outside CERN noticed his invention. Then it exploded, becoming one of the most rapidly adopted technologies in history. Soon, “web” was a synonym for “internet,” and Berners-Lee was famous.


The inventor moved to the United States, where excitement about cyberspace had been mounting for years, and joined the faculty at the Massachusetts Institute of Technology. MIT had agreed to host the World Wide Web Consortium, an independent standards-setting organization Berners-Lee set up in the hope of ensuring that the web’s protocols remained open, free, and unsullied by the profit motive. He wanted his system, he made clear, to be geared to the needs of individuals and small groups, not those of big businesses. But even as the W3C, as the consortium was called, began hosting conferences and issuing guidelines, the web’s enormous commercial potential started drawing the interest, and the money, of venture capitalists, entrepreneurs, and corporations. He soon found himself battling them for control of his invention’s fate.


Berners-Lee’s first nemesis was an unlikely one: a husky Midwestern kid named Marc Andreessen who had his own ideas about how the web should work. As an undergrad at the University of Illinois at Urbana-Champaign, Andreessen and a friend created a new browser called Mosaic, which incorporated, over Berners-Lee’s objections, an unsanctioned code for displaying images directly on web pages. The innovation proved popular, and Mosaic quickly became the most widely used browser. The only problem for Andreessen was that Mosaic was owned not by him but by the university lab that funded his work. Sensing an entrepreneurial opportunity of epochal proportions, he decamped to Silicon Valley and joined forces with a wealthy investor, Jim Clark, to launch the browser company Netscape. With many new and attractive features, the company’s browser, Navigator, quickly displaced Mosaic as the market leader. When Netscape went public in 1995, Andreessen made millions and Clark made even more. The internet gold rush was on.


While Berners-Lee would continue to win awards and accolades, culminating in his knighting by Queen Elizabeth II in 2004, he became a peripheral figure in the story of the web’s transformation into a global thoroughfare for commerce and communication dominated by a handful of very large, very profitable, very American companies. His dream of building a decentralized system that would give people “the ability to make links around the world” without “ensnaring them in dead-end, anti-human materialism, or systems of surveillance, coercion and control,” went unfulfilled. The internet giants continued donating cash to the W3C, but they paid little heed to its strictures or goals. Rather, the businesses themselves set the rules for how the web operates, and their rules made a mockery of its inventor’s ideals. Far from being a shield against materialism, surveillance, and control, the net became their instrument.


Berners-Lee hasn’t given up hope. If anything, he seems more optimistic than ever, seeing “signs of spring” sprouting everywhere. He rhapsodizes about a future internet that, having incorporated artificial intelligence and virtual reality systems, escapes the computer and the phone to become a permanent “overlay onto the physical world […] a kind of seamless informational filter on top of reality.” We will be wrapped up in it even more than we are already—a prospect he welcomes.


The “new web” will also be freed from corporate control. “It will be decentralized” and will grant its users “data sovereignty.” Thirty-five years of history will be erased, in other words, and Berners-Lee’s original vision for the web will at last come to pass. His account of how this will happen in the face of powerful, entrenched business interests is a little sketchy, but he suggests that social media companies and other internet firms will come to adopt a new set of technical protocols and operating constraints that effectively destroy the data collection, customer-profiling, and behavior-prediction systems at the core of their business strategies and profitability. They will put Berners-Lee’s ideals ahead of their own business interests.


Inventors are rarely able to see their inventions objectively, just as parents are rarely able to see their children objectively, but Berners-Lee seems particularly impaired. His focus on protocols and codes may be understandable for a programmer, but it has blinded him to a deep and uncomfortable truth about his invention. The web wasn’t corrupted by outside forces. The corruption was there from the start, latent in its design. A vast, decentralized communication network that can transmit data of all sorts to all people is not resistant to the establishment of information monopolies; on the contrary, it encourages their formation. Decentralization at a technical level breeds centralization at an industrial level. The reasons are manifold:


1. Such a network is subject to particularly strong network effects. Because a communication system becomes more valuable to users as the number of users increases, a boundaryless network with few physical or technical constraints on its expansion will consolidate traffic on a massive scale, giving strong advantages to the biggest players.

2. The net is also a marketplace where an unimaginable number of transactions, both financial and social, are completed every second without regard to the physical location of the participants. That favors large intermediaries, or middlemen, that have the infrastructure necessary to host myriad market participants, execute transactions quickly and precisely, and maintain detailed records of all that transpires.

3. Operating at such scale requires large capital investments—for servers, storage drives, networking gear, cooling systems, and the like. The capital requirements present daunting barriers to entry for newcomers, barriers that are growing more forbidding as resource-intensive AI routines are incorporated into online processes and services.

4. The interpersonal links that Berners-Lee rightfully celebrates for their intellectual and social value also have outsize financial value when captured as data and analyzed by computers. The network effect applies not just to people but also to information about people. The more aggregated the information, the bigger an asset it becomes for companies that profit by predicting attitudes and behavior.

5. Consumers benefit from the companies’ scale. Whatever fears people may have about privacy invasions or wealth concentration, they enjoy the personalization, convenience, and diversion that social media companies and other internet outfits serve up in endless quantities for free. People’s loyalty to algorithmically tuned feeds may be a form of addiction, but it doesn’t seem to be an addiction many are eager to break.

Berners-Lee may be loath to admit it, but the web he designed rewards both massive scale and centralized control.


He barely mentions the only practicable way to curb the size and power of the internet giants: breaking them up through aggressive antitrust litigation. Such a course seems unlikely for the moment, as Congress and the courts have for years taken a passive approach to antitrust enforcement, leaving control of monopolies to the marketplace. Late last year, a federal judge threw out a long-running, much-watched Federal Trade Commission suit seeking to split up Meta, ruling that the company’s social media platforms face sufficient competition for people’s attention from other popular platforms like YouTube and TikTok. But, as the legal scholar Tim Wu explains in The Age of Extraction: How Tech Platforms Conquered the Economy and Threaten Our Future Prosperity (2025), governmental antitrust philosophies tend to shift with public opinion. The robber barons of the Gilded Age were brought to heel by celebrated “trustbusters” like Teddy Roosevelt at the turn of the 20th century. Later in the century, AT&T, IBM, and Microsoft all saw their market power constrained by government action. Should public outrage over the financial and political power of internet companies grow in the years ahead, antitrust suits may become more common and successful.


But while breaking up companies like Meta and Google could well bring economic and social benefits—Wu makes a compelling case that the unconstrained growth of monopolies undermines democracy—it wouldn’t change the way the web works. An Instagram spun off from Meta would still be Instagram. It would still make money by, to use Wu’s terminology, extracting attention and data from the masses and using that raw material to feed a lucrative advertising business. The companies that rule the web might not be quite as large as they used to be, but they’d still do business in much the same way. And they’d still rule the web.


This Is for Everyone ends with an idyllic anecdote. Berners-Lee and his third wife, the entrepreneur and philanthropist Rosemary Leith, take their 14-foot Hobie catamaran out for a sail around a Canadian lake on a fine fall day. Leith takes the tiller while Berners-Lee sits on the trampoline between the hulls, manning the mainsheet. He feels a sense of exhilaration as the little boat catches the wind and zips across the lake. “While the physical activity and constant adjustments kept us engaged,” he writes, “there were also moments of pure enjoyment. The feeling of the sun on one’s back, the sound of water rushing past, the wind in the sails and the sensation of gliding over the waves can create a deep glow of connection with nature.”


Then, abruptly, Berners-Lee breaks the reverie to tell us, for the umpteenth time, why his invention is such a gift to us all:


Even on my sailing boat, in the middle of the water on a sunny afternoon in autumn, I could use my smartphone to contact almost anyone alive. I could deliver a package to my doorstep, or a hot lunch to my office. I could buy Rosemary flowers. I could listen to any song ever recorded and watch old clips from Fawlty Towers. I could do my banking or trade my portfolio or donate to UNICEF. I could read the news in any language and check the weather in any country. I could book a flight, a hotel room and a car in nearly any city in the world. All I needed was a web browser and phone reception.

So much for that connection with nature.


Not all technologies improve people’s lives. Just as Berners-Lee’s now-omnipresent web shapes industries and markets, it molds its users’ thoughts, perceptions, and relationships. As we’re slowly coming to understand, human beings did not evolve to be virtual creatures in a computer-generated world. The internet operates at a scale and speed that conflict with the brain’s deliberate pace of thought, the intellect’s slow accumulation of knowledge, and the psyche’s limited capacity for stimulation and social exchange. To be able to do anything and be anywhere at any moment seems liberating for a while, but it ends in a blurred and chaotic existence, the physical world’s familiar, steadying divisions of space and time dissolving in endless torrents of data. It’s an existence that may be vivifying to certain software programmers—Tim, slow down!—but for the rest of us, the virtual world’s hyperkinetic superabundance ends up feeling like emptiness: a very, very busy void. We may be drawn to that void by our native attraction to information, novelty, and spectacle, but we’ll never make a home there.


It’s hard not to admire Berners-Lee’s ability to sustain his idealism about the web in the face of so many disappointments. His optimism, with its childlike resiliency, is winning. But we should be wary of it. The qualities of the web he celebrates unquestioningly are the very qualities we should be questioning. What he can’t imagine, or at least won’t allow himself to imagine, is the possibility that his gift to humanity is more bane than boon.

LARB Contributor

Nicholas Carr is the author of The Shallows: What the Internet Is Doing to Our Brains (2010) and, most recently, Superbloom: How Technologies of Connection Tear Us Apart (2025).
