The Bargaining Chips Are … Chips: On Chris Miller’s “Chip War”

October 4, 2022   •   By W. Patrick McCray

Chip War: The Fight for the World’s Most Critical Technology

Chris Miller

ACCORDING TO CANONICAL accounts, several key milestones have shaped the history of computing, at least when viewed from the United States. These typically start with the year 1937, when the British mathematician Alan Turing published a paper that proposed building a “universal computing machine.” In that same year, American engineer Claude Shannon suggested that electrical switches set in an “on” or “off” state could perform logic operations using Boolean algebra. In the next decade, these two strands came together when the first programmable digital computers — machines like the ENIAC, completed in Philadelphia in 1945 — were used for military applications. Meanwhile, in 1947, researchers at Bell Labs invented the transistor. These solid-state devices, crafted out of semiconducting materials like germanium and silicon, soon replaced the larger and more power-hungry vacuum tubes that had served as the logic elements in early digital computers.

Jump ahead a decade or so to the late 1950s, and two engineers at two different companies — Jack Kilby at Texas Instruments and Robert Noyce at Fairchild Semiconductor — independently invented the integrated circuit. This device featured an ensemble of electronic components, including transistors, crafted onto wafers (“chips”) of semiconducting material. This chronology of milestones also features a 1965 observation by Gordon Moore — who would cofound Intel with Noyce in 1968 — that engineers were cramming more and more transistors into increasingly cheaper integrated circuits. Moore predicted that this pattern was likely to continue for years to come. In time, Moore’s Law, which was based on economics and industry practice and not physics, set the expected relentless pace for the global electronics industry: ever smaller and more powerful integrated circuits and microprocessors started appearing in the first personal computers (ca. 1974) as well as every other conceivable type of consumer product from cars to kitchen appliances to sex toys. As you read this, you probably have billions of transistors within easy reach — there are 15 billion of them on Apple’s A15 chip inside a newish iPhone — making these nanoscale on-off switches the most produced commodity in all of human history.

In Chip War: The Fight for the World’s Most Critical Technology, historian Chris Miller centers his survey of this technology around a different ensemble of key events. These have less to do with technological or cultural milestones than with geopolitical shifts in the global semiconductor industry itself. For example, we learn that in 1968, industry giant Texas Instruments opened its first semiconductor fabrication plant (“fab” in industry parlance) in Taiwan. Seemingly innocuous corporate decisions like this one presaged a wholesale shift in the entire electronics industry as the centers of production moved from the United States to East Asia and, worryingly for analysts today, China. We also learn that 1984 was another pivotal year: it saw the founding of Advanced Semiconductor Materials Lithography (ASML), a multinational company headquartered in the Netherlands. Today, ASML holds a monopoly on the most advanced lithography machines, which can cost $200 million or more and are essential for making the latest computer chips. If ASML were to magically vanish, global chip making would stop.

By focusing on the rise and contemporary state of the global semiconductor industry, Miller’s book offers a retelling of the history of computing itself. His general approach might be likened to Daniel Yergin’s Pulitzer-winning book, The Prize (originally published in 1990), which set out to give the definitive account of the global petroleum industry from the 1850s onward. My advance copy of Miller’s book has an endorsement from Yergin. Over the years, experts have christened computer chips “the new oil”: they are a strategically important global commodity essential for the functioning of contemporary society. Nation-states have gone to war over oil many times. The same may come to be true with chips.

The comparison between oil and chips makes sense up to a point. Many commodities — uranium, lithium, data, etc. — have been likened to petroleum. But, as skeptics have countered, especially since the Russia-Ukraine war began in February 2022, “oil is the new oil.” More importantly, oil and chips differ in one key sense. Oil is a natural resource found in abundance in specific regions and, most would argue, of finite supply. Computer chips are made by humans (and their corporations) in specific regions, while silicon is the second most abundant element in Earth’s crust. As Patrick Gelsinger, Intel’s current CEO, says, “God decided where the oil reserves are, we get to decide where the fabs are.” In other words, anxiety over where chips are made, and where they are not made, is a problem that people — business executives and state leaders, specifically — have created, not nature. Moreover, few countries dominate oil production the way that nations like Taiwan and South Korea dominate chip making. The United States, the world’s largest petroleum producer, is responsible for only about 15 percent of the global oil supply. In contrast, Taiwan builds a significant fraction of the world’s chips (around 65 percent), including chips for almost all of the most advanced products in the global market (over 90 percent).

Nonetheless, Yergin’s book (and perhaps Sven Beckert’s epic narrative about cotton) offers some points of comparison with which to evaluate Miller’s book. Chip War is fundamentally about business and statecraft. Central to the narrative are the inventions and innovations surrounding a new commodity, its incorporation into every facet of human life, and tales of corporate courage and cowardice. Unlike many histories of computing and information technologies, Chip War is not about what happens after the chips get made. Like The Prize, Miller’s Chip War is a story that centers on production, not consumption, while presenting an account of how chips became a strategically vital resource whose importance is overlooked at our peril. It is on this latter point, I think, that Miller has placed his own chips. His bet has largely paid off.


In August 2022, House Speaker Nancy Pelosi visited Taiwan. It was a short trip — less than two days — but it prompted an aggressive response from nearby China, including days of military exercises. These events cast into stark relief how much the United States and other countries rely on Taiwanese chips. A significant portion of these devices come from one company: Taiwan Semiconductor Manufacturing Company (TSMC). It’s a little-known corporation started in 1987 but one whose products are found in the pockets and on the desks of hundreds of millions of people. The company’s fabs can make cutting-edge chips with features as small as five nanometers (a nanometer is one-billionth of a meter; five nanometers is roughly 1/20th the diameter of a COVID-19 virus particle), with three-nanometer chips coming this year from TSMC and rivals like Samsung. In comparison, a few of the most advanced Chinese fabs are just starting to make chips with seven-nanometer features, but experts generally agree that most of that country’s chip factories churn out products that are technologically behind what the leading manufacturers are making. More than size, what’s significant is that TSMC makes so many chips — about 55 percent of all the chips produced worldwide. For comparison, the United States is responsible for only about 12 percent of chips made annually, with China making somewhat less than that.

Miller surveys how the manufacture of semiconductor chips migrated from the United States, which dominated through the 1970s, to other key locales. I wished, however, for a more sustained and finer-grained analysis of the actual processes at play in this shift. To be fair, its absence may well be due to the limitations of historical sources. Corporate archives are notoriously hard to get into, and gaining access, when granted, is often circumscribed. And then there is the question of languages. Miller, an associate professor of international history at Tufts University, has previously written books about the Soviet Union and Russia. Diving deeply into the global history of computer chips would, ideally, require at least an ability to work with Chinese, Korean, and Japanese sources, where the same caveats about access noted above apply but are amplified. Or it would entail a bevy of co-authors.

Chip War thus provides a top-level rather than intimate or insider view of events as they unfolded. For example, Texas Instruments’ 1968 decision to start manufacturing chips in Taiwan (discussed in the twelfth chapter) planted the seeds for the eventual emergence of Taiwan’s TSMC. A few paragraphs later we learn that chip making started around the same time in places like Singapore, Hong Kong, the Philippines, and Malaysia. Meanwhile, Lee Byung-chul, the founder of rival Samsung, started out as a purveyor of dried fish in the 1930s. The company started making electronics in the 1960s, but it was only in the 1980s that Samsung pivoted to semiconductor manufacturing. I would have welcomed an even more detailed analysis of the corporate thinking and actions that made the emergence of these industry superstars possible. Perhaps wanting more is a sign of a successful book.

But it might also be a sign of pressure by the publisher (in this case, Scribner) to write a fast-paced narrative that keeps the reader flipping pages. Miller’s success in this regard is a feature as well as a bug. The downside to a 350-page book (excepting notes) containing 54 chapters is that it’s easy to experience a sense of temporal, topical, and spatial dislocation. Even though the book is structured roughly chronologically, there is little continuity in actors and action from one short chapter to another. There are, however, recurring themes to help connect some of the dots.

One theme revolves around the critical role played by state support in the emergence of chip-making capacity in every country. In the United States, for example, the leaders of Silicon Valley famously extolled the virtues of unfettered markets and free enterprise. As Apple cofounder Steve Jobs said in a 1996 reality-distorting conversation with the Clinton White House, “Silicon Valley doesn’t traditionally look for handouts.” This was, of course, bullshit. Semiconductor manufacturers, especially in their early years, relied on huge government contracts from NASA and the Department of Defense. Miller notes in passing that the US government purchased almost all of the integrated circuits that Texas Instruments and Fairchild produced in the early 1960s. Margaret O’Mara’s book The Code explores this reliance on Uncle Sugar in much more detail, showing how Silicon Valley’s fabulous success was built not so much on clever computer programs as on relaxed tax regulations and revised immigration codes.

This pattern of state support was repeated in every other locale where semiconductor manufacturing took root, from Hong Kong to South Korea to the Soviet Union (a notable story of failure) and, most recently, China. Huawei, an electronics company based in Shenzhen, has received some $75 billion in state subsidies, a fact which prompted the Trump administration to impose a series of semi-successful trade sanctions on the company. Miller, a visiting fellow at the American Enterprise Institute, is more critical of the Obama administration’s lackadaisical attitude (“almost everyone had drunk their own Kool-Aid about globalization”) towards China than he is about the “China hawks” in the subsequent administration. Some experts, however, disagree, claiming that aggressive Trump-era policies did not achieve their goals, but instead intensified confrontation.

Another animating theme in Chip War is how much chip making was and remains a transnational enterprise. Although it’s not discussed in the book explicitly, the invention of the transistor by three American physicists (John Bardeen, Walter Brattain, and William Shockley) in 1947 came after decades of basic research in solid-state physics and quantum mechanics. This was conducted at university and corporate laboratories throughout Europe, the United States, the Soviet Union, and elsewhere. It was the US, however, with its labs and industries unscathed by World War II, that was able to capitalize on and commercialize this knowledge.

This pattern continued throughout the 20th century. One of the most compelling characters in Miller’s book is Morris Chang. Born in mainland China in 1931, he came to the United States in 1949 to attend Harvard and soon transferred to study engineering at MIT. After finishing his degree, he took a position at Texas Instruments in 1958. A decade later, Chang was one of the catalysts in his company’s decision to start manufacturing chips in Taiwan. In 1987, Chang moved there and, with extensive state support, founded TSMC. Similarly, Intel’s former CEO, Andy Grove, was born in Budapest, Hungary, as András Gróf before coming to the United States in 1957. The story of the semiconductor industry is undeniably one of cross-border flows of people and information, with the latter sometimes moving about in ways that legal authorities have found suspicious. Miller, a Russian expert, details the Soviet Union’s botched attempts to copy American chip technology during the Cold War. At one point, engineers for Digital Equipment Corporation, the maker of the VAX line of minicomputers, inscribed microscopic “Easter egg” messages in Russian on its chips (“VAX — when you care enough to steal the very best”) that mocked the proclivity of their counterparts to purloin American technology.

Humorous as such pranks were, chips were central to the United States’ efforts to defeat the Soviet Union during the Cold War. They were the foundation of what William Perry, a Silicon Valley research manager who served as President Jimmy Carter’s Under Secretary of Defense for Research and Engineering, proposed as the Pentagon’s strategy to “offset” the Soviets’ decided advantages in manpower and materiel. The United States’ better and more plentiful chips underpinned the development and deployment of precision weapons, satellite navigation, and better command and control technologies throughout the 1980s.

While the Soviet Union was a military adversary, Japan was the United States’ main economic competitor during this same period. Again, the federal government, in response to entreaties from Silicon Valley leaders, came to the rescue. It bankrolled government-corporate partnerships like SEMATECH (founded in 1987) while DARPA, a research wing of the Pentagon, funded cutting-edge “Very Large Scale Integration” and artificial intelligence programs under the aegis of its Strategic Computing Initiative. Chip makers argued that semiconductors were a “strategic resource” like oil or steel for the Department of Defense. The federal government agreed. This “chips = oil” analogy is especially ironic given the rapid decline in oil prices during the same period. The process of how a commodity comes to be understood as “strategic” is something I wish Chip War might have explored more fully. What exactly was being said inside Pentagon briefing rooms or in corporate strategy meetings? Was the rhetoric of American executives motivated by a sense of patriotism and concern for national welfare? Or by something else?

The 1980s saw another milestone fundamental to Miller’s narrative. Previously, major semiconductor firms in the United States tended to design and manufacture their own chips. But, with the emergence of pay-to-play “foundries” like TSMC, these two activities increasingly became separate. Chips would be designed, for example, in Silicon Valley but then manufactured at much lower labor costs in South Korean or Taiwanese fabs. (It’s not for nothing that many Apple products today say “Designed by Apple in California. Assembled in China,” with the chips inside coming from TSMC.) The success of these overseas factories was initially celebrated by American companies like Intel because they were seen as counterweights to the “real” danger — Japan. This threat, as we know in hindsight, proved to be a mirage: Japan’s overheated economy started imploding in 1990, and soon neither it nor the Soviet Union was a critical player in the chip game. As Miller writes, “The Cold War was over; Silicon Valley had won.”

Globalization and 1990s-style neoliberalism followed. Again, we can see the hand of the state at work: the Clinton administration pushed a host of computer-related initiatives leading to Vice President Al Gore’s unfairly ridiculed claim that he and the Clintonites “had taken the initiative in creating the internet.” (Had Gore said, “in creating the commercial internet,” then he would have been right.) Information technologies were a central aspect of what has come to be called the “New Economy” of that era. As Clinton said in an April 2000 speech — even as the dot-com bubble was gassing out — “I believe the computer and the internet give us a chance to move more people out of poverty more quickly than at any time in all of human history.” Central to New Economy rhetoric was a close relationship between the Democratic establishment (“Atari Democrats”) and Silicon Valley firms. Even as Clinton was extolling the value of chip-based technologies, the Republican-controlled Congress was preparing to normalize trade relations with China, setting the stage for that country’s remarkable rise.

After the 1990s, Miller’s story increasingly shifts to the emergence of China as a technological superpower and existential risk to the United States and its economy. China made its first integrated circuit in 1965, but, until the 1980s, the country was hamstrung by retrograde economic policies and political ineptitude. As Miller’s narrative moves to the present day, the threat posed by China stands out as markedly different from the ones that had been posed by Japan or South Korea. That difference boils down to two factors. One is the proximity of mainland China to Taiwan and the fact that so much semiconductor manufacturing critical to today’s global economy happens on that relatively small island. The other is that while Japan and South Korea offered stiff economic competition to American companies, these countries depended on the protective umbrella that US military power provided.

Chip War, true to its title, concludes with speculations about various scenarios that could occur if an increasingly militaristic China decided to exert more control, direct or otherwise, over Taiwan. During the Cold War, the United States and Soviet Union built out their nuclear triads, with nuclear-capable bombers, land-based missiles, and submarines. Experts like Georgetown’s Ben Buchanan speak of a new triad today — data, algorithms, and computing power — based entirely around information technology and semiconductor chips. Whether in the form of blockades or an outright invasion, Chinese aggression would likely fail, Miller optimistically asserts. Chip making today is an extraordinarily complex activity, exquisitely and precariously dependent both on global supply chains and skilled workers with their tacit knowledge. Taiwan is a “choke point” when it comes to supplying semiconductors, which means that any disruptions due to military activity (or an earthquake) would be economically catastrophic for all parties. But ASML, the little-known company in the Netherlands that provides tools critical for printing chips, is another choke point. Without ASML’s fabulously complex and expensive lithography machines — each has over 450,000 parts — Taiwan’s semiconductor fabs, even if occupied by mainland China, would slowly stop working. As Taiwan’s President Tsai Ing-wen recently wrote, her island’s fabs stand as a “silicon shield” against Chinese aggression. It’s chips all the way down.

In some ways, it’s a shame that Chip War appeared in late 2022. On August 9, President Biden signed the Creating Helpful Incentives to Produce Semiconductors Act. The CHIPS legislation (yes, those bureaucrats are clever with the acronyms) allocated $39 billion over five years to encourage semiconductor production in the US. Another $13 billion was for new research programs, including a DoD-managed National Network for Microelectronics R&D. Given how the US government helped create and support the electronics industry in the first place, one might say “plus ça change, plus c'est la même chose.” But one might also wonder how Miller, who credits Trump for pushing back against Chinese companies like Huawei, would explain the large number of Trump-aligned politicians in Congress who voted against CHIPS. (The Senate passed the CHIPS and Science Act 64-33 on July 27, 2022; the House followed the next day, 243-187.) At least during the Cold War, both political parties agreed that the Soviets were the common overseas enemy.


Although it certainly wasn’t Miller’s intent, Chip War counters one of the persistent interpretations of how the “computer revolution,” to use that hackneyed phrase from the 1970s, happened. For some scholars, the personal computer (and, later, the internet) represented a form of liberation, perhaps even political revolution. It was called personal computing, after all. These narratives give an outsized role to hobbyist groups, hackers, and hippies. Chip War includes none of these actors. The emergence of the personal computer, the smartphone, and the hundreds of thousands of other chip-containing consumer goods (and military weapons) is, for Miller, fundamentally about global business and the pursuit of technological superiority on a geopolitical scale. Businessmen (yes, men) and not beatniks are the primary actors here.

Given the vast terrain surveyed by Chip War, there are bound to be some blind spots. The book doesn’t mention the environmental effects of making electronics. While it may be seen as the birthplace of “modern innovation,” with its much-fetishized garage-based startups and unicorns, Silicon Valley — a place once billed as “the Valley of Heart’s Delight” — is now home to nearly two dozen Superfund sites created by the negligence of now-defunct electronics companies. The fabrication of chips consumes vast amounts of water, and fabs in the United States are often located for political reasons in places like Arizona and Texas. One wonders how these factories will secure the resources they need to function, and at what cost. Perhaps water is the new oil? While issues like these are obviously not the focus of Miller’s book, they bear mentioning as possible factors in the choice of where fabs are built, especially as access to abundant, clean water becomes a geopolitical issue.

The world of Chip War is also almost entirely devoid of women. The key exception is the chapter “Transistor Girls,” which briefly describes the largely female workforce that assembled chips in the United States in the 1960s. In Silicon Valley, unions were weak or nonexistent, and executives were committed to maintaining this state of affairs. Women were hired because they worked for less money and were unlikely to demand more. As Louis Hyman stated in his book Temp (2018), “To understand the electronics industry is simple: every time someone says ‘robot,’ simply picture a woman of color.” As their businesses expanded, executives sought cheap labor outside of the Bay Area, looking, for example, to Indigenous workers in the Navajo Nation. It makes sense that this managerial greed would filter down to employees. Miller recounts what one Fairchild employee wrote on his exit survey when leaving the company: “I…WANT…TO…GET…RICH.” Eventually, the covetous corporate gaze turned overseas. Seen in this way, the troubles that now bedevil American electronics companies might be viewed as a legacy of their own greed-fueled habits of thought.

Computer engineer Lynn Conway, who helped pioneer innovative chip design efforts in the 1970s and ’80s, once noted that the American semiconductor industry displayed a “famously macho disdain of women.” I thought of Conway’s observation when I read Miller’s account of Jerry Sanders, the ambitious, “Rolex-clad, Rolls Royce–driving” engineer-turned-entrepreneur who founded Advanced Micro Devices (AMD) in 1969. As design and manufacture were increasingly becoming separate activities with the latter shipped overseas, Sanders started warning colleagues that “real men have fabs.” Given increasing competition from China and a dependence on overseas factories to produce a critical and strategic resource, perhaps Sanders’s statement — despite the obvious macho dick-swinging — hits at a vital truth. Let’s put it this way: we live in a world where semiconductors and integrated circuits are not just technological devices. Their production, centered in a few key geopolitical spots, turns them into outright bargaining chips.


W. Patrick McCray is a professor of history at UC Santa Barbara and the author of five books on the history of modern science and technology including Making Art Work: How Cold War Engineers and Artists Forged a New Creative Culture, which appeared from MIT Press in 2020. McCray is an elected fellow of the American Association for the Advancement of Science.