IN THE 1960s, the Simulmatics Corporation sold a big idea: whirling IBM computers (the kind James Bond villains had) churning through your personal data in order to predict and influence your decision-making.

Simulmatics did several rather consequential things. It helped propel JFK to the presidency. It modernized the New York Times’s election reporting. It modeled the social effects of pandemics for the Department of Health. It launched a board game called Ghetto for teaching high school students about social advancement. It advised state police forces on predicting riots. It helped blue-chip companies sell more dog food, coffee, and cigarettes. And it developed the US Army’s counterinsurgency operations in Vietnam.

But then the company flopped and disappeared. You’ve never heard of Simulmatics. The name sounds like an awful line of protein milkshakes, or a one-room cryogenics firm hidden away in a suburban shopping center. Think again.

Jill Lepore, the award-winning Harvard historian and New Yorker staff writer, retraces its incredible rise and fall in If Then: How the Simulmatics Corporation Invented the Future. She tells the story of how data science went digital in the Cold War, turning research in behavioral psychology into big business. If Then reads as a kind of prehistory to Cambridge Analytica, the private consultancy that imploded in 2018 after revelations of its dubious use of Facebook data to elect Trump and win Brexit. Exhuming Simulmatics from the dustbin of history also recasts our own strange moment as a mystery story: Why did the company that “invented the future” fail? And why did we forget it ever existed?


Lepore stumbled across Simulmatics in the papers of its co-founder and intellectual guru, Ithiel de Sola Pool (1917–1984). Pool was a brilliant polymath, a high-flying political scientist at MIT equally comfortable chatting with early developers of the internet or with the Secretary of State. Saul Bellow, a friend from college days in Chicago, used Pool as the model for a character in his novella A Theft: Pool was “like one of those Minoans dug up by Evans or Schliemann,” or characters “in the silent films, painted with eye-lengthening makeup.”

[H]e would speak about Keynes’s sketches of Clemenceau, Lloyd George and Woodrow Wilson. If he wanted, he could do with Nixon, Johnson, Kennedy, or Kissinger, with the Shah or de Gaulle, what Keynes had done with the Allies at Versailles. […] Ithiel could be the Gibbon or the Tacitus of the American Empire.

Charming, and confident he could write history by making it, Pool was part of the postwar “behavioral revolution.” In the years after 1945, many social scientists believed that human nature was irrational but could nonetheless be “managed” if its fundamental laws were revealed. This intellectual community was a surprisingly small and cliquey world. Almost all its major players are in Lepore’s book, knew each other personally, and passed through Stanford’s Center for Advanced Study in the Behavioral Sciences residency program. The community was in fact so small that it inspired Pool and an Austrian colleague to write a paper in the late 1950s on what they called the “small-world phenomenon” — we know it as Six Degrees of Separation — which quietly kickstarted the now-enormous field of Social Network Analysis, and the “science of a connected age.”

Simulmatics’s first project in 1960 was the People Machine, a system for converting demographic data into voter predictions for John F. Kennedy’s surprisingly close election against Nixon. Pool christened the People Machine a “kind of Manhattan Project gamble in politics.” He lured prestigious colleagues with bigger salaries and, even more alluringly, with the promise of scaling their experiments far beyond the lab. In an unguarded moment, Pool excitedly exclaimed to historian Fritz Stern that the Vietnam War “is the greatest social science laboratory we have ever had!”

The Manhattan Project imagery stuck. Harold Lasswell, doyen of communications theory, called the People Machine the “A-bomb of the social sciences” (he meant it positively), as did writer Mary McCarthy (she didn’t). Of the behaviorists’ increasing role in government policy, McCarthy presciently wondered: “[C]onceivably you can outlaw the Bomb, but what about the Brain?” But Simulmatics didn’t turn out to have an A-bomb. Not yet. The punch cards recording the exact models it used were destroyed, but in general terms the firm did three things: it conducted interviews; organized that data, along with secondhand data, into social segments; and then ran those segments through chains of probabilistic “If/Then” rules, coded in FORTRAN. Lepore takes her title from this last stage.
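The basic mechanics can be illustrated with a toy sketch. This is emphatically not Simulmatics’s actual model (Lepore notes the punch cards were destroyed): the segments, weights, baseline figures, and rules below are invented for illustration, loosely in the spirit of the firm’s 1960 work on the “religion issue” for Kennedy.

```python
# Toy "People Machine": group survey data into voter segments, then
# apply conditional If/Then rules to predict how stressing a given
# campaign issue shifts each segment's support. All numbers invented.

# Hypothetical segments: (name, share of electorate, baseline support)
segments = [
    ("urban_catholic",      0.12, 0.55),
    ("southern_protestant", 0.20, 0.40),
    ("northern_black",      0.08, 0.60),
]

def apply_rule(name, support, issue):
    """If a segment matches a condition, then adjust its support."""
    if issue == "religion" and "catholic" in name:
        return support + 0.10   # rallies behind a Catholic candidate
    if issue == "religion" and "protestant" in name:
        return support - 0.05   # anti-Catholic backlash
    if issue == "civil_rights" and "black" in name:
        return support + 0.15
    return support              # no rule fires: support unchanged

def predicted_share(issue):
    """Weighted support across segments if the campaign stresses `issue`."""
    total = sum(weight * apply_rule(name, base, issue)
                for name, weight, base in segments)
    return total / sum(weight for _, weight, _ in segments)

print(round(predicted_share("religion"), 3))
print(round(predicted_share("civil_rights"), 3))
```

Crude as it is, the sketch captures why the approach sold: feed in different “If” scenarios and the machine ranks campaign strategies by their predicted “Then.”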

By the mid-1960s, Simulmatics already had numerous competitors, which was one reason why it flopped. What set it apart from the crowd was its aggressive underbidding: promising services it could not deliver with resources it did not have. Lepore details how it filled key positions with undergraduates on internships or earning the minimum wage, and how it descended into a chaos of faked data and perpetual tardiness. Its big clients began to cotton on. One colonel, overseeing a large Simulmatics contract to win Vietnamese hearts and minds, grimaced that it was “composed of nothing but Sandy Koufaxes, all pitchers.”

Lepore argues that Pool’s small world also had an aggressively macho ethos in the manner of Mad Men. Its sloppy and toxic work environment was ripe for male hubris, which was then hardwired, she says, into the models of society Simulmatics used to “predict the future.” “[T]hey did not consider a female understanding of human behavior to be knowledge.” The problem persists. After all, much of AI’s nursery material was the violent 20th century. Lepore also interviews the children and partners of Simulmatics staff to reconstruct their roles as guinea pigs and unpaid coders, aspects of Cold War science and technology that are now receiving more attention.


In November 2017, Britain’s Channel 4 News held a series of secretly filmed meetings with Cambridge Analytica. Having one of their team pose as a potential client desperate to win election by any means possible in Sri Lanka, the journalists were able to capture the company’s entire sales pitch: “The two fundamental human drivers when it comes to taking information on board effectively are hopes and fears. And many of those are unspoken and even unconscious. […] Our job is to drop the bucket further down the well than anybody else.” The meetings are fascinating, and you can watch them on YouTube.

Michal Kosinski designed Cambridge Analytica’s famous model, which used Facebook likes to predict human behavior. The Polish computational social psychologist had been working at the University of Cambridge Psychometrics Centre when he was introduced to Cambridge Analytica by a senior colleague. Now a Stanford professor and advocate for digital privacy, Kosinski told me via email that he had never heard of Simulmatics before. But he was not surprised. Cambridge Analytica’s techniques had been around a long time, he said. I gave him a copy of Lepore’s book.

Throughout If Then, Lepore attacks the vast authority claimed by the behavioral sciences. In this regard, she echoes a crisis of confidence within the discipline itself. Researchers are struggling to replicate results, yet they continue to perpetuate “structural priorities that stress confirmation” while deeming “failures uninformative and unworthy,” according to leading social psychologist Michael Inzlicht. This critique is a useful backdrop to the Simulmatics story, though Lepore has an odd habit of lumping behaviorists together with technologists. She often construes “poetry and paintings and philosophy” and “numbers and graphs and simulations” as mutually exclusive worldviews. But some programmers do love Dante and are also skeptical of the behavioral gospel.

Simulmatics and Cambridge Analytica are only briefly contrasted in If Then, perhaps in part because they seem so hauntingly similar. Analytica, Lepore writes, was “faster, better, fancier, pricier,” but had “the same hucksterism.” The climax of Analytica’s sales pitch in 2017 cackles through her book: “It’s no good fighting an election campaign on the facts. Because actually it’s all about emotion.”

Across the political spectrum, our emotions blaze with fear and indignation. In Cambridge Analytica’s portfolio, there are tools for exploiting that emotion. Social media companies can turn emotion into lucre. Angry people click. And algorithms make sure we only speak to people with whom we already agree. Reading Lepore’s story of the origins of computational data mining for micro-targeting and messaging offers an excellent starting point for thinking about how we are socialized to fit the clickbait model.


The Cold War, Lepore writes, “altered the history of knowledge by distorting the aims and ends of American universities.” It also provided them with lots of money. MIT, where Pool taught, and where the early internet was partially designed, received over half its entire budget from the Department of Defense in 1968. Universities continue to be important hubs driving the future of data science, not just because they have lots of smart people and a deep reservoir of cheap labor in the form of student research assistants, but also because they work to maintain an aura of impartiality and ethics. Most private companies try to project that aura as well.

Christopher Wylie recalls how Cambridge Analytica got its name and most famous client. Steve Bannon was looking for an academic, “ideas-focused” consultancy that would model big data. So Wylie was told to set up a fake office near the University of Cambridge solely for the purpose of Bannon’s visits. It was unofficially called Potemkin, after the sham villages reportedly built to impress Catherine the Great on her visits across Crimea in the late 1700s.

Lepore shows how hard Pool worked to sell ideas we now take for granted, including those surrounding data. Once upon a time, no one outside of laboratories talked about data. Selling the future took a lot of work. New metaphors had to be invented that sounded emancipatory and reduced complexity: the “cloud,” your “social network.” Because Pool’s ideas were far ahead of the computing power that existed in the 1960s, Simulmatics rarely delivered tangible results, which was yet another reason why it flopped. Pool had to construct his own Potemkin, a technological black box that tailored the promise of predictive modeling to the dreams and desires of his diverse clients.

Lepore’s book reminds us that there are no inevitable uptakes or applications of new digital technologies. Cambridge Analytica’s public mea culpa in 2018 was to claim that profiling technologies were a “Pandora’s box,” only recently opened. If Then not only shows that the box was already opened 50 years ago, but that this metaphor glosses over all the little choices made by important people, and by you and me, regarding what data means in our lives.


In 1969, Simulmatics was bankrupt. Ithiel Pool launched one last bid to win a big defense contract. He pitched Project Cambridge, a vast “data library” run by MIT and Harvard that would “develop better computer methods for behavioral science.” But Simulmatics had become tainted by its involvement in the Vietnam War. Lepore vividly captures the student protests against the Project that ricocheted through Boston’s quads that autumn. On his way to the jailhouse, an adolescent future Fox anchor reported live for his student radio station — “This is Chris Wallace in custody.” Harvard social scientists like Barrington Moore did not miss an opportunity to kick their behavioral rivals while they were down. Their “illusion of technical omnipotence” had failed.

Except it hadn’t. Harvard, MIT, and in fact every other STEM-minded university today would scramble to secure a similar data-processing center. Technologies influenced by behavioral social science are purring away beneath the screen you’re reading this on. But the Simulmatics Corporation did disappear, leaving no trace in our cultural or political discourse.

Lepore can’t fully account for this vanishing act. She rightly points to a culture of forgetting in Silicon Valley, where the “meaninglessness of the past and the uselessness of history became articles of faith.” But I think this forgetting is the tip of the iceberg. We forgot Simulmatics because we didn’t have a good story to tell about it. It was too messy. The general public has shown a limited appetite for exhuming big data’s Cold War antecedents. We watch Aaron Sorkin’s Facebook film and that’s as far back as we go, though the road to Zuckerberg’s Cambridge dorm room, and on to Cambridge Analytica, is paved with important start-ups, personalities, and experiments spanning decades. We need more captivating books like If Then. And we need more historians willing to meticulously research and creatively write them for audiences beyond academia.


Alex Langstaff is a PhD candidate in history at New York University and a lecturer at Cooper Union.