THE MACHINES ARE COMING, but when? And how, exactly? In Daniel Suarez’s 2006 novel Daemon, a dying computer programmer creates a set of programs that start up after his death: one reads news headlines and triggers events based on them. One uses text-to-speech to give news scoops to a recently fired journalist. Another recruits elite hackers through an online game, and it pays them from the dead programmer’s ample bank account. Still another program hacks into private prisons and transfers, then releases, individuals who agree to work as its henchmen. Daemon is not just a novel about the perfect crime, but one that tries to imagine just how powerful a computer program could become, especially if it were able to recruit the help of willing individuals. A decade later, Ed Finn’s What Algorithms Want shows us just how powerful computer programs and their helpers actually are, and the book needs no recourse to science fiction.

Consider, as Finn points out, that Google is at bottom a multibillion-dollar wrapper for its search algorithm, and that algorithms are at the center of many other high-value tech companies, including Facebook, Uber, and Netflix. Employees are there to tune the algorithms, to analyze the vast stores of user data they provide, and to design new ventures, with new algorithms to match. What Algorithms Want is about this system, what Finn calls a “culture machine”: “complex assemblages of abstractions, processes, and people,” and, later, a “feedback loop between algorithmic and human actors.” These “culture machines” are already the filters through which entertainment, ads, news, and even general knowledge reach us, and Finn promises to help us understand both why and how they’re important.

We’ve heard a lot recently about advances in machines and computers in terms of what and whom they’ll replace: the usual suspects, like industrial farming, self-driving cars, and the office software suites that have already phased out a great deal of secretarial labor. I sometimes wonder, as a knowledge worker whose job is to train a future generation of knowledge workers, what aspects of my job, and my students’, will be “safe,” or machine-proof. What kinds of information processing, and what kinds of creativity, will still need humans in 50 years? In 75? What Algorithms Want helps us to see just how thoroughly creativity has already been transformed by algorithms.

Finn, the founding director of the Center for Science and the Imagination at Arizona State University, is concerned here with how algorithms are changing culture, information networks, and questions of value. What Algorithms Want doesn’t linger on how machines will take our jobs, nor is it a book about how we might one day upload our brains into computers. It’s a book about how a handful of algorithms have already become the most important new development in media technology in the present decade. Marshall McLuhan once observed that different media are like extensions of our senses — in the way that movies extend our vision, and diaries extend our memories. Following that logic, Finn wants to show us just how thoroughly these algorithms are extending and transforming the ways that we think, from the information we see to our relationships with each other.

What Algorithms Want describes a handful of the algorithms that matter most for 21st-century culture, and for each, it traces the circuit between users, algorithms, programmers, and the values and ideas that animate the whole system. It’s this larger frame for understanding these “culture machines” that makes Finn’s book so insightful about the roles of algorithms. In an analogy Finn has made elsewhere, algorithms are more like Kirk than Spock. We often imagine that because computer programs handle things using computation, they’re logical and even impartial. But if we look at the larger system algorithms are part of, we see that algorithms themselves are records of countless human assumptions, first, and then of exceptions to the rules, tweaks, changes, and human decision-making that take place along the way. The program itself, this record of exceptions and tweaks, will resemble Kirk’s captain’s log of difficult decisions, rather than any of the rulebooks Spock might quote from. The “algorithmic object of study,” Finn writes, is less a single program than a “system in motion, a system of iterations,” and it’s certainly not just “the surface material it presents to the world (e.g., the items appearing at the top of one’s Facebook feed).” It’s much bigger than that.

Algorithms, Finn explains, are computational attempts to model real-world problems and thus make them “effectively computable.” Maybe it’s matching stores of data to questions in ordinary language (Google search, Siri), or maybe it’s matching people to rides (Uber, Lyft). Programmers figure out how to parse queries or how to match drivers and clients efficiently, turning unpredictable possibilities into something that usually works well. These programmers seem like the creative geniuses of the enterprise, finding clever ways to improve the models of the systems they work on, to “close the gap” between the model and reality. They’re people like the Google Maps engineers who figured out that, in addition to adding existing traffic reports to the maps, they could measure the location and speed of users checking Google Maps, thus improving on those traffic reports.

For Finn, the story of Netflix follows this pattern, and it gives one sense of how algorithms are intervening directly in our nightly viewing and end-of-vacation binges. Back when Netflix was a mail service, you might remember that it was interested in recommending movies, ostensibly so customers wouldn’t run out of things to rent. It would ask you to rate five movies you’d seen before revealing its next recommendation, thus building a database of customer preferences. When streaming became a major part of the business, though, the company found it had a great deal more data to work with, data that was more detailed and reliable than star ratings. At what times of the day do you watch certain programs? What do you binge-watch, and for how long? How long do you watch a program or movie before deciding it’s not interesting to you? Netflix pairs this huge amount of data with the work of human “taggers,” who identify significant features of viewers and programs, including Netflix’s now-infamous micro-genres: suspenseful foreign films with strong female leads, coming-of-age animal tales, et cetera. Netflix had by that point built a recommendation engine that was good for more than just filling queues and encouraging usage. As Finn describes it, the production team now uses this extraordinary amount of viewer data to make decisions about what kinds of shows to market and produce as Netflix Originals. In the example Finn reads in depth, the department decided that a hit could be made from a dark political thriller with strong female leads: a remake of a well-liked BBC show, with artsy cinematography, released a full season at a time, and starring — you guessed it — Kevin Spacey.
Finn makes a strong case that it’s really impossible to fully understand a show like House of Cards without considering how this circuit between Netflix customers, the collection and algorithmic processing of their viewing behavior, human tagging, and marketing shaped it into the show that it is, cast and shot as it is. This phenomenon is neither essentially good nor bad, from Finn’s point of view, but it’s one example among many of the determining role algorithms can play in the industry. Most significant writing and media, from books to television to magazines to news sites, comes into contact with algorithmic platforms of one sort or another, whether it’s an author who reaches out to readers on her Twitter profile, a reader seeing book recommendations on Amazon, or a website churning out clickbait headlines.

While Finn devotes one chapter to considering Netflix and similar transformations in culture, other chapters turn to different transformations in our values related to programs like Siri, Bitcoin, high-frequency trading algorithms on Wall Street, and Ian Bogost’s FarmVille parody, Cow Clicker. In a particularly original take on gamification, for instance, Finn describes how algorithmic abstractions often give interpersonal relationships the features of computer games. From Facebook interactions and social games like FarmVille to the frictionless transactions of Uber or Amazon’s Mechanical Turk labor service, algorithmic platforms manage and delimit the relationships we carry out on them, through interfaces modeled after computer games. Uber’s advertising made a point of presenting drivers and riders as clients on the same footing, and Mechanical Turk abstracts and anonymizes the labor agreements for the often tedious tasks it assigns. Particularly in the sharing economy, the game and the company are responsible for generating trust between players. The companies of the sharing economy guarantee that we can trust our Airbnb host, but Finn’s concern is that in such a gamified sharing economy, there’s a risk that, in a broader sense, “the imaginative work of empathy has been outsourced to the algorithm.”

Another chapter, on Siri, Alexa, and their predominantly female cousins, takes up our desire for a computer that has both all the knowledge in the world and the ability to answer our questions. The fantasy of collecting all knowledge dates back to Diderot’s Enlightenment project of the Encyclopédie, and it continues through projects like Wikipedia and even the searchable internet itself. We’re closer than ever to having fingertip command of all the information that Diderot’s ideal might have contained, but a vast quantity of noise comes along with it. Finn also takes up the human side of this relationship, considering how Siri has the task of figuring out what people want to know and, at an even more fundamental level, what people want. Spike Jonze’s 2013 movie Her takes on new meanings in Finn’s analysis, where Joaquin Phoenix’s character lives out the fantasy of owning a computer that’s a universal encyclopedia, personal assistant, and even, for a short time, a lover. The perfect Siri or Star Trek computer is an impossible fantasy, at the same time that the good-enough Google search box has changed how we access, remember, and think about information.

And it’s in the shaping of our access and exposure to information that algorithms seem to be most politically consequential. In 2016, far-right partisan bots swarmed through both the Brexit and Trump Twitter conversations, and after the US election Facebook tweaked its algorithm to curb the rampant spread of false infographics and conspiracy theories. As Finn points out, Nick Denton, the founder of Gawker, wrote in 2015 that his enterprise was ultimately a “slave to the Facebook algorithm.” In Finn’s words, the business models of many blogs and aggregation outlets encourage “writing first for the algorithms that will choose whether or not to disseminate their work on Facebook, and only second for the readers who will then read and share it with others.” For Finn, it’s essential that we raise our awareness, as both producers and consumers of information, of this “path dependency on the black boxes at Facebook, Google, Apple, and other tech companies.”

What Algorithms Want, then, ultimately describes a major transformation of the public sphere, an “algorithmic sphere” that “obscures certain operations while making others more visible,” based on the desires and assumptions of programmers, on the one hand, and on the ways that users adopt or game those systems, on the other. The book ends with a welcome call for humanities scholarship to make tactical, exploratory, and creative use of algorithmic technologies, though to my mind the broadest conclusion comes a few pages earlier. It comes as Finn describes participation in crowdfunding platforms — a humble but substantial part of this algorithmic public sphere — as a way for individuals to raise money and awareness in an era when super PACs and activist billionaires enjoy outsized political influence. Finn describes how “crowdfunding rewards those who master the methods of privatized publicity: a strong introductory video, frequent updates, tiered reward structures, and effective use of social media to raise awareness.” These might seem trivial, but they’re examples of new and possibly unfamiliar skills for a new public sphere, a media ecology that amplifies different strengths and weaknesses than print- or television-based conversations did. It’s a media ecology that can privilege the savvy troll over the clueless expert, but savvy experts are coming along, too, particularly as they hone these skills. In this way, Finn’s approach to analyzing “culture machines” joins other welcome calls for new kinds of digital media literacy, both in schools and outside of them. While it’s tempting to imagine algorithms as either impartial bystanders or Daemon-like monsters that might take on lives of their own, seeing them as bigger systems helps us to advocate for more just arrangements and better information. What Algorithms Want explains how we all have a stake in shaping the algorithms that have, for all intents and purposes, already taken over.


Scott Selisker is assistant professor of English at the University of Arizona and the author of Human Programming: Brainwashing, Automatons, and American Freedom (University of Minnesota Press, 2016). He writes and teaches courses on science and technology in US literature and culture.