Neoliberal Tools (and Archives): A Political History of Digital Humanities

Advocates position Digital Humanities as a corrective to the “traditional” and outmoded approaches to literary study that supposedly plague English departments. Like much of the rhetoric surrounding Silicon Valley today, this discourse sees technological innovation as an end in itself and equates the development of disruptive business models with political progress. Yet despite the aggressive promotion of Digital Humanities as a radical insurgency, its institutional success has for the most part involved the displacement of politically progressive humanities scholarship and activism in favor of the manufacture of digital tools and archives. Advocates characterize the development of such tools as revolutionary and claim that other literary scholars fail to see their political import due to fear or ignorance of technology. But the unparalleled level of material support that Digital Humanities has received suggests that its most significant contribution to academic politics may lie in its (perhaps unintentional) facilitation of the neoliberal takeover of the university.

Neoliberal policies and institutions value academic work that produces findings immediately usable by industry and that produces graduates trained for the current requirements of the commercial workplace. In pursuit of these goals, the 21st-century university has restructured itself on the model of the corporate world, paying consultants lavish fees, employing miserably paid casual laborers, and constructing a vast new apparatus of bureaucratic control. The humanities are, in their traditional form, less amenable to such restructuring than other disciplines, relying on painstaking individual scholarship and producing forms of knowledge with less immediate economic application. By providing a model for humanities teaching and research that appears to overcome these perceived limitations, Digital Humanities has played a leading role in the corporatist restructuring of the humanities.

What Digital Humanities is not about, despite its explicit claims, is the use of digital or quantitative methodologies to answer research questions in the humanities. It is, instead, about the promotion of project-based learning and lab-based research over reading and writing, the rebranding of insecure campus employment as an empowering “alt-ac” career choice, and the redefinition of technical expertise as a form (indeed, the superior form) of humanist knowledge. This is why Digital Humanities is pushed far more strongly by university administrators than it is by scholars and students, who increasingly find themselves pressured to redirect their work toward Digital Humanities. In what follows, we focus on the ways that Digital Humanities scholarship has been imagined and justified by key figures, and on how the rationale for Digital Humanities has complemented and supported the transformation of higher education. While many will be able to say, with some justification, “But that’s not my Digital Humanities!” what we discuss here is the Digital Humanities that is helping to transform the academy, because this is the Digital Humanities that has proved itself so useful to university administrators and to funding bodies.

Digital Humanities, Textual Studies, and Interpretation

To understand the politics of the Digital Humanities, it is necessary to understand the context from which it emerged. One crucial point of origin, rarely remarked in discussions of the subject, is in the literary studies subfield known as “textual studies.” This subfield has two broadly defined forms. In one approach, usually known as “book history,” scholars study the material history of texts, the people who have made and read them, and the meanings ascribed to them. As with other forms of social history, this is an interpretative activity that can be carried out to affirm received ideas about the world or to challenge them. A second approach to textual studies, usually known as “textual criticism” or “textual scholarship,” concentrates on the production of new editions of old texts. While this can be done in such a way as to challenge received ideas, a more typical approach is to produce an “authoritative” edition supposed to embody the value of the original work. In the textual scholarship approach known as the New Bibliography, that “authority” was equated with authorial intention, so New Bibliographers sought to give literary texts the forms their authors apparently meant them to have.

One of the most important centers for the New Bibliography was the English Department of the University of Virginia. The department was the birthplace, as we show below, of the Digital Humanities movement, and its reputation as a safe harbor for conservative intellectuals is key to understanding the story we tell. Fredson Bowers, a leading light of the New Bibliography, was for several decades the Chair of English at the University of Virginia. Some sense of the intellectual climate of that department can be gleaned from a 2009 essay entitled “When Is Diversity Not Diversity,” published by Bowers’s conservative colleague Paul Cantor in an American Enterprise Institute volume revealingly entitled The Politically Correct University:

Under Bowers’s leadership, Virginia became a center of scholarly editing and bibliography at a time when much of the profession had become focused on the theory and practice of interpreting texts, rather than the process of how their exact wording should be determined. But Bowers was a pluralist and a pragmatist, and in trying to build the best English department he could, he brought in professors of all stripes. One of his key hires in the 1960s was E. D. Hirsch, who was worlds apart from Bowers in many respects, but whose hermeneutic theories also emphasized the importance of authorial intention (in contrast to the New Criticism, which rejected what it called the “intentional fallacy”). Informed by Hirsch’s common sense and empiricist spirit, Virginia became known as a bastion of resistance to the abstractness of French literary theory.


E. D. Hirsch’s most influential book, Validity in Interpretation (1967), was published at the height of the civil rights and countercultural movements. Although it contains no reference to contemporary politics, it was written with the aim of restoring decorum to literary studies by limiting its object to what Hirsch called “the re-cognition of what an author meant,” an idea that gave philosophical justification to the methodology of the New Bibliography. Hirsch claims not to be a political conservative, but his scholarly arguments were enthusiastically received by those who wanted to reverse the politicization of literary studies. Indeed, his book was a touchstone for those who opposed socially engaged literary study. Hirsch’s work demanded that the critic stop after recovering the author’s intention. Cantor paints Hirsch and Bowers as representing a “diversity” that trumps gender, racial, and ethnic diversity, but to scholars who champion those values, they look like two sides of the same coin.

Hirsch’s work entered the national conversation in 1987, when he published a popular work, Cultural Literacy: What Every American Needs to Know. At the time, literary scholars were arguing not only that the canon maintained a biased cultural status quo, but also that the very idea of canonicity was conservative. Hirsch’s book was therefore welcomed by cultural conservatives who were fighting the critique of the canon, as well as by conservative textual scholars, whose work had always been focused on the production of editions of canonical works. Cultural Literacy claimed that it is possible to identify a “core” of key works that “every American must know.” The book and the project, along with a significant series of follow-on works by Hirsch and others, were celebrated by conservatives, including former Education Secretary William Bennett, who characterized Hirsch’s work as a rebuke to the “multiculturalism” that he claimed was ruining American education. Hirsch’s argument was foundational for the Common Core educational program favored by the political right.

The idea that interpretation is inherently progressive is questionable, but conservative commentators nonetheless have converged on the view that readers shouldn’t interpret and evaluate texts for themselves, especially with regard to politics. At its foundations, political interpretation historicizes (“relativizes”) claims to superior cultural status, especially for the favored texts and artifacts of privileged groups. Hirsch’s own puzzlement at the left’s rejection of his work — even after minor concessions to cultural representation for women and minorities — is echoed in the aggressive incomprehension with which many Digital Humanists respond to the argument that their work supports the rise of the neoliberal university. It is telling that Digital Humanities, like Hirsch, and like Bowers, has found an institutional home at the University of Virginia. We argue that, like Hirsch’s tightly constrained approach to literary criticism, and like Bowers’s similarly constrained approach to textual scholarship, Digital Humanities has often tended to be anti-interpretive, especially when interpretation is understood as a political activity. Digital Humanities instead aims to archive materials, produce data, and develop software, while bracketing off the work of interpretation to a later moment or leaving it to other scholars — or abandoning it altogether for those who argue that we ought to become “postcritical.”

Computing in the Humanities, Computing as the Humanities

Computer use in the humanities of course predates the formal movement that calls itself Digital Humanities. The trailblazer is usually identified as a Jesuit priest, Roberto Busa, whose 56-volume concordance to the works of St. Thomas Aquinas was produced over a period of three decades from 1949, with support from IBM. In the early 1980s, computing became centrally important to the discipline of lexicography, thanks to work done by academic linguists at the University of Birmingham and professional lexicographers at Oxford University Press. The former employed statistical analysis of large volumes of text to study contemporary word usage, contributing to dictionaries and English language teaching materials published by Collins, the financial backer for much of their work. The latter continued to produce the sort of historical scholarship for which they had long been known, but were able to use new computer technology to store, edit, and typeset their work in producing what would become the 20-volume second edition of the Oxford English Dictionary.

Such work required tremendous amounts of primary scholarship. But it also required new forms of technical infrastructure. In many institutions, this was provided on an ad hoc basis, but some universities began providing generalized Humanities Computing services. These were mostly run not by full-time professors but by staff members from the library and elsewhere. Often enough, this support was understood very literally: Humanities Computing professionals were frequently called upon to help professors get their computers to work. They were also called upon to build or to help build digital projects (especially archives) for which professors did not have technical expertise. It is understandable that Humanities Computing specialists might have seen themselves as second-class citizens, especially in the North American university, with its academic caste system that divides workers into “faculty” and “staff.” North American universities are organized around the research generated, or at least directed, by faculty. Faculty are immediately rewarded for those kinds of research activities, while staff tend to be rewarded in less direct ways, if at all.

The academic need for Humanities Computing resources was growing all the time, and this was especially true at the University of Virginia. In 1986, when Hirsch was writing Cultural Literacy, his department hired the scholar who would do more than any other to define Digital Humanities: Jerome McGann. McGann, one of the leading US scholars of British Romanticism, had sought to bring textual scholarship into the academic mainstream, not by demanding that literary studies adopt the textual scholar’s obsession with authorial intention, but by demanding that textual scholarship adopt literary theory’s philosophical sophistication — and indeed, take that sophistication further, foregrounding the instability not only of meanings but of physical texts themselves. McGann situated his approach within the New Historicism, a form of cultural studies that rose to prominence in the 1980s.

McGann’s encyclopedic approach to textual editing took paper-based scholarship to its limit, leading him eventually to seize on the potential of computer technology to extend his work. In the 1990s, he began to construct a total online edition of the poet and painter Dante Gabriel Rossetti: an edition containing facsimiles and transcripts of every manuscript, every sketch, every painting, and every publication — in short, of every single version of every single Rossetti work. It was a project that caught the imagination of many people in textual studies. Textual scholarship has a tendency to fetishize the historical archive as the guarantor of knowledge. Here, computers appeared to be facilitating the construction of the most complete and integrated archive that the literary world had yet seen. And it certainly didn’t hurt that the object of all this effort was the canonical oeuvre of a dead white man.

For the initial “seven, eight, or even ten years” of the Rossetti project, McGann was by his own admission narrowly focused on technical issues. This focus would provide a new way to argue for the relevance of textual scholarship, rebranding it as “the future” on the grounds that it required technical skills that were outside the remit of literary training as it then existed. Such a project could only be carried out with extensive support from Humanities Computing. It therefore seems appropriate that the University of Virginia should have been the location in which Humanities Computing practitioners made a tactical decision. Rather than accommodate themselves to the requirements of the humanities research and teaching system, they would — in the style of Silicon Valley “disruptors” — attempt to force the system to accommodate them.

Thus, Digital Humanities was born from disdain and at times outright contempt, not just for humanities scholarship, but for the standards, procedures, and claims of leading literary scholars. Those scholars had told the Humanities Computing specialists, even if only implicitly, that their work didn’t count as scholarship. Now, it was time to prove them wrong. The goal was not merely to show what technical expertise could bring to humanities research. Rather, it was to redefine what had formerly been classified as support functions for the humanities as the very model of good humanities scholarship. This agenda crystallized at two events held at and coordinated by the University of Virginia English Department, the “Is Humanities Computing an Academic Discipline?” conference in 1999 and the publicly funded “Digital Humanities Curriculum Seminar” that ran from 2001 to 2002. These events formally rebranded Humanities Computing as Digital Humanities and established textual scholarship as the discipline’s core concern. When we speak of Digital Humanities, we acknowledge the significance of the institutional shift that occurred at these events. The events themselves did little to change the nature of the research projects to be undertaken in the humanities, but did much to shape Digital Humanities as a social movement in the contemporary academy, especially in North American departments of English.

The political affinity between Humanities Computing and textual scholarship has already been noted by some who were active in both fields, such as Martha Nell Smith, an early Digital Humanist and founder of one of the first Digital Humanities centers, the Maryland Institute for Technology in the Humanities (MITH). Despite her early and continuing involvement in Digital Humanities, Smith has emerged as a leading critic of the movement:

When I first started attending humanities computing conferences in the mid-1990s, I was struck by how many of the presentations remarked, either explicitly or implicitly, that concerns that had taken over so much academic work in literature — of gender, race, class, sexuality — were irrelevant to humanities computing. […] [I]n the wake of the sixties, the humanities in general and their standings in particular had suffered, according to some, from being feminized by these things. […]


Quite frankly, those observations also seemed to apply to much of STS [the Society for Textual Scholarship] […] More than a few participants in STS seemed to think of it as a space free from all the messiness of questions of identity and politics.


The University of Virginia conferences were overseen by three senior scholars at that institution: McGann himself; John Unsworth, a pioneer of Humanities Computing; and Johanna Drucker, an eclectic scholar and artist with an idiosyncratic relationship to the humanities (Drucker now teaches in the University of California, Los Angeles Department of Information Studies). McGann and Drucker would soon go on to direct and create the early Digital Humanities project known as SpecLab, in which prototypes for many projects such as the Rossetti Archive and “The Ivanhoe Game” were first designed. The majority of the participants in the two University of Virginia conferences were Humanities Computing specialists and graduate students, including Stephen Ramsay, Matthew Kirschenbaum, and Bethany Nowviskie, who became arguably the most influential Digital Humanities practitioners of the next generation. While the reading lists and position statements with which the events were launched make formal nods toward the importance of historical, sociological, and philosophical approaches to science and technology, the outcome was the establishment, essentially by fiat, of Digital Humanities as an academic and not a support field, with the accompanying assertion that technical and managerial expertise simply was humanist knowledge. The unavoidable implication was that other humanists were wrong — not so much about how they went about answering humanist questions as about the very definition of the humanities.

This view reaches its apotheosis in the repeated suggestion that building computational tools should qualify as a replacement for scholarly writing: an idea that runs counter to the culture not only of English departments but also of Computer Science departments, which have never handed out PhDs for competence in programming alone. Versions of this principle — the idea that technical support is the cutting edge of the humanities — continue to surface, as in the characteristic assertion that “One day, creating [and] maintaining platforms to enable the dissemination of [and] engagement with scholarly content will ‘count’ as scholarship.” Carried to its logical conclusion, such a declaration would entail that the workers in the IT departments of corporations such as Elsevier and Google are engaged in humanities scholarship. Often, the contribution of computational projects to scholarly knowledge has been slight — and some Digital Humanities advocates have been at pains to defend their frequent inability to explain the point of much digital humanist activity. For example, Tom Scheinfeldt, Director of Digital Humanities at the University of Connecticut, played for time: “Eventually digital humanities must […] answer questions. But not yet?” Outside Digital Humanities, of course, even an undergraduate dissertation needs a question to answer if it is to be taken seriously. The implication is that in Digital Humanities, computer use is an end in itself.

At the same time that scholars at the University of Virginia argued for the rebranded Humanities Computing as the future of the humanities, a separate development was taking shape at Stanford University, under the leadership of Franco Moretti. Moretti is a Marxist literary critic who — while continuing to practice the form of interpretive scholarship known as “close reading” — wished to extend the range of methodologies available to literary studies with what he wittily called “distant reading”: essentially, the application to literature of forms of quantification previously associated with the social sciences. In itself, this was not particularly revolutionary, as a strong tradition of quantitative research on literature already existed, especially in the sociology of culture and in certain forms of linguistics. However, this quantitative work tended to be published in journals humanists didn’t read.

Moretti’s innovation, then, was to present a version of this quantitative tradition in humanities journals such as New Left Review. Moretti was not employed in Humanities Computing and did not design or maintain software himself; he relied on technical support staff, and, by presenting the fruits of their labors in a form that traditional humanities scholars could understand, he implicitly argued for the value of skills he neither possessed nor practiced. Moretti was not initially associated with Digital Humanities, but was rather claimed for it when his work seemed to point toward the scholarly breakthroughs its advocates had been promising. The interest generated by Moretti’s work led others to import a range of quantitative methods from outside the humanities, but also created confusion about why those procedures were valuable. It was not the code through which they were implemented but the theoretical and methodological traditions from which they had emerged that really mattered. While some scholars affiliated with the Digital Humanities movement became conversant with the intellectual background of the procedures they employed — producing valuable work that could have seen publication in the venues within which those procedures had been developed — the fetishizing of code and data, and the relative neglect of critical discourse, have fostered an environment in which one is more likely to encounter oddities such as Michael Dalvean’s recent claim that the probability scores yielded by a machine learning algorithm are an “objective” measure of literary value.

It is indicative of the Digital Humanities movement’s general disdain for scholarship as it had hitherto been defined that arguments making as little sense as Dalvean’s can appear in leading Digital Humanities journals, with the result that much of the more interesting side of Digital Humanities research has a tendency to resemble a slapdash form of computational linguistics adorned with theoretical claims that would never pass muster within computational linguistics itself. The success of the Digital Humanities movement perhaps owes less to the honorable exceptions to this trend (which tend to be distinguished by the comparative modesty of their claims as much as by their attention to questions of validity) than to the compatibility of the movement’s favored research models with administrative demands that academic work be tied, at least potentially, to external revenue streams. From the viewpoint of the neoliberal university, the best kind of research (and the kind to be most handsomely rewarded) is the kind that brings in the most external funding. This is one of the main reasons why the digitization of archives and the development of software tools — conspicuously expensive activities that influential funding bodies have enthusiastically supported — can exert such powerful attraction, effectively enabling scholarship to be reconfigured on the model of the tech startup, with public, private, and charitable funding in place of Silicon Valley venture capital.

Funding Digital Humanities

Such funding comes from a range of sources, but some have been vastly more influential than others. The main US public research funding body, the National Endowment for the Humanities (NEH), and the charitable Andrew W. Mellon Foundation, were key early funders of Digital Humanities. One of the early spurs for the development of Digital Humanities at the University of Virginia was a $1,500,000 Mellon Foundation Distinguished Achievement Award to Jerome McGann in 2003, which partly funded many early projects including SpecLab and the Rossetti Archive. After heavily funding Digital Humanities projects for almost a decade, the NEH founded an Office of Digital Humanities in 2008, setting up a permanent and separate funding stream devoted to the kind of work promoted under the Digital Humanities brand. And in 2010, Google itself launched a Digital Humanities Research Program with a particular focus on the production of tools that would integrate with its existing commercial services, especially Google Books.

Google’s financial support for Digital Humanities has been intermittent, but the announcement of the program had a dramatic impact, triggering an avalanche of top-down promotion of Digital Humanities as university administrators sought to capitalize on what appeared to be a major new funding stream. And Google’s own Digital Humanities–esque project, the Google Ngram Viewer, which allows users to search for frequencies of words and phrases drawn from the Google Books database, has been the subject of Erez Aiden and Jean-Baptiste Michel’s Uncharted: Big Data as a Lens on Human Culture, a breathless pro-industry volume whose contempt for non-data-driven, non-industry-centered scholarship is as palpable as its political disengagement. Although not emerging from Digital Humanities proper, the Ngram Viewer has inspired much Digital Humanities work, and represents exactly the kind of “success” that drives the movement’s public relations: a technique is appropriated from an academic field not associated with humanist scholarship (here, linguistics), shorn of the theoretical checks that once regulated its use, and given the appearance of revolution through application to a dataset that appears capable of silencing all criticism through sheer size.

The priority accorded to Digital Humanities by the Canadian government in its support for humanities research funding is directly comparable. Digital Humanities projects are compatible with two of the six “Future Challenge Areas” currently named by the Social Sciences and Humanities Research Council of Canada (SSHRC). In the first, priority goes to research that answers the question, “What new ways of learning, particularly in higher education, will Canadians need to thrive in an evolving society and labour market?” In the second, researchers are enjoined to study how emerging technologies can be “leveraged to benefit Canadians.” Canada’s leading Digital Humanities specialist, Raymond Siemens, who held a Canada Research Chair in Humanities Computing from 2004 to 2015, is one of 15 members of SSHRC’s core council.

The priority accorded to Digital Humanities by the SSHRC only amplifies general tendencies that affect all applicants. Those who wish to acquire a sizeable grant, and who do not have site-based research needs, must develop a compelling rationale to employ graduate students. One of the simplest ways to justify the need for graduate students is to set up a named lab — a lab that requires not just funding but continual funding, and whose students can work on an evolving list of projects. In turn, applicants must explain how graduate students’ research enhances their employability. This makes Digital Humanities labs especially attractive, and makes researchers feel as if they cannot win large grants without doing Digital Humanities. The biggest projects require wages for graduate students, technical staff such as developers, consultants working on paid short-term contracts, and salaried project managers. SSHRC’s model of funding therefore complements the development of new models of intellectual work within the neoliberal university — accelerating the devaluation of older models of literary study.

The impact of these changes to the North American funding environment has been much greater than the total sums involved might suggest. Because university administrators tend to value research according to the size of the grant that supports it, and because funding for the humanities is otherwise mostly limited to salary-replacement sabbatical grants to individual faculty members, this specific targeting of large-value grants toward Digital Humanities projects has had a transformative effect upon the humanities throughout the United States and Canada, where it is difficult to get six-figure grants for English scholarship without engaging in computational work. Despite the Office of Digital Humanities being the smallest office in the NEH, its proactive allocation of funds has had a remarkable effect on administrative priorities, promoting methodologies that, until the emergence of these funding sources, had little support within the fields themselves.

The work that the NEH and the Mellon Foundation tend to fund remains largely confined to the “tools and archives” paradigm that many in Digital Humanities claim to have surpassed, but that continues to drive the institutional expansion of Digital Humanities, because such work remains the most likely to receive major research grants. This is no coincidence. Digital Humanities enabled the creation of new pools of funding specifically devoted to an entirely new conception of the humanities that was promulgated by a small minority within English departments. Why these funders chose to do this remains something of a mystery. To find precise explanations, we would need to have access to private conversations and communications, though it is remarkable that such an epoch-making shift can be so lacking in explicit justification. What we do know is that in the early 2000s, both Mellon and the NEH began devoting unusually large amounts of resources to these projects, while leaving the rest of their funding streams largely unchanged, and that this had more influence than anything else in raising the profile of Digital Humanities by creating an expectation of reward for those working along the lines laid down at the aforementioned University of Virginia humanities computing and Digital Humanities events (the latter of which was funded by the NEH in both of the years that it ran).

Exceptions to the general trends of course abound. The problem is that these exceptions too easily function as alibis. “Look, not everyone committed to Digital Humanities is a white man.” “Look, there are Digital Humanities projects committed to politically engaged scholarly methods and questions.” We are not negating the value of these exceptions when we ask: What is the dominant current supported even by the invocation of these exceptions? The appeal of Digital Humanities is undeniable: as the burden of paying for university is increasingly shifted to students, and university staffing is increasingly temporary, the acquisition of marketable skills, and the ability to justify those skills as integral to the market-oriented evolution of knowledge and education, becomes all but essential. Rather than act in vocal opposition to the university’s transformation, and to the forces driving it, the Digital Humanities social movement seeks to prove that a humanities education is beneficial to job seekers by reinventing that education as a course of training in the advanced use of information technology. It unavoidably also suggests that other approaches in the humanities fit less well into the contemporary university, because the implied measure of success is economic.

What Now?

For enthusiasts, Digital Humanities is “open” and “collaborative” and committed to making the “traditional” humanities rethink “outdated” tendencies: all Silicon Valley buzzwords that cast other approaches in a remarkably negative light, much as does the venture capitalist’s contempt for “incumbents.” Yet being taken seriously as a Digital Humanities scholar seems to require that one stake one’s claim as a builder and maker, become adept at the management and visualization of data, and conceive of the study of literature as now fundamentally about the promotion of new technologies. We have presented these tendencies as signs that Digital Humanities, as a social and institutional movement, is a reactionary force in literary studies, pushing the discipline toward post-interpretative, non-suspicious, technocratic, conservative, managerial, lab-based practice.

The movement’s level of institutional support may be exceptional, but its biases are not. They accord with a variety of other postcritical methodologies, such as versions of Speculative Realism and Object-Oriented Ontology, and the explicitly “postcritical” literary theory advocated by scholars such as University of Virginia English Professor Rita Felski, which tend to challenge, avoid, or disavow scholarly endeavor that is overtly critical of existing social relations. We therefore suggest that it is not the “traditional” scholarly world, with its hierarchies and glorified experts and close reading of works read by only a precious few people, to which the Digital Humanities social movement is most meaningfully opposed. What it stands in opposition to, rather, is the insistence that academic work should be critical, and that there is, after all, no work and no way to be in the world that is not political. This is what textual scholarship à la Bowers and literary interpretation à la Hirsch share with Digital Humanities, and so the fact that all three have found an institutional base in the same department is hard to overlook.

We say this knowing that we will be read — despite two of us having long histories as digital researchers — as outsiders who do not know what we are talking about, as zany zealots with no real knowledge of the field, as people with no right to comment, whose views are anyway retrograde and irrelevant. But in fact, our critique echoes and builds upon much that has already been said by some within the field, although to very little apparent effect. Alan Liu, one of the earliest supporters of Digital Humanities, wrote half a decade ago:

While digital humanists develop tools, data, and other meta-data critically […] rarely do they extend their critique to the full register of society, economics, politics, or culture. How the digital humanities advances, channels, or resists today’s great postindustrial, neoliberal, corporate, and global flows of information-cum-capital is thus a question rarely heard in the digital humanities associations, conferences, journals, and projects with which I am familiar. Not even the clichéd forms of such issues — for example, “the digital divide,” “surveillance,” “privacy,” “copyright,” and so on — get much play.


It is as if, when the order comes down from the funding agencies, university administrations, and other bodies mediating today’s dominant socioeconomic and political beliefs, digital humanists just concentrate on pushing the “execute” button on projects that amass the most data for the greatest number, process that data most efficiently and flexibly (flexible efficiency being the hallmark of postindustrialism), and manage the whole through ever “smarter” standards, protocols, schemas, templates, and databases uplifting Frederick Winslow Taylor’s original scientific industrialism into ultraflexible postindustrial content management systems camouflaged as digital editions, libraries, and archives — all without pausing to reflect on the relation of the whole digital juggernaut to the new world order.


The problem to which Liu draws attention is a theme that has recurred throughout this article: purported technical expertise trumps all other forms of knowledge, including critique of the uses to which such expertise is put. (What counts as “expertise,” however, turns out to be highly variable. For example, most of the senior scholars mentioned here — Moretti, Liu, McGann, Drucker, and Smith — openly disclaim any ability to code, even as other major figures in the field insist on this as a minimum qualification.) Digital Humanities has achieved its institutional prominence precisely because of a willingness on the part of many of its key figures to play the role that Liu describes. Liu’s repeatedly published complaints about the absence of “cultural criticism” from Digital Humanities might have had some effect if that absence were coincidental, but it is not. Indeed, the institutional success of Digital Humanities appears to be explained in large part by its designed-in potential to drive social, cultural, and political critique from the humanities as a whole.

A parallel can be drawn with the often-remarked raced and gendered makeup of Digital Humanities, whose key figures remain (mostly) white men even as greater numbers of female scholars and scholars from minoritized backgrounds have entered the “big tent.” The problem has been highlighted again and again, yet nothing changes besides the occasional invitation of female scholars to speak alongside the established male stars. At the Digital Humanities 2015 conference, noted media studies scholar Deb Verhoeven addressed the men present as follows:

You have made a world designed around ensuring your own personal comfort, but it’s not comfortable for many, many other people. […] This is not about issuing another policy advisory for “inclusion.” This is not about developing a new checklist to mitigate your biases. And it’s definitely not about inviting a token female speaker to join you – this actually needs to be about your plans to exit the stage. This is not about learning how to do it better next time – this is about you leaving before there’s a next time. […] This is about letting other people in by letting go of your privileged position.


Such a recommendation is, of course, extremely unlikely to be followed. With astonishing bravery, Verhoeven told the men in attendance that the problem is “how many of you occupy the positions that get to speak,” i.e. to define what Digital Humanities is taken to be. We might say in the same spirit that if Digital Humanities means to be (as it claims) a truly “big tent,” defined by openness to all perspectives and diversity of projects and applications — a definition capacious enough to make one wonder what purpose the label could serve at all — it will be necessary for its chief practitioners, associated with the biggest projects and the biggest labs, to mute themselves for a number of years so that the voices of the outsiders they claim to welcome may be amplified in turn. But the idea is laughable: major research institutions, from the University of Virginia to University College London, have invested in Digital Humanities precisely in order to consolidate their grip on available research funding, and are about as likely to renounce their market dominance as are Facebook, Amazon, or Google.

And even if it were to be taken seriously as a proposal, it could hardly correct the structural and institutional conditions that explain the unique position of Digital Humanities today. In an article on what she sees as the unrealized radical potential of the field, Miriam Posner writes that “[w]e can’t allow Digital Humanities to recapitulate the inequities and underrepresentations that plague Silicon Valley,” but we argue that its spectacular institutional success is a consequence of its constitution, from the outset, as precisely such a recapitulation. In the academy and outside of it, the privileging of technical expertise above other forms of knowledge is a political gesture, and one that has proved highly effective in neutralizing critique of established power relations. We offer our analysis of the Digital Humanities social movement as a way of resisting that gesture and as an inducement to other scholars to do the same.

With thanks to Brian Lennon and to our anonymous critical readers.

¤


Daniel Allington is associate professor of Digital Cultures at the University of the West of England, though he previously lectured in Applied Linguistics at the Open University.


Sarah Brouillette is a professor in the Department of English at Carleton University, where she teaches contemporary literature and culture, and social and cultural theory.


David Golumbia teaches in the English Department and the Media, Art, and Text PhD program at Virginia Commonwealth University. He is the author of The Cultural Logic of Computation (Harvard University Press, 2009) and many articles on digital culture, language, and literary studies and theory. From 2003-2010 he taught as a specialist in Digital Humanities in the Media Studies program and the Department of English at the University of Virginia.
