Disrupt Yourself (And Do Us All a Favor)

By Stuart Whatley | March 28, 2014

…man is pre-eminently a creative animal, predestined to strive consciously for an object and to engage in engineering — that is, incessantly and eternally to make new roads, wherever they may lead […] Man likes to make roads and to create […] But why has he such a passionate love for destruction and chaos also?


—Dostoevsky, Notes from the Underground


 


HOW TRAGIC — TO SAY YOU’RE BETTERING the world and have it come out worse. These days, the rhetoric of change is practically pro forma, drained of any real meaning and used interchangeably with the concept of progress. How else are we to explain such milquetoast slogans as “Change We Can Believe In,” “The Strength and Experience to Bring Real Change,” “It’s Time to Change America,” and “Vote for Change”?1 Worthy campaigns, but to what ends have they led us? And who has benefited? As we start looking back at the Obama era — centered on, arguably, the most celebrated “change agent” of our time — it’s an important question to ask.


The next time you hear someone invoke “change,” look for the simultaneous wink about “disruption,” a method now touted as a new conceptual framework for progress, one that ignores our tendency to ruin an established good thing in the service of an idea. It is a catchword for a distinctly neoliberal process of identifying existing systems in civil society and federal government — public education, health care, prisons, social security, foreign aid — and subjecting them to free market principles as an end in itself. Few areas of contemporary society reflect this movement more clearly than health care, under the so-called Affordable Care Act; public education, under No Child Left Behind and Race to the Top; and foreign development, under the Washington Consensus. Each is part of a massive social reengineering project currently underway, with little mainstream debate. Its express purpose is to reorganize (disrupt) policies of state intervention that exist on behalf of social solidarity and the common good, in favor of privatization, economic deregulation, and public monetary policy that benefits corporations and their shareholders.


Harvard Business School professor Clayton M. Christensen coined “disruption” — a shortening of “disruptive innovation” — in the late 1990s to describe simpler, cheaper, and initially inferior products or services that eventually displace customers’ established choices. (Think online streaming replacing traditional cable television; fittingly, a recent Netflix profile in The New Yorker carried the subtitle “Netflix and television’s great disruption.”) But figurative catchphrases have a way of becoming literal. Consider a directive from Amazon’s Jeff Bezos to his SVP Steve Kessel: “I want you to proceed as if your goal is to put everyone selling physical books out of a job.”


Such entrepreneurial belligerence is characteristic of adherents to what Paul Carr described in 2012 as a “Cult of Disruption”:


[t]he faddish Silicon Valley concept which essentially boils down to “let us do whatever we want, otherwise we’ll bully you on the Internet until you do.” To proponents of Disruption, the free market is king, and regulation is always the enemy.


Which is to say, disruption is often unilateral, profit-driven change carried out in such a way as to ignore the potential for unintended consequences. It is the building of roads, “wherever they may lead.” We are so saturated by the vernacular of this cult that last year The New Republic’s Judith Shulevitz rightly deemed it “the most pernicious cliché of our time.” No longer relegated to the realm of startups, the term now pops up in policy proposals and press releases from the world’s largest corporations and wealthiest individuals, who look to shore up their positions in untapped markets and traditional institutions alike by injecting newfangled ideas and technologies, simply because they can.


What we find about disruptive change in the world today is that it tends either to disrupt without really changing anything or, when it does change things, to do so to the benefit of the few, not the many. And despite Christensen’s jargonization of the word, which lends it a veneer of scrappy success, disruptors remain unaccountable even for their own massive demonstrated failures. Driven by greed or hubris instead of lust, they are Phoebuses — late capitalist man-gods — pursuing helpless Daphnes to the ends of the earth.2


¤


Shulevitz, in her criticism, focuses on the role of disruption in the language of education reform schemes bankrolled by multinational corporations and the billionaire Broad, Gates, and Walton Family foundations. In recent years, the charlatanism and opportunism of the 21st-century school reform movement have become woefully apparent. Beyond No Child Left Behind’s undisputed failure, we’ve seen cheating scandals in Atlanta, Washington, D.C., and Philadelphia, fears of a teacher “mass exodus” in other locales, and, now, disappointment and demoralization following testing under Common Core, the corporate-backed, hyper-proceduralized, philistinic national standards initiative currently adopted in 45 states and the District of Columbia.3 The roots of these blunders and shortfalls are chronicled in Diane Ravitch’s latest book, Reign of Error, one of the most important nonfiction works published last year.


One takeaway from Ravitch is that disruption in the context of education — manifested in George W. Bush’s No Child Left Behind and Barack Obama’s Race to the Top — is almost always a top-down attempt to privatize public facilities deemed languorously “stagnant” and, as such, in “crisis.” Never mind that a public entity like education, to a certain extent, is supposed to be static, as in stable and consistent in its institutional culture over time; that most of the problems we see in schools stem from larger structural issues within society, such as poverty and inequality; or that disruptors’ litmus test for educational failure centers on standardized curricula and tests they themselves devised.


In education, as in other realms, most things don’t benefit from being disrupted. As Ravitch points out, each measure proposed by soi-disant disruptors — “high-stakes testing, test-based accountability, competition, and school choice (charters and vouchers)” — is at odds with what we know to be pedagogically true: children benefit from “full curricula, experienced staffs, rich programs in the arts, libraries, well-maintained campuses […] small classes” and a supportive local community.


According to Ravitch, evidence for the failure of educational disruption isn’t hard to find. For example, in the international PISA rankings delivered late last year, the United States’ performance was average to below-average. That it has been average to below-average since the very beginning of international education rankings did not stop Secretary of Education Arne Duncan (the school reform movement’s man in Washington) from playing the “crisis” card. When he described the PISA results as “a picture of educational stagnation,” he was stating the obvious while ignoring the even more obvious. Duncan displays the disruptor’s proclivity for using one’s own failure to shore up one’s position. He takes no responsibility for the lack of improvement under Race to the Top or its likeminded precursor, NCLB, despite the fact that these policies are geared solely toward improving standardized test scores above any alternate metric of educational success. Since these policies have failed on their own terms while also frustrating teachers and demoralizing students and parents, shouldn’t Duncan suggest turning the microscope on the policies and corporations that meant, and failed, to improve the results, rather than on the results themselves? As Ravitch puts it:


The billions invested in testing, test prep, and accountability have not raised test scores or our nation's relative standing on the league tables. No Child Left Behind and Race to the Top are manifest failures at accomplishing their singular goal of higher test scores.


It’s reasonable to predict that disruptors will brandish the “picture of educational stagnation” to redouble their reforms. In fact, the disruptive mode was epitomized earlier this month in New York City. Underprivileged children attending hedge fund- and Walton Foundation-backed Success Academy charter schools were bused to Albany and mobilized against the well-being of, among others, underprivileged students attending the special-needs Mickey Mantle School. Recently, the new de Blasio administration granted “only” five of eight Success Academy applications for co-location, an arrangement that allows charters to use existing public school space at no cost. Success Academy’s CEO Eva Moskowitz and her backers then launched a national astroturfing blitz that succeeded in duping most of the mainstream press into believing that poor children from the remaining three charters were being kicked out on the street. In reality, two of the applicant schools don’t even have students yet, while the third, Harlem Success 4, would have further overcrowded a location where, according to Community Education Council District vice president Noah Gotbaum, “wheelchair-bound 4-year-olds have been forced into overfilled hallways and cafeterias, with many special-needs services already provided in hallways, stairwells and bathrooms.”


The idea behind charter schools (which have proven to be no more effective than public schools on average and routinely inflate their performance with their ability to pick and choose their students) is simply a repackaging of Milton Friedman’s 1955 education voucher proposal that would have diverted public funds to private religious institutions. Charters are not prima facie “good” or “bad” for children’s education, as the quality of learning they provide ranges widely, but they have undeniably proven their worth to those seeking to defame public institutions and subject education to a market system.


We’re also watching a rerun when it comes to misguided attempts to introduce new technology into classrooms to help teachers “prepare students for a 21st-century workforce,” a favorite talking point of the privatization movement. Todd Oppenheimer related public education’s many previous brushes with “technotopia” over a decade ago in his balanced, thoroughly researched book The Flickering Mind: Saving Education from the False Promise of Technology. As Oppenheimer explains, technology corporations, entrepreneurs, and industry-funded foundations throughout the first decades of the neoliberal era viewed public education as an untapped market. Oppenheimer quotes one computer scientist — the proprietor of a failed attempt throughout the 1980s and ’90s to reorient education around computer coding — who refers outright to computer technology as “a disruptive technology. It should be so. School was designed for a different medium.”


Even then, they should have known better: technology made failed appearances in education throughout the 20th century, from Thomas Edison’s ploy in 1922 to replace teachers and textbooks with then-new motion picture technology (“There is no limitation on the camera”), to Cleveland public school director William Levenson’s 1945 idea for portable radio instruction (“[It] will be as common in the classroom as is the blackboard”), to President Lyndon Johnson applauding television sets in the classroom (“[The] one requirement for a good and universal education […] is an inexpensive and readily available means of teaching children”). At each stage, the respective technology was not “disrupted” by newer technologies but rather failed on its own terms, only to have the cycle expensively repeat itself each generation, thanks to historical amnesia. As Oppenheimer sums up:


The cycle always began with big promises, backed by the technology developers’ research. In the classroom, teachers never really embraced the new tools, and no significant academic improvement occurred. This provoked consistent responses from technology promoters: The problem was money, or teacher resistance, or the paralyzing school bureaucracy. Meanwhile, few people questioned the technology advocates’ claims. As results continued to lag, the blame was finally laid on the machines. Soon schools were sold on the next generation of technology, and the lucrative cycle started all over again.


Oppenheimer makes clear that his book is not meant to be “another jeremiad in the thin but long line of Luddite literature that sees nothing but evil in technology,” and he lists technologies that prove their supplemental worth, such as “computerized vocabulary exercises […] foreign language drills […] and basic word-processing software.” But his conclusion about technology’s history in education is convincing: the bottom line has always come first.


In January of this year the Los Angeles Board of Education barreled forward with a $1 billion plan to issue an iPad to every student, defying advice from its oversight committee, which considered such an exorbitantly expensive procurement unjustifiable. As the Los Angeles Times’ Howard Blume reported, “L.A. Unified is paying $768 per iPad, one of the highest prices among school districts because it selected a relatively expensive device and included curriculum in the cost.” That curriculum is provided by the British corporation Pearson, a key player in developing the Common Core standards and, as it happens, a producer of the Common Core standardized tests and corresponding coursework now being sold to public school districts.4 The initial rollout of the iPad program last fall was a predictable comedy of errors. One student confessed to Blume that he used his iPad primarily to play a soccer video game. Others smartly deactivated the security mechanism on theirs with “two clicks,” enabling them to use their devices for gaming, socializing, and various other non-scholastic endeavors.


¤


Foreign aid and development practices under the doctrine of neoliberalism take a similar approach: ignore the advice of those whom the policy will actually affect while simultaneously transforming them into a new market for corporations and entrepreneurs to exploit.


On a macro scale, international organizations such as the World Bank and the IMF have historically deferred to the same list of policy prescriptions, known as the Washington Consensus: a one-size-fits-all package of fiscal austerity, privatization of public institutions and industries, and market liberalization.


During his tenure at the World Bank in the late 1990s, Joseph Stiglitz saw the macro side firsthand; in his 2002 book, Globalization and Its Discontents, he observed that the institutions dictating measures to developing states “tended to ignore the problems” underlying their condition and — in every case where the state was not permitted to implement reforms itself — actually exacerbated that condition. In East Asia especially, states went from having practical economic management that encouraged saving, exports, and safety nets for the poor to a regional crisis where “the IMF itself had become a part of the countries’ problem rather than part of the solution.” Elsewhere around the globe, “[t]he transition from communism to a market economy [had] been so badly managed that, with the exception of China, Vietnam, and a few Eastern European countries, poverty has soared as incomes have plummeted.”


At its core, the Washington Consensus approach, as the policy manifestation of market fundamentalism, is about disruption. Its own choice of jargon, like “shock therapy,” makes that clear. It opens up new markets, labor, and raw materials for the benefit of the world’s largest corporations, without assuming any responsibility when things go awry. In The White Man’s Burden, another well-known foreign aid and development treatise, former World Bank economist William Easterly’s appellation for the type of change agent now engaging in disruption is the “Planner,” one who “thinks he already knows the answers; he thinks of poverty as a technical engineering problem that his answers will solve.” Planners, according to Easterly, are “why dying poor children don’t get twelve-cent medicines, while healthy rich children do get Harry Potter.”


Since Stiglitz’s and Easterly’s works on the topic in the early 2000s, the NGO sector has ballooned by a quarter, with thousands of philanthropic organizations taking a larger role in the foreign aid space. (Incidentally, the prevalence of NGOs working on development issues has risen in step with neoliberalism more generally, beginning with the World Bank’s refocusing on poverty in the 1970s.) Past foreign aid failures under the Washington Consensus have set the stage for “social entrepreneurs,” corporate-funded philanthropies, and others willing to sell further neoliberal policies as a panacea for every area of social policy, whether in need of one or not.


One of the most prominent examples of disruption in foreign aid, according to Public Sector, Disrupted: How Disruptive Innovation Can Help Government Achieve More for Less, a Deloitte University Press paper published last year, is microfinance — made famous by the Nobel Prize-winning Grameen Bank in Bangladesh:


Most developing countries are vastly underserved by banking institutions. But traditional international development organizations for their part are not institutionally well equipped to deliver low-cost disruptive innovations. The hugely successful Grameen Bank in Bangladesh, which offers women tiny loans to establish microbusinesses and buy raw materials for self-employment, provided the first alternative model to traditional development.


But a more comprehensive study conducted in 2009 by the Jahangirnagar University economist Anu Muhammad drew less sanguine conclusions about the Grameen Bank’s work. According to Muhammad, microfinance “created a good opportunity for expanding the market for finance capital, thereby ensuring Grameen Bank’s spectacular success. However, it failed as a tool for poverty alleviation and empowerment of women.” Which is to say, microfinance has opened a “vast virgin market” that appears to do little more than sustain its own pecuniary existence.


Meanwhile, microcredit has been commercialized by most of the major global financial firms and increasingly operates outside the realm of philanthropy. Some microfinance charities have even undergone initial public offerings and turned themselves into massive corporations. “What was once an idealistic movement is now a fast-growing industry, and one that is rapidly losing its innocence,” Tim Harford wrote in the Financial Times in 2008. Lost innocence indeed — as the journalists Mark Ames and Yasha Levine point out, one such microlender, SKS Microfinance, employed “cruel and aggressive debt-collection practices” so severe that they caused mass suicides among impoverished villagers in Andhra Pradesh, India.


¤


Like a virus, neoliberalism has rapidly infected many aspects of civil society. It spreads by capitalizing on spheres of inequality, by redirecting capital flows from one side to the other with no regulation or regard for the consequences. Thus, instead of a public health care option subsidized by public funds for the public good, millions of Americans are being shunted into privatized insurance plans in one of the grandest moves to privatize civil society in a generation; public funds for public schools are being redirected to private charter schools and corporate curriculum- and test-makers, with many thanks to the great change agent of our day: Barack Obama.


Such acts of privatization have led to an unprecedented amassing of wealth by select individuals, turning some billionaires and corporate leaders into modern-day Phoebuses with the will, means, and lack of accountability to insinuate themselves into whatever circle of society they like, just because they can. As Oppenheimer notes, a primary cause behind public education’s ongoing tryst with privatization is chronic underfunding — especially in times of austerity such as now — that leaves school administrators beleaguered and open to exploring easy solutions pitched their way. The same can be argued for development efforts that backfire and reinforce the status quo of gross inequality abroad — the needy are left even needier and dependent on the wealthy.


These and other factors have led to a society hell-bent on gratuitous change. No matter the circumstances, a shakeup is always in order, but, as Oscar Wilde would say, only insofar as it is a “practical scheme,” one adhering to the assumptions and strictures of ideology — in our day, the ecumenical mélange of market fundamentalism and techno-utopianism applied to all spheres of life.


Are there potential benefits to such change in public programs? Can invention deliver prosperity and save us from ourselves? It doesn’t seem likely. As the Northwestern University economist Robert J. Gordon suggests, we may have already cashed in on most of authentic innovation’s potential for driving economic growth in the digital age. Whereas the discoveries and inventions of previous eras — electricity, steam engines, indoor plumbing — were truly far-reaching in their impact on growth, productivity, and standards of living, the innovations through which we define our own time have delivered far less by those same metrics. As Gordon writes in an August 2012 working paper:


The innovative process is a series of discrete inventions followed by incremental improvements which ultimately tap the full potential of the initial invention. For the first two industrial revolutions, the incremental follow-up process lasted at least 100 years. For the more recent [third], the follow-up process was much faster. Taking the inventions and their follow-up improvements together, many of these processes could happen only once. Notable examples are speed of travel, temperature of interior space, urbanization itself.


Gordon’s argument that technological innovation suffers from diminishing returns, while counterintuitive, is credible. Consider even a globally significant and wildly popular staple of the digital age like social media (Facebook, Twitter, and the like): measured against labor productivity or standard of living, this innovation may amount to a net contribution of nothing. If the digital innovations hoarding the spotlight are mostly Potemkin villages, then where does that leave us?

Gordon is not without detractors. Most prominent is the MIT Sloan School of Management professor and internet evangelist Erik Brynjolfsson, who debated Gordon at TED and argues that, rather than stalling out, we’re at the beginning of the “Second Machine Age,” in which robots that can think, speak, and learn new knowledge over time will provide “economic bounty to America and the world.” “While we’ve seen significant economic disruption in the recent past, we’ll see much more as we head deeper into the second machine age. Improved technologies are the ultimate driver of productivity growth, and productivity growth is essential to higher living standards,” Brynjolfsson wrote in a recent column. In this new age, machines are expected to continue replacing manual labor in manufacturing while also taking over more sectors of the knowledge economy. But when Brynjolfsson writes, “I see an amazing array of new technologies in the pipeline that promise even more productivity and progress,” one can only wonder: for whom?


¤


None of this is to say that social, technological, or political innovation is inherently destructive, or its practitioners conniving and depraved. Modern science and technology have ushered in the first large-scale societies in world history not dependent on slavery. That cannot be overstated. And while there are certainly bad eggs in the ranks of business, philanthropy, politics, and media, most people probably mean well. Among them are people and organizations doing actual constructive work. Indeed, contributive innovation should be welcomed. But, as many of the above authors’ insights suggest, that work is diminished by a malignant conceptual framework for progress: one that exploits our expectations about change, pushes change for its own sake, and sees no problem in overturning systems and institutions it doesn’t understand. At the heart of disruption today is not so much a radical surety in money and technology’s ability to fix any problem (hardly a novel faith), but rather an imprudent absence of any dialectical counterbalance — a sense of accountability or selfless stewardship — to rein in profiteers’ greed and philanthropists’ hubris.


“Why, why is everything so stupid?” Dostoevsky’s Ivan Karamazov asks when he realizes the terrible real-world potential of an abstract idea.


Would that today’s man-gods could experience such an epiphany.


__________________________


Notes:


1. Campaign slogans of, respectively, Barack Obama in 2008, Hillary Clinton in 2008, her husband in 1992, and the British Conservative Party in 2010.


2. Ovid, for his part, never presumed to depict transformations as always episodes of progress, only as inevitabilities. We today, of course, prefer to remember Pygmalion’s beautiful statue, happily rendered from stone into flesh, but let us not forget poor Daphne, who, fleeing Phoebus, begged her father to “mar the beauty which made me admired too well.” Unable to fend for herself, she was rendered a tree.


3. Much has been written about the Common Core, the national school standards assembled by Achieve, Inc., a D.C. nonprofit, funded largely by the Gates Foundation and some of the world’s largest corporations, and now being used as the benchmarks for the Obama administration’s Race to the Top program. This piece does not have the space to do full justice to the topic, but readers are encouraged to follow up with excellent essays by Stan Karp in The Washington Post and Jane Robbins in Academic Questions, as well as Hofstra University education professor Alan Singer’s Huffington Post blog. This publication’s readership will find most germane the recommendation, according to Robbins, that English teachers not bother assigning full works of literature, only excerpts, and that those be presented with no historical or literary context so as to allow students to engage the work “cold.” Under Common Core, this is all they need in order to learn the “critical thinking” skills required for “21st century jobs.” Presumably, it is not considered problematic if students enter college thinking Othello is the story of a happy new marriage and Twelfth Night a tragedy about a lost sibling. Robbins also tells us teachers are being encouraged to assign 140-character compositions, so as to reflect the real-world limitations imposed by Twitter. Last month, the president of the Gates-funded Urban League endorsed Common Core by writing, “There is a quiet — yet increasingly disruptive — revolution underway in American education.”


4. This conflict of interest has not gone unnoticed. Last year Pearson paid $7.7 million to settle a New York Attorney General investigation. According to The Washington Post, “Pearson Charitable Foundation, the nonprofit arm of educational publishing giant Pearson Inc., has agreed to pay a $7.7 million settlement to New York Attorney General Eric T. Schneiderman after he determined that the foundation had created Common Core products to generate ‘tens of millions of dollars’ for its corporate sister.” In a typical statement addressing the payout, Pearson admitted no guilt and denied any wrongdoing.


¤


Stuart Whatley is an editor and writer in New York.
