By Alexander Stern
April 1, 2018


    This piece appears in the LARB Print Quarterly Journal: No. 17, Comedy



    In a little-watched 1947 comedy, The Sin of Harold Diddlebock, directed by Preston Sturges, the title character, an accountant, takes shelter from the bureaucratic drudgery of his existence behind a literal wall of clichés. Diddlebock covers every inch of the space next to his desk with wooden tiles engraved with cautious slogans like “Success is just around the corner” and “Look before you leap.” Head down, green visor shielding him from the light, he passes the years under the aegis of cliché, until one day he’s fired, forced to pry the clichés off his wall one by one, and face the world — that is, Sturges’s screwball world — as it is.

    The French word “cliché,” as Sturges must have known, originally referred to a metal cast from which engravings could be made. Diddlebock’s tiles point to one of the primary virtues of cliché: it is reliable, readymade language to calm our nerves and get us through the day. Clichés assure us that this has all happened before — in fact, it’s happened so often that there’s a word, phrase, or saying for it.

    This points to one of the less remarked upon uses of language: the way it can, rather than interpret the world or make it available, cover it up. As George Orwell and Hannah Arendt recognized, this can be socially and politically dangerous. Arendt’s Eichmann spoke almost exclusively in clichés, which, she wrote, “have the socially recognizable function of protecting us against reality.” And, referring to go-to political phrases like “free peoples of the world” and “stand shoulder to shoulder,” Orwell wrote:

    A speaker who uses that kind of phraseology has gone some distance towards turning himself into a machine. The appropriate noises are coming out of his larynx, but his brain is not involved as it would be if he were choosing his words for himself. If the speech he is making is one that he is accustomed to make over and over again, he may be almost unconscious of what he is saying, as one is when one utters the responses in church. And this reduced state of consciousness, if not indispensable, is at any rate favorable to political conformity.

    Orwell’s critique of political language was grounded in a belief that thinking and language are intertwined. Words, in his view, are not just vehicles for thoughts but make them possible, condition them, and can therefore distort them. Where words stagnate, thought does too. Thus, ultimate control in 1984 is control over the dictionary. “Every year fewer and fewer words, and the range of consciousness always a little smaller.”

    Orwell worried not just about the insipid thinking conditioned and expressed by cliché, but also about the damaging policies justified by euphemism and inflated, bureaucratic language in both totalitarian and democratic countries. Such phraseology — “pacification,” “transfer of population” — “is needed,” he wrote, “if one wants to name things without calling up mental pictures of them.”

    In our democracies words may not mask mass murder or mass theft, but they do become a means of skirting or skewing difficult issues. Instead of talking about the newest scheme to pay fewer workers less money, “job creators” talk about “disruption.” Instead of talking about the end of fairly compensated and dignified manual labor, politicians and journalists talk about the “challenges of the global economy.” Instead of talking about a campaign of secret assassinations that kill innocent civilians, military leaders talk about the “drone program” and its “collateral damage.”

    The problem is not that these phrases are meaningless or can’t be useful; it is that, like Diddlebock’s desperate clichés, they direct our thinking along a predetermined track that avoids unpleasant details. They take us away from the concrete and into their own abstracted realities, where policies can be contemplated and justified without reference to their real consequences.

    Even when we set out to challenge these policies, as soon as we adopt their prefab language we channel our thoughts into streams of recognizable opinion. The “existing dialect,” as Orwell puts it, “come[s] rushing in” to “do the job for you.” Language, in these cases, “speaks us,” to borrow a phrase from Heidegger. Our ideas sink under the weight of their own vocabulary into a swamp of jargon and cant.

    Of course politics is far from the only domain where language can serve to protect us from reality. There are many unsightly truths we screen with chatter. But how exactly is language capable of this? How do words come to forestall thought?


    The tendency of abstracted and clichéd political language to hide the phenomena to which it refers is not unique. All language is reductive, at least to a degree. In order to say anything at all of reality, language must distill and thereby deform the world’s infinite complexity into finite words. Our names bring objects into view, but never fully and always at a price.

    Before the term “mansplaining,” for example, the frustration a woman felt before a yammering male egotist might have remained vague, isolated, impossible to articulate, unnameable (and all the more frustrating therefore). With the term, the egotist can be classed under a broader phenomenon, which can then be further articulated, explained, argued over, denied, et cetera. The word opens up a part, however small, of the world. (This domain, unconscious micro-behavior and its power dynamics, has received concentrated semantic attention of late: “microaggression,” “manspreading,” “power poses,” “implicit bias.”)

    But, like all language, the word is doomed to fall short of the individual cases. It will fit some instances better than others, and, applied too broadly, it will run the risk of distortion. Like pop stars and pop songs, words tend to devolve, at varying speeds, from breakthrough to banality, from innovative truth to insipid generalization we can scarcely bear to listen to. As words age, we tend to forget the work they did to wrench open a door into the world in the first place. In certain cases, this means the door effectively closes, sealing off the reality behind the word, even as it names it.

    When a word is new, the reality it opens up is too, and it remains fluid to us. We can still experience the poetry of the word — we can see the reality as the coiner of the word might have. We can also see how the word approaches its object, and that it does so from a certain perspective. But over time, the door can close and leave us only with a word, one that was never capable of completely capturing the thing to begin with.

    Nietzsche, for example, rejected the notion that the German word for snake, Schlange, could live up to its object, since it “refers only to its winding [schlingen means “to wind”] movements and could just as easily refer to a worm.” The idea that a medium evolving in such a partial, anthropocentric, and haphazard manner might be able to speak truth about the world, when it could only ever take a sliver of its objects into account, struck him as laughable. He famously wrote:

    What is truth? A mobile army of metaphors, metonymies, anthropomorphisms — in short a sum of human relations, which are poetically and rhetorically intensified, passed down, and embellished and which, after long use, strike a people as firm, canonical, and binding.

    Language is made up of countless corrupted, abused, misused, retooled, coopted, and eroded figures of speech. It is a museum of disfigured insights (arrayed alongside trendy innovations) that eventually congeal into the kind of insipid prattle we consider suitable for stating facts about the world.

    Still, we may not want to go as far as Nietzsche in denying language its pretension to objective truth simply because of its unseemly evolution. Language, in a certain sense, needs to be disfigured. In order to function at all, it must be disengaged and its origins forgotten. If using words constantly required the kind of poetic awareness and insight with which Adam named the animals — or anything approaching it — we’d be in perpetual thrall to the singularity of creation. We might have that kind of truth, but not the useful facts we need to manage our calendars and pay our bills. If the word for snake really did measure up to its object, it would be no less unique than an individual snake — more like a work of art than a word — and conceptually useless. Poetry, like nature, must die so we can put it to use.


    Take the phrase the “mouth of the river” — an example Elizabeth Barry discusses in a book on Samuel Beckett’s use of cliché. We certainly don’t have an anatomical mouth in mind anymore when we use that phrase. The analogy that gave the river its mouth is, in this sense, no longer active; it has died and, in death, become an unremarkable, unconscious, and useful part of our language.

    This kind of metaphor death is typical of, perhaps even central to the history of language. As Nietzsche’s remarks on the word “snake” suggest, our sense that when we use “mouth” to refer to the cavity between our lips we are speaking more literally than when we use the phrase “mouth of a river” is something of an illusion. It is a product of time and amnesia, rather than some hard and fast distinction between the literal and the figurative. The oral “mouth” was once figurative too; it has just been dead longer.

    Clichés, as Orwell writes of them, are one step closer to life than river “mouths.” They are “dying,” he says. Not quite dead, since they still function as metaphor, they are not quite alive either, since they are far from vibrant figurative characterizations of an object. Orwell’s examples include “toe the line,” “fishing in troubled waters,” “hotbed,” and “swan song.”

    These metaphors are dying since we use them without seeing them. We don’t see, except under special conditions, the line to be toed, the unhatched chickens we should resist counting, the dead horse we may have been flogging. We may even use these phrases without understanding the metaphor behind them. Thus, Orwell points out, we often find the phrase spelled “tow the line.”

    But they haven’t become ordinary words, like the river’s “mouth.” We’re much more aware of their metaphorical character and they are much more easily brought back to life. They remain metaphorical even if their metaphors usually lie dormant. This gives them the peculiar ability to express thoughts that haven’t, strictly speaking, been thought. We don’t so much use this kind of language as submit to it.

    This in-between character is what leaves cliché susceptible to the uses of ideology and self-deception. We come to use cliché like it’s ordinary language with obvious, unproblematic meanings, but in reality, the language remains a particular and often skewed or softened interpretation of the phenomena.

    In politics and marketing, spin doctors invent zombie language like this in a relatively transparent way — phrases that present themselves as names when they really serve to pre-interpret an issue. “Entitlement programs.” “A more connected world.” These phrases answer a question before it is asked; they try to forestall alternative ways of looking at what is referred to. Cliché is the bane of journalism not just for stylistic reasons, but also because its tendency to insinuate its own view into the reader betrays a news story’s pretense of objectivity.

    Euphemistic clichés are particularly good at preventing us from thinking about what they designate. Sometimes this is harmless: we go from “deadborn” to “stillborn.” Sometimes it isn’t. We go, as George Carlin documented, from “shell shock” to “battle fatigue” to “post-traumatic stress disorder,” to simply “PTSD.” By the time we’re done, the horrors of war have been replaced by the jargon of a car manual, and a social pathology is made into an individual one. To awaken the death and mutilation beneath the phrase takes an effort most users of the phrase won’t make.

    This is how a stylistic failure can pass into an ethical one. Language stands in the way of a clear view into an object. The particular gets buried beneath its name.

    We might broaden the definition of cliché here to include not just particular overused, unoriginal phrases and words, but a tendency in language in general to get stuck in this coma between life and death. Cliché is neither useful, “dead” literal language that we use unthinkingly to get us through the day, nor vibrant, living language meant to name something new or interpret something old in a new way. This latter type of language figures or articulates its object, translating something in the world into words, modeling reality in language and making it available to us in a way it wasn’t before. The former, “dead” language need only point to its object.

    The trouble with cliché is that it plays dead. We use it as dead language, simply pointing at the world, when it is really figuring it in a particular way. It is language that has ceased to seem figurative without ceasing to figure, and it is this that accounts for its enormous usefulness in covering up the truth or reinforcing ideology. In cliché, blatant distortions wear the guise of mundane realities. In the process, it is not just language that is deadened, but experience itself.


    I want to suggest now that in popular language today irony has become clichéd. Consider “binge-watching.” The phrase might seem too new to really qualify as a cliché. But, in our ever-accelerating culture a clever tweet can become a hackneyed phrase with a single heave of the internet. In “binge-watching” what has become clichéd is not a metaphor, but a particular tone — irony.

    “Binge-watching” is of course formed on the model of “binge-drinking” and “binge-eating.” According to the OED, our sense of “binge” comes from a word originally meaning “to soak.” It has been used as slang to refer to getting drunk (“soaked”) since the 19th century. A Glossary of Northamptonshire Words and Phrases from 1854 helpfully reports that “a man goes to the alehouse to get a good binge, or to binge himself.” In the early 20th century the word was extended to cover any kind of spree, and even briefly, as a verb, to mean “pep up” as in “be encouraging to the others and binge them up.”

    Like most everything else, binging started to become medicalized in the 1950s. “Binge” was used in clinical contexts as a prefix to “eating” and “drinking,” and both concepts underwent diagnostic refinement in the second half of the century. Binge-eating first appeared in the DSM in 1987. Binge drinking is now defined by the NIH as five drinks for men and four for women in a two-hour period.

    By the time “binge-watch” was coined and popularized (Collins Dictionary’s 2015 word of the year), the phrase “binge-drinking” had moved from the emergency room to the dorm room. The binge-drinkers themselves now refer to binge-drinking as “binge-drinking” while they’re binge-drinking. In a way, they’re taking the word back to its slang origins, but, in the meantime, it has accumulated some serious baggage.

    The usage is ironic, since when the binge-drinkers call binge-drinking “binge-drinking,” they don’t mean it literally, the way the nurse does in the hospital the next morning. Take this example from Twitter. “I’m so blessed to formally announce that I will continue my career of binge drinking and self loathing at Louisiana state university!” The tweet is a familiar kind of self-deprecating irony that aims, most of all, to express self-awareness.

    It elevates the user above the concept of binge-drinking. If I actually had a problem with binge-drinking, the writer suggests, would I be talking about it like this? By ironically calling binge-drinking “binge-drinking,” binge-drinkers deflect anxiety that they might have a problem with binge-drinking. They preempt criticism or concern by seeming to confess everything up front.

    The phenomenon is part of a larger incursion of the language of pathology into everyday life, where it has, in many contexts, replaced the language of morality. Disease is no longer just for sick people. Where we might have once worried about doing wrong, we now worry about being ill. One result is that a wealth of clinical language enters our everyday lexicon.

    These words are often not said with the clinical definition in mind. “I’m addicted to Breaking Bad/donuts/Zumba.” “I’m so OCD/ADD/anorexic.” “I’m such an alcoholic/a hoarder.” “I was stalking you on Facebook and I noticed...” They are uttered half-ironically to assure you that the utterer is aware that his or her behavior lies on a pathological continuum. But that very awareness is marshaled here as evidence that the problem is under control. It would only really be a problem if I didn’t recognize it, if I were in denial.

    This is a kind of automatic version of the talking cure. If I’m not embarrassed to call my binge-drinking “binge-drinking,” it must mean it’s not a problem. In effect, we hide these problems in plain sight, admitting to them so readily, with just the right dose of irony, that we don’t have to confront them. This self-awareness of the compulsive character of much of our behavior becomes in essence a way to cover it up.

    “Binge-watching” accomplishes this move even more effectively, since it beats the clinic to the punch. It doesn’t ironize an extant clinical term, but instills irony in the very name of the activity — watching an obscene amount of television to block out the existence of the outside world. It can’t even be referred to without the term. Irony is thus built into any discussion of the activity. “Binge-watching” neutralizes critique. Taking it seriously is made reactionary and moralizing by the term itself.

    Irony, in effect, becomes clichéd here. In the same way as cliché, the irony in the phrase “binge-watching” hovers comatose between life and death. Just as the dead horse in the cliché stops being really experienced as a metaphor, “binge-watching” stops being experienced as irony. The irony that animated the initial use of the term fades away and it is used as a simple name for an activity. But the irony doesn’t disappear completely. It still serves to automatically undercut the possibility of critique through the ironic admission of a compulsion. We no longer really experience the irony when we use the term, but it insinuates itself into our expressions and becomes thereby all the more effective.

    This means that we get the benefits of irony — its feeling of distance and superiority — without having done any of the work of critical negation. Actual literal engagement with the activity gets covered over by a clichéd irony. When this pseudo-confessional irony is embedded in the name for the thing itself, we prevent it from really being seen. The name conceals the named. We cease to see people sitting in front of a screen for hours at a time, unable to tear themselves away, infantilized, stupefied by cheap cliffhangers and next episode autoplay, some rendered so impatient they feel compelled to consume the content on fast forward. Instead, they are simply “binge-watching” — engaged in a socially sanctioned activity with a cute name.


    This clichéing of irony undercuts its social significance. The Greek eironeia meant “feigned ignorance.” Socrates was ironic because he affected not to know when he questioned accepted wisdom. It was a way of undermining thoughtlessly held beliefs. Irony has come (“ironically”) to mean almost the opposite. We now engage in clichéd mass irony in order to feign awareness of things we really aren’t aware of. We pay lip service to critique in order not to confront it head-on. Irony props up thoughtlessly held beliefs — clichés that we stamp on experiences before we have a chance to really experience them.

    There is an ongoing complaint, periodically renewed by a spate of books or think-pieces, that sees us plagued by ironic language, attitudes, dress, et cetera. These are taken as signs of cynicism, alienation, disaffection. But, in reality, irony has become a coping mechanism, built-in critique that is not cynical, alienated, or disaffected enough. Half-sensing that something is wrong, we gesture toward critique, toward an external, alienated view of our circumstances, so we don’t have to take it up.

    The pseudo-awareness that this engenders comes to conceal not just personal failings, but also the moral compromises extracted by modern bureaucracies. Indeed it is one of the remarkable and terrifying things about a bureaucracy — perhaps about our culture as a whole — that everyone in it seems to know better.

    Diddlebock’s self-delusion looks naïve, but our protective clichés — not imprinted on woodblocks that hang above our heads, but integrated into our everyday speech — are just as damning. We deliver a series of pre-packaged postures and ready-made ironies on cue to keep the realities of our culture from making their repeated, compulsive appearance in full force. They serve the same purpose as Diddlebock’s tiles: to assure us that what we’re doing is all right — what else can we do? — and to keep us from seeing things as they are. This refuge quickly becomes a prison.


    Alexander Stern is a postdoctoral fellow at the Goethe University in Frankfurt, where he is working on a book on the philosophy of language.
