Note: The following is a much-expanded version of a lecture I delivered at the University of Michigan, Department of Screen Arts and Cultures on November 12, 2012, and of a short summary blog post published at OKBJGM.tumblr.com on July 29, 2013.
PEOPLE WHOSE JOB it is to declare the comings and goings of the Golden Ages of things agree that the “Golden Age of Television” began in the mid-1950s. That’s when the medium, only milliseconds past its infancy, simultaneously delivered a near-catastrophic blow to the feature film industry and became a cigarette smoke–filled bullpen where the great middlebrow drama of the Chayefskys, Footes, and Serlings mingled with the live comedy of the Caesars and Berles while the Sevareids and Murrows rolled up their shirt sleeves and fought demagoguery from the corner office. More recently, the explosion of “quality television” in the wake of the 1998 and 1999 premieres of Sex and the City and The Sopranos has led many (especially those who profit from TV) to declare that a new Golden Age is upon us. This “Second Golden Age” has stolen the mantle of the auteur cinema of the late 1960s and ’70s and the independent films of the ’90s as the pinnacle of thoughtful, character- and theme-driven entertainment for the mass audience.
While I feel about Golden Ages the way Groucho Marx felt about clubs that would have him as a member, I suppose it would be fair to say that — as one of the Emmy Award–winning writer/producers of Lost (a Second Golden Age stalwart that celebrates the 10th anniversary of its premiere this week) — I have had a ringside seat for whatever it is you call the current wave of pop culture. From that vantage point, I feel that I can provide some perspective on the hows and whys for the sudden explosion of creative ambition in a medium which — for decades prior — was mostly the prefix to the suffix “is a vast wasteland.”
When Lost was called onstage to receive the Best Dramatic Series Emmy in 2005, I was the chubby guy in the Nehru jacket and thick-rimmed, yellow-lensed glasses standing catty-corner from series co-creator and thank-you-speech-deliverer Damon Lindelof. I suppose I always wanted to win a major award on primetime while dressed like a Bond villain or a waiter.
At the time, one of my fellow producers turned to me and said: “The first line of your obituary was just written.”
While I completely understand that my being there, even in that regrettable getup, makes my modest contribution to the Second Golden Age a matter of record — and in the eyes of many reduces my claim to complaint — I still marvel with some shock and horror at how much jockeying there is for the credit of being “the reason TV got good all of a sudden.”
Feature film directors and screenwriters love to make the case that it was their descent from the Olympian perches of high-stakes, high-minded, high-budget, world-class entertainment that suddenly elevated “the boob tube” into “high art.” Scratch “high-budget” from the previous statement and you get the playwright’s version of that argument.
The so-called “auteur showrunners” — writer/producers who are understood to single-handedly perform short-run series miracles on basic and premium cable — insist that it was their tenacity in forcing cowardly networks to give them the latitude to single-mindedly express their voice and vision that birthed the renaissance. Concomitantly, the new breed of “non-writing showrunners,” directors and producers who have recently gained a larger role in the upper management hierarchy of TV shows, shout from the rooftops that without their stewardship of production resources and visual storytelling, the auteur showrunners would be mere novelists.
Much to their credit, the standard-bearing Mandarins of Tiffany television in the '70s, '80s, and '90s — the Lears, Bochcos, E. Kelleys, Milchs, Carons, Whedons, and John Wellses — mostly keep on doing what they always did so well, completely at ease with their titanic contribution and feeling no need to leap into the unseemly fray.
Ignoring their fine example, the executives chime in that — of course — it all happened because their maverick programming choices and game-changing strategies created environments where success made it possible to launch greater success, and take greater creative risks.
Finally, cable and satellite providers sit amused on leather chairs in the back of the room and quietly assert to one another with great certainty that they lifted TV out of what Hunter S. Thompson called "a cruel and shallow money trench" by offering so many channels that the demand for more and more programming eventually brought about creative critical mass.
It’s the Hamlet/Monkey argument: when the number of venues increased exponentially, so did the number of shows, and with that, the chance that, through sheer tyranny of volume, some of them would wind up being … uh … you know … good.
However fatuously expressed in press conferences and trade publication interviews, these arguments are all based on varying degrees of truth. They also ignore the less glamorous — and less easy to claim — cultural, technological, and social influences that created the opportunity for the Second Golden Age to arise.
My journey toward understanding the changes that led to where we are today began in 1993. After graduating with a screenwriting degree from the same school as George Lucas — my heart's desire was the same as that of every other filmmaker of Generation X: to remake Star Wars over and over while being hailed as a creative visionary — I blundered into a position as a junior executive in current programming and drama development at NBC. There, I got to witness not just the Viking funerals of L.A. Law and a few other still-standing, groundbreaking 10 o'clock dramas of the 1980s, but also the birth of "Must See TV"; the ongoing development of seminal shows like Homicide: Life on the Street, Seinfeld, and Law & Order; and the development and launch of some of the great precursors to the Second Golden Age, specifically, ER and Friends.
Among my most salient memories of that time is an overheard conversation between two senior broadcast executives during which one of them scratched his head and asked, “Is it just me, or is TV getting good all of a sudden?”
After two years at the network, I returned to writing — staying in the medium I had grown to love and at which I have since worked tirelessly to achieve some proficiency. For two seasons in the mid-aughts, I not only worked as a writer and supervising producer of the first and second season of Lost, but was also, more importantly, part of a think tank of four writers (the others were Paul Dini, Jennifer Johnson, and Christian Taylor) hired by series creators J.J. Abrams and Damon Lindelof to aid in the development of the series and its themes and structure while they wrote and produced the pilot episode.
That bright and shining moment aside, my bread and butter has been that of the traveling story salesman — moving from show to show, season after season — sometimes spending multiple years on one, often suffering the slings and arrows of early cancellation or the inevitable purges that occur in writing staffs as upper management tries to figure out the right blend of voices to enable their vision. After putting in time on brilliant-but-cancelled shows like Boomtown, artistically and commercially successful outings like Medium, popular hits like Charmed and The Pretender, limited-run basic cable series like Helix, cult rarities like Jake 2.0 and The Chronicle — and occasional misfires like the Charlie's Angels reboot and seaQuest 2032 — I would love to sell you the idea that the Second Golden Age of Television is the result of an army of competent journeyman craftspeople toiling endlessly and thanklessly to prop up the messianic visionaries, feature directors, empire-building absentee micromanagers, 900-pound gorillas, and network executives.
That, however, would also be fatuous and self-serving. What I have observed in 20-plus years as a member of television's middle class, and what I believe to be the true reason for TV's emergence as the pre-eminent, thought-provoking, visually stimulating, character-revealing mass medium of the early 21st century, is a number of interdependent factors. All these factors are primarily external to the actual business of creating televised narrative, and none of them has anything to do with anyone's personal greatness. Hard as it may be to admit, those of us working in television today may merely be the lucky surfers of a three-crested wave of socio-techno-psychological change.
And, believe it or not, it all starts with Meryl Streep and Dustin Hoffman.
1. IT’S NOT THE HERO’S JOURNEY, BUT THE PARENTS’ DIVORCE
The most cursory glance at the pop culture zeitgeist would still provide much fuel for the argument that Star Wars was — and is — the pivotal narrative model for two generations of TV writer/producers. While the impact of George Lucas’s samurai space epic on current film and television is so deeply felt that even visiting extraterrestrials might think that Joseph Campbell’s The Hero with a Thousand Faces as vulgarized by a frustrated race car driver from Modesto is the only true religious text of the time, I would propose that Robert Benton’s 1979 film of Avery Corman’s novel Kramer vs. Kramer is the more accurate snapshot of the formative years of today’s dominant content producers.
Pound-for-pound and show-by-show, the monolithic thematic obsession of present-day narrative television can be summed up in two words: bad parenting.
In film school, I was assigned Robert B. Ray’s book A Certain Tendency of the Hollywood Cinema. Here’s the TL;DR: Every Hollywood film between 1930 and 1980 is essentially a Western focusing on either an outlaw hero or an official hero. To Ray, the film industry in those 50 years was nothing less than a vast cultural project to transpose on all genres the tropes of our national heroic narrative of manifest destiny, conquest, greatest-generation stoicism, and jingoistic machismo.
So what is the TL;DR of my first factor? Every show of the Second Golden Age of television is essentially Kramer vs. Kramer: a sustained exploration of the consequences of divorce — and absent and abandoning parenting — on the now-grown children of the first generation to experience it as a widespread social custom.
Fair or not to the parties involved, we are re-living the childhood trauma of an entire generation dramatized on television. Divorce was, quite simply, THE national narrative of the formative years of Generations X and Y in the United States. More so than Vietnam, Watergate, the AIDS epidemic, or even the fever-pitching of the Cold War by Reagan’s nuclear hawks, the rising divorce rates of the 1970s and ’80s touched the lives of every single American child in a profound, personal way that defined not just the way in which we would live our lives, but also the way we would use the medium of television to tell our stories.
Late-term war babies and boomers have been vilified enough for their need to claim responsibility for all that was good about the 1960s and ’70s (the civil rights movement, women’s lib, the sexual revolution, the counter-culture and its dismantling of the stifling “Man in the Gray Flannel Suit/June Cleaver” social expectations of the ’50s, gay pride, and men landing on the moon) while conveniently ignoring the less-glamorous collateral effects of their activism and self-discovery (the herpes epidemic, cocaine, the rise of the extreme right wing in the ’80s, a squanderous buildup of the military/industrial complex that makes most YA dystopias look like YA utopias, runaway debt that will have our grandchildren striking stones to kindle the remains of their furniture for warmth, disco, and the Carter Administration).
I won’t add to the vitriol, but rather point out that as divorce rates soared in the wake of the late-term war baby and boomer generation’s groping attempts to define themselves in something other than the most rigidly described societal roles — and eventually settled in the 50 percent range — the children of divorce found themselves pioneers of an undiscovered country.
In spite of the influence of the counterculture, the self-help movement, and a growing acceptance of mental health care and psychotherapy as something other than a bitter admission of failure, American society as a whole simply lacked the methods, support mechanisms — even the narrative tropes — to comfort and acclimate a million-fold influx of newly minted latchkey kids to their new dual citizenship in separated, multi-variant families with often-warring authority figures. Feeling abandoned and betrayed, the children of the divorce generation — who form the primary core of today’s top-tier television writer/producers — made the best of a situation for which few adults could provide context because they were too busy figuring it out themselves.
As these kids grew up to become content creators in their adult lives, their preferred collective narrative ceased to be victory in wartime or the conquest of a physical frontier, but rather the taming of an emotional wilderness populated by malfeasant middle-aged parent figures (usually a father, since at the time, they were more often the ones leaving home and hearth and were thus more frequently perceived as the abandoning parent). Most of the delinquent parents in this new master narrative are portrayed as inscrutably trying to reinvent themselves at the expense of their dependents.
If this seems a gross generalization, just Google the murderers’ row of the most iconic dramatic series of the Second Golden Age:
– The Sopranos (a mobster tries to reconcile the criminal lifestyle he loves with the family he begrudges).
– Sex and the City (abandoned by her father at the age of five, Carrie Bradshaw finds the perfect surrogate family in her friends even as she squanders years in an on-and-off relationship with a latter-day equivalent of Daddy Warbucks).
– Breaking Bad (a science teacher becomes a criminal to save his family and discovers that crime was always his true love).
– Mad Men (the abused child of a drunk and a prostitute grows up to be a narcissistic workaholic who neglects his family).
– Lost (every character is an abandoned child trying to remake him/herself on a deserted island where no one knows their true past).
– Orange Is the New Black (a catalogued exhibition of birth mothers extending their abusive pasts to their children and jailed abuse survivors seeking surrogate mothers behind bars).
– House M.D. (a brilliant but narcissistic drug addict abuses his surrogate wife and children in the workplace).
– Grey’s Anatomy (the narcissistic workaholic children of narcissistic workaholic parents wreak endless emotional havoc on one another while excelling in the workplace).
– True Detective (a philandering cop who neglects his family finds the platonic definition of love in a nihilistic, workaholic partner overcompensating for the tragic loss of his own wife and daughter).
– The Good Wife (a case study in the preservation of a marriage for the sake of appearances).
– Sons of Anarchy (Hamlet in a biker gang).
– Six Feet Under (a primer on the neglected children of children of abuse — one whose inciting event is the ultimate act of abandonment by a father).
– Orphan Black (in which not one but a whole gaggle of Tatiana Maslanys learn how dangerous it is to have been raised by strangers).
– The Shield (an amoral philandering cop excuses his ruthless pursuit of his evil moral code by convincing himself that it’s all to protect the family he chronically neglects).
– 24 (Jack Bauer tragically loses his family and can never love again because of his service to the stars and stripes — a vocation that eventually brings him into conflict with his own powerful but corrupt father and catamitic younger brother).
– Battlestar Galactica (the spurned children of humanity form a weird monotheistic cult and then come back to burn down their parents’ home with nuclear weapons).
– The Wire (a longitudinal study of an entire generation’s abandonment by an uncaring patriarchy).
And here’s the kicker: the handful of shows of the era that are not about bad parenting are generally so inversely proportional in their correction (Friday Night Lights and Parenthood, each a celebration of the world’s hardest working, most self-improving parents, and The West Wing, a fantasy in which the world’s ultimate patriarch is an almost preternaturally moral man who actually grapples with the idea of compromise) that they come off as the exceptions that prove the rule.
The Second Golden Age of television owes much of its existence to the Golden Age of American Divorce … and my generation’s final reckoning with our parents may just lie in cursing them with decades of recriminatory reruns depicting their perceived sins.
Which leads to the second factor, which has something to do with what all those latchkey kids were doing in front of the television in the years immediately following 1984.
2. MTV — BUT NOT IN THE WAY YOU THINK
In many media-savvy people, the mere mention of “MTV” triggers, Pavlov-like, the addition of the suffix “-style editing.” The conventional wisdom among many is that MTV’s historical legacy to the medium is the trickle-down of the rapid-fire editing style of music video to mass-produced narrative entertainment.
The story that usually accompanies this argument goes something like this: “When Brandon Tartikoff first proposed the kernel that became Miami Vice, he merely wrote down the words ‘MTV COPS’ on a cocktail napkin.”
While the role of MTV in introducing a faster-paced visual literacy to the general population, especially in the 1980s, cannot be denied, one can also argue that Pablo Ferro had already been cutting that way since the ’60s — and so were the makers of Help!, A Hard Day’s Night, The Monkees, and pretty much every commercial director and avant-garde filmmaker in existence. So while MTV may have expedited the arrival of a certain style and fashion to the suburbs, style and fashion have — historically — always found their way there in some other way or another (just ask Elvis Presley, the Beatles, the Sex Pistols, and Halston). The true influence of MTV on the Second Golden Age is the result of something much subtler — yet far, far more subversive.
Having an entire generation of future television show creators grow up on a turbocharged diet of high-density chunks of five-minute narrative vignettes — one-after-the-other-all-day-long, all of them wildly varying in genre, all of them wildly experimental in ways once reserved for the art house — created in the collective unconscious a vast and widely shared tolerance for genre-bending in narrative.
The vernacular of modern television — in which cop shows routinely trade in the tropes of procedurals as well as horror movies; in which a show that deals in flashbacks and flash-forwards airs on network television with large segments of non-anglophone dialogue and subtitles; in which the “musical episode” is not only a curiosity but, in some cases, an ongoing gesture; in which a series plays entirely in real time with elements of soap opera, action thriller, and police procedural all showing simultaneously on multiple split screens; and in which individual seasons of a show are completely different from previous seasons of the same show, sharing only a title, cast, an aesthetic, and certain thematic preoccupations — would never have existed without MTV.
After 50-plus years of complacently doling out a tightly regimented school lunch tray of cop/doctor/lawyer/family/sitcom, the dawn of the Second Golden Age marks the point when television suddenly — and seemingly collectively — decided that trans-textual promiscuity was a far more satisfying endeavor than the pursuit of haiku-like perfection in the extant formats. This aesthetic shift also dovetails nicely with the promotion to power in the medium of an entire generation of creators whose formative years consisted of obsessively consuming a single channel in which it was possible to go from Godley & Creme to Derek Jarman to David Fincher in a span of minutes: an experience also shared by the audience then rising as the desirable demographic target for new shows.
Imagine an entire crop of impressionable minds watching one monolithic network where the shows changed genre at a speed unthinkable in any previous form of broadcasting. From kitchen sink drama (Madonna’s “Papa Don’t Preach” and Donna Summer’s “She Works Hard for the Money”) to homoerotic Indiana Jones–style action-adventure (Duran Duran’s “Hungry Like the Wolf”), to Lou Reed singing about transvestites, to raunchy, socially reprehensible comedy (let’s say John Parr’s “Naughty, Naughty”), to a piece of pop-art directed by Andy Warhol (The Cars’ “Hello Again”), to a quasi-druidic tone-poem in which a member of Monty Python was tortured by demons (Iron Maiden’s “Can I Play With Madness”), to a Talking Heads clip inspired by the outsider art of Reverend Howard Finster, to a glossy Hollywood musical that just happened to feature Michael Jackson and a cast of line-dancing zombies, MTV was that network, and then some.
By simultaneously being the world’s top provider of pop music — the common tongue of worldwide youth culture — and wedding that to a voraciously (indeed, almost viciously) derivative visual aesthetic that drew much of its energy from the recycling and remixing of anything and everything in the culture, MTV rewired the brains of an entire generation to accept the possibility that narrative could span a multiplicity of genres within an integrated whole.
In a feat of mental engineering unprecedented in its speed and scope — a nation-sized experiment that puts the brainwashers of The Manchurian Candidate to shame — MTV facilitated in an entire population a willingness to experiment, first as consumers and later as creators, with a gonzo fluidity of storytelling and content that echoes in even the most high-minded and erudite of the current age’s narrative offerings.
There’s a reason why Lost was a soap opera, and a sci-fi show, and a spy thriller, and a romantic melodrama, and an occasional police procedural, and a hospital drama, and a wacky comedy about an accidental millionaire, and a paranoid conspiracy thriller that unfolded in a non-linear collage of flashbacks, flash-forwards, and flash-sideways … and there’s also a reason why Lost, as insane as it sounds in hindsight, is not a mere outlier, but very much emblematic of the style and substance of the Second Golden Age of Television: it was just like watching MTV in the 1980s.
But the single most important of the three factors that caused television to “get good” starting in the early to mid-’90s, and brought it to a great blossoming of experimentation and excellence at the dawn of the millennium — the one that ties all of the above together — is the one that gets most easily lost in the land grab for the glory.
In the end, it was technology that brought us all to the Promised Land.
3. THE RISE OF DIGITAL, NON-LINEAR FILMMAKING
When you buy a laptop — or even a smartphone — it will usually include some kind of imaging and editing software, allowing you to take videos, shuffle scenes around, add titles, change the colors, add some filters, maybe a little CGI animation, and then email the entire Kubrickian enterprise to your next of kin. Having grown up in the analog era of filmmaking, I find this a staggering idea on the same level as the notion that the average air conditioning thermostat installed in a new house today carries within it more computing power than the Apollo 11 space capsule.
The first time I walked into a professional postproduction suite was in 1993. In my capacity as an NBC executive, I visited the offices of seaQuest DSV — a series that not only boasted the first dedicated, in-house, computer-generated visual effects company for a television series (Steven Spielberg’s Amblin Imaging) but was also my first encounter with non-linear, non-destructive editing. Sitting alone in a corner by several humming Lightworks workstations — and half-draped in a tarp like a hobbled mechanical pterodactyl — was an old-school Moviola film-editing machine.
That was the moment I realized the revolution was being televised.
In the pre-digital era, there simply was no physical way TV could compete with the quality of the visual image of feature films. The writing, shooting, and postproduction schedules — which persist almost unchanged to this day — simply did not allow a great deal of visual variety in filming, or experimentation in the editing and finishing of the TV product.
Consider a relatively modestly budgeted feature film of the 1980s — say, Paul Verhoeven’s 1987 RoboCop. Made for $13 million, the film was planned for a year before the cameras rolled for two and a half months between August and October of 1986. Reshoots were performed in January of the following year. The film was released in July 1987 after nine months of postproduction.
A similarly themed television series of the same vintage — Max Headroom — had to deliver competitive visuals on roughly one tenth of that budget with eight days of planning, eight days of shooting, and less than a month of editing, scoring, and visual effects production per episode: a punishing grind that most TV programs have to maintain week in and week out to make their air dates and stay on budget.
A TV company usually rented one 35-millimeter film camera (with an occasional second if you had a clever line producer with a keen eye toward maximizing equipment rental budgets). In this environment, even an experienced crew on a soundstage was severely hampered in the number of camera set-ups they could light and capture in a single day. TV crews barely had time to cover all the angles necessary to capture the bare necessities: the truly eye-popping shots were few and far between.
Similarly, analog postproduction was a taxing — and sometimes nigh-artisanal — process. Editors working on film or video found their ability to deliver anything beyond the most workmanlike results severely constrained: every change to the material was “destructive” — requiring the undoing of everything around it without the simple convenience of a “redo” button. In the 1950s and ’60s, an editor’s assistant literally had to pull the work print apart and reassemble the pieces with splicing tape to address a note. In the 1970s and ’80s, the process required the cumbersome overdubbing of edits from two separate tape decks onto a third — effectively erasing all the previous work.
Additionally, every fade, split-screen, and wipe — as well as every simple color correction and every modest reframing or blow-up of a shot: operations that any eight-year-old can perform in seconds on iMovie today — had to be executed separately in a lab and took a major toll on the series budget. There was no ability to “drag and drop” a transition into a timeline just to see whether it would work or not. Special effects shots had to be planned to the frame and were not only severely limited, but also recycled from episode to episode (watch any hour of the original Star Trek and count how many times you see the same flyby of the Enterprise).
Truly, this was the equivalent of writing a script on a typewriter, and making revisions with onion skin paper, scissors, and Elmer’s glue. It wasn’t a lack of ideas or desire to improve the medium that kept television from greatness, but rather the physical limitations of its one mode of production. Making changes and experimenting with the product took a lot of time and cost a lot of money — and time and money were the two things that were always in short supply.
In the early ’90s, however, non-linear, non-destructive editing systems — most of them less powerful than the stock version of iMovie bundled for free with your average Mac today — began to appear in TV postproduction. The change was swift and astonishing. Five years down the line, it became rare to find a film school graduate who had edited on film. Less than eight years later, the very medium was reborn.
How did it happen? With non-linear/non-destructive editing, writer/producers gained the ability to see the assembled work faster. They were able to spend more time thinking the material through, seeing results in real time, and demanding more footage to work with… as well as writing more complicated scripts to accommodate the growing visual palette promised by advancing technology. With non-linear/non-destructive technology, editing went from a mere assembly of a limited set of elements to a final rewrite with the benefit of endless do-overs.
Less than a decade later, the arrival of relatively cheap high definition digital cameras became the second-stage rocket of television’s ascent. With the expense of raw film stock and camera rentals no longer a drain on budgets (broadcast-quality video is now shootable on an off-the-shelf DSLR camera) the number of setups, angles, and scenes has increased exponentially. It’s a simple matter of ratios: we can now capture more with less effort and expense, allowing more scenes to be shot per day in a more visually dynamic way.
So while the amount of time we have to do our work hasn’t changed — seven to eight day shooting schedules, 13 to 22 episodes a year, network pilots in January, fall season premieres in September — the amount of work that can be done in that time, and our ability to indulge and render what’s in our minds, has changed by orders of magnitude. Consider this example: the split-screen storytelling technique used in 24 would have taken weeks per individual episode to accomplish in the ’70s using optical printers — and the amount of footage necessary to get the number of shots would have been cost-prohibitive. Simply, the style of 24 was not physically achievable in the ’70s, whereas today, multiple HD cameras, non-linear editing, and digital postproduction accomplish in a short span of time what was once untenable.
These advances in technology ultimately created a cost-effective production infrastructure that today allows established networks to experiment with shows that might not have lasted before. That Friday Night Lights was allowed five excellent seasons in the aughts under a co-production deal between NBC and DirecTV, while the similarly themed Against the Grain was cancelled after two months on NBC primetime in the 1990s, is a reasonable example. The same advances gave basic cable networks the freedom to commission original work to fill their airwaves at budgets that permit smaller and more narrowly defined audiences to produce the kind of advertising revenue that keeps shows on the air — and have also freed pay cable networks to double-down on their own ambitions.
Where once there were Supermarket Sweep and Check It Out!, there are now Rectify and Louie. Where once there were First and Ten and Dream On, there are now Game of Thrones and Girls. Where once there was the much-mocked CableACE Award, there are now Emmy sweeps that shame the once-thought-invincible broadcast networks. Technology created the opportunity to make new types of shows cheaply, the 500-channel universe seized on the opportunity, and the deluge of creative competition lifted everyone to a new level of ambition and competence.
The TV-making monkeys weren’t just given typewriters — they happened upon a socio-cultural-technological monolith, and they evolved and multiplied accordingly.
CONCLUSION: “GET OFF THE STAGE, FATBOY!”
My own personal relationship to the “Second Golden Age of Television” up until now can best be summed up in two career-defining events. On the night Lost won the Best Drama Series Emmy, we were quickly hustled to a back-room photo gallery along with the series cast. While technically not the recipients of the award, cast members are nonetheless invited onstage to face the attendant press, sparing the viewing public the dystelegenic sight of pasty, monastic scribes who seldom see the light of day strutting their hour upon the stage after some very unfortunate sartorial decisions.
In this room, a large crowd of photographers stood on tiered risers while the producers, writers, and cast posed on a small stage. The photographers, as remains their custom, shouted loudly and endlessly for their subjects to turn toward them so that they could get the best possible picture.
After a while, we handed off the trophies to the cast so they could pose with them. For a moment I found myself slipping into something of a stunned reverie and wandering in front of the stage in a haze of flashbulb lights and loud, demanding voices. My fugue was broken by a shrill shriek from one of the photographers:
“GET OFF THE STAGE, FATBOY, YOU’RE IN MY SHOT OF EVANGELINE!”
Duly reminded of my true place in the entertainment-industrial complex, I meekly stepped out of the way and let the photogs get their pictures of our leading lady. Seven days later, I drove to my parents’ house and gave them the trophy — not out of any disdain for the Academy or what it represents, but because they paid for me to go to film school, and because the explosive change I have seen in my time in the industry is a constant reminder that it is the forward movement of creativity that earns awards, not a dwelling on what worked in the past.
My second career-defining event was the 12-episode run of The Middleman, a television series I created and executive produced for the ABC Family cable network in 2008. A deeply — almost ridiculously — idiosyncratic mashup of the popular culture of my youth with my adult desire to express a message of optimism, cooperation, and common decency, The Middleman was a latter-day sci-fi-superhero-comedy produced on a budget only slightly higher than that of Max Headroom 20 years earlier.
While not a commercial success, the series allowed me to marshal a kind-hearted group of mostly like-minded writers and artists to express something I never would have had the opportunity to express had I been born in the 1940s or ’50s and worked in anything other than the TV medium as it stands today. Six years on, The Middleman continues to live on DVD and various download services, where its small, but delightful and devoted, cult audience will presumably continue to enjoy it and recommend it to their own friends.
To be one of what is still a very small number of artists working in this mass medium is a privilege — a collusion of hard work and life-changing luck for which I will always be grateful. To be able to work the medium on so personal a scale as I have had the opportunity to do: to create something that, if not world-changing, at least presented me with the opportunity to stand before a planet-sized audience, yell “hello!” and know that the statement will linger in a meaningful way rather than vanish into the electronic ether — that, to me, is the true Golden Age.
While there will always be those who claim that their own specific contribution was the pivot for the descent of quality upon the medium, the memory I carry with me through my own career is that many of us who love television — regardless of where we came from or have been — were given a massive socio-technological boon and rose to the occasion. We took advantage of a newfound technological flexibility to explore our own cultural proclivities and societal anxieties. Having experienced these technological and artistic changes in real time — as a viewer, then a network executive, and currently a creator — I know that the gap between what was possible in 1987, what was possible in 1993, and what is possible today is comparable to waking up one morning with the ability to turn off gravity and truly fly.