Throughout the evening and early morning hours of November 9th and 10th, 1989, I stood near the Brandenburg Gate and watched, transfixed, as the Berlin Wall came down. What put me there was a decision by several colleagues at NBC News with whom I had been closely tracking developments in eastern Europe during the preceding months. Mixed in with a sense of awe at what I was witnessing, there was also, I confess, a wave of professional and competitive gratification. All through that night, I stood at the base of the wall, interviewing Germans in various stages of delirium. As I called up to the revelers, and they shouted down to me, there never was a pause in the chip, chip, chip of their hammers and chisels, nor any escape from the concrete dust that billowed off the wall. At dawn, I made a brief visit to my hotel room to freshen up. As I bent over the bathroom sink and splashed cold water on my face and head, the images of the preceding hours played through my mind. Distracted by these thoughts, I was at first perplexed by the gray, granular liquid circling in the basin. Then, of course, it struck me: I was watching the Berlin Wall go down the drain.

Eighteen years have passed since that November morning. The news divisions of NBC, ABC and CBS have not disappeared, despite much talk about the threats posed by cable television, the Internet, and alternative media. For about 12 of those years, I continued doing what I had been doing for so long. Then, for reasons I could no longer ignore, I decided it was time to stop.

Throughout the 1990’s and continuing into the new century, I had nursed a growing conviction that something important—perhaps basic—had changed in the world of television news since that night when the world changed. Although I had no way to know it at the time, I have since come to believe that the broadcast news divisions’ commitment to the news—covering, reporting and explaining it in at least some of its complexity—went down the drain that November morning, along with the bipolar world.

It is undeniable that there never was a truly golden age of television news. For every “NBC White Paper” or “CBS Reports,” there was also a “Person to Person.” It could not have been otherwise; a commercial enterprise makes commercial decisions. Those of us who grew up in the early days of television had no quarrel with the equation.
Later, as some of us decided to try our own hand at broadcast news, this equation held. Except for the most naive and deluded among us, we knew there would be trade-offs and bargains that, if not precisely Faustian, were sure to trouble us. It seemed a fair enough trade-off: win one, lose one; win one, lose two; even, win one and lose three. For me, the halcyon days were the 1970’s and 80’s, diminished and mitigated by stories not covered, surely, but halcyon days nonetheless.
Thereafter, the calculus changed. I want to reflect on three of the reasons: one based on world events, a second on American corporate culture, and a third on technological advances.
There was, of course, the end of the cold war, as we had known it. Whether the collapse of Communism seemed sudden or glacial, the new reality was breathtaking. Events and phrases that had felt contemporary only yesterday took on the coloration of history: MAD, U-2, Bay of Pigs, Iran-Contra, and many more. These buzz phrases faded into the white noise of 1990’s America and, at the same time, the long-standing urgency of mainstream media, in general, and broadcast news, in particular, went into steep decline.
It was understandable, as a half-century of menace—the nuclear specter—seemed to have evaporated. The most piquant expression of this at the time was Francis Fukuyama’s “The End of History.” For him, the phrase was metaphorical but, for broadcast news, whether consciously or not, it was literal: Contemporary history—with its intimations of world-shaking events—had ended. The world would survive; we were free to turn our attention elsewhere.
The Rise of ‘Talkability’
About this time I started to hear a new buzzword, which I came to consider the most obscene word in the short history of broadcast news. It was “talkability”—invoked repeatedly, almost daily, to refer to a story, person or situation, regardless of its significance, that would elicit intense interest and comment among viewers. This idea ushered us into the age of the water-cooler metaphor: stories that would find someone at the office water cooler saying to colleagues, “Hey, did you happen to see …?”
Inevitably, this trend began to have a significant effect on the balance between domestic and international news coverage. Talkability, it was assumed, emanated from viewers’ natural interests, and those interests were strongest when the story was closer to home. Interest, thus, flagged as distance increased.
A fundamentally misguided argument emerged between those pushing “domestic” news and those advocating “foreign” coverage. What once had been a free-for-all involving competing priorities—commercial vs. editorial imperatives—was replaced by another model altogether. The archetypal American viewer became the sole consideration—a viewer whose interests were known, whose imagination had been mapped, whose wishes had been ascertained, and whose gratification was now the North Star of television news.
There was an irony here. As this process unfolded, another, larger process had been put into play. The commercial networks were absorbed by huge and vastly successful corporations: General Electric (GE), Viacom, Disney, the News Corporation, Time Warner. More than any other sector of American society, these entities had grasped the global realities upon which their success depended: the interconnection between “domestic” and “foreign.” Simultaneously, the news divisions of these corporations, whose job was to examine the impact of the wider world on the nation, were moving in the opposite direction. Parochialism replaced globalism. And with that, network news began its slide toward irrelevance.
Just a few years earlier, in November 1985, scores of NBC colleagues and I were ensconced at the Noga Hilton hotel in Geneva to cover the first Gorbachev-Reagan summit. Huge contingents—battalions of four, five, even six editing systems, along with editors and engineers—were necessitated, in part, by the technical and logistical realities of the time. It made economic sense to bring editing equipment into the field so that then-expensive satellites would be used only to transmit edited stories. Carpenters and technicians would build sets on location. Unit managers would hire local personnel—drivers, gofers, guides, translators, facilitators. Anchormen had assistants. Programs had bookers. News coverage required researchers, and everyone had a hotel room, and everybody had to be fed.
Food—mountains of food—was always available. One morning I arrived at the makeshift newsroom to find a table piled high with cheeses, assorted meats, fresh fruit, eggs, sausages, hot and cold cereals, crepes, assorted breads and rolls, jams, several different juices, cocoa, tea and coffee stretching half the length of the room. A colleague entered the room, slowly walked the length of the table, examined the fare, then picked up a house phone and ordered room service.
This moment proved instructive as the corporate culture of GE began to take hold at NBC News. In isolation, it was only an anecdote, but similar instances of extravagance were commonplace and legendary. Everybody had a favorite boondoggle, when no expense was spared. I began to muse on how such scenes must have struck the fresh eyes of not only GE managers, but those of the McKinsey management consultants brought in by GE to scrutinize the way things were done at NBC News. The era of hard-nosed corporate management was upon us—its momentum enhanced by an inescapable reality: The cost of covering the news, already ample, was multiplied by the lavishness of our lifestyle. Despite our resentment at being reined in, we knew that to some extent we had brought it on ourselves.
As time passed, the logic of the bottom line pervaded all aspects of the news business as news divisions, for the first time, had to demonstrate their commitment to efficiency and their opposition to waste. Productivity, a central and venerable tenet of corporate culture, began to occupy the world of news in a way it previously had not. It is at this juncture, I believe, that the new culture of the broadcast news business—so justified and potentially beneficial in its inception—began to drive the process in an unfortunate direction.
Covering News ‘Efficiently’
The difficulty arose because the product is the news. What is a quality product? More practically, what is the formula for determining whether too much time and manpower have been invested in producing a news story? Ultimately, of course, such questions are unanswerable because of the capricious nature of what happens in the world. A light bulb is produced in a highly organized environment; a news story is often produced in difficult, even chaotic, time-consuming conditions. Each light bulb is, or should be, identical; all news stories are different, even when they have similarities.
The parent companies of the network news divisions addressed this problem in an oblique way. News programs are the delivery systems for the product—the news story. It’s a simple task to monitor those programs and measure who contributes most and who least. The process is clearest when one examines the news bureaus, both domestic and foreign, which provide the stories that wind up on television. If in a given week, or month, or year, the bureau in Moscow produces X stories, and the bureau in Chicago produces 2X stories, it follows that the Chicago bureau is twice as productive.
In reality, it does not follow at all. In the new paradigm of the post-cold war world, stories from Russia rarely have the urgency that stories from the menacing, nuclear-armed Soviet Union used to have. With some exceptions, they are not as appealing as stories from the home front. Put another way, they lack talkability.
And so the conundrum is total: efficiency demands productivity, productivity depends on the interest of the programs, the programs are driven by what they perceive will be interesting to viewers, viewers make do with what they are offered, and the result—great interest in some things and little interest in others—is driven by decisions that often are based on factors having nothing to do with the newsworthiness of the stories or the skills of the storytellers. The irony is total, too: A process set in motion by the most successful global corporations in the world, whose lifeblood is the international arena, results in a contraction of the very entities meant to examine and report on that arena and the forces that drive it—the news divisions.
The Digital Factor
Early in the 1990’s, the dawning digital era transformed the way TV news was presented. The visual possibilities seemed, and turned out to be, virtually limitless. As the “look” of a program, as well as the stories within it, became a central consideration, the graphics department became a de facto coequal of the newsroom.
Cosmetic considerations didn’t spring into existence with the digital era. Since the 1950’s, the look of television news had always been important; it was, after all, a visual medium. But the digital age was different. Whereas, in the old days, color schemes might be blue or green, and lighting might be softer or brighter, the new technology offered tools that fundamentally altered the way in which a viewer received information.
The opening sequence of the various evening news programs provides a striking illustration. In ancient black-and-white days, John Cameron Swayze or Douglas Edwards would talk into the camera with few supporting visuals. Later came the static photo over the anchor’s shoulder, followed by moving video that served the same informational purpose in a more compelling and visually pleasing way. Then, suddenly, digitally, the change became exponential. If the lead story was the war in Bosnia, what the viewer might see behind the anchorman was a column of tanks crossing left to right, combined with another shot of somber refugees trudging right to left. Often a third, even a fourth shot, would be introduced—an irate politician or statesman shaking a fist, a skyline of blasted buildings. Typically, a few words—“Bloody Day,” for example—would be superimposed over these overlapping images.
While these editorially propelled images were flashed, other effects, aesthetic in nature, were also inserted. A digitally induced pulse might impart a sense of urgency or ominousness. The video might be given a sepia tone or a black-and-white treatment or some other effect, depending on the nature of the story, which in turn invited an emotive response, whether it was rage, satisfaction, nostalgia or patriotism.
As the anchorman spoke for 20 or 25 seconds about the evening’s lead story, I would often monitor my reaction to the combined audio-visual display. Invariably, I noticed that the more I watched the many visual elements on the screen, the less able I was to follow what the anchorman was saying. Despite my television experience, it struck me that my response probably was broadly representative of how a typical viewer might react. If anything, I thought, my insider knowledge would shield me from distraction and confusion. But the opposite proved to be the case, and I wondered then—as I do today—about the impact of such presentations on viewers trying to concentrate on the details of a complex story.
The Iraq War Coverage
These trends coalesced in the television coverage of the Iraq War. Every network and cable news outlet deployed platoons of military, diplomatic and Middle East specialists even before the war began. Advances in video and satellite technology made instantaneous, live transmission from the battlefield not only plausible, but also easy. And the coverage was catalyzed by the hothouse urgency spawned by 9/11.
I offer two examples of coverage whose defining characteristics were that the storytelling trumped the story and that insufficient attention was paid to facts that served to complicate the story.
The first became apparent in May and June of 2004, as the Coalition Provisional Authority prepared to transfer sovereignty to an Iraqi interim government. Daily briefings at the Pentagon were often covered in their entirety, after which reporters on scene would be debriefed by the anchors in the studios. Frequently the Pentagon briefer would talk about casualties, noting that a spike in violence was to be expected at such a pivotal moment, referred to as “a turning point.” Frequently, reporters would return to the theme, reminding viewers of a supposed connection between the approaching turning point and an increase in casualties.
The problem was that the linkage was demonstrably false. U.S. military deaths illustrate the point. In April 2004, there were 135 American fatalities, a very high number. In May, there were 80; in June, 42; in July, 54; in August, 66. Thereafter, in a pattern that has held for the duration of the war, the U.S. military’s fatality statistics rose and fell, rose and fell, then rose and fell again. Why wasn’t this reported? There were several reasons: the strain of filling so much airtime and the challenge of citing pertinent data in a timely fashion, to name two. But there was also the seductive quality of a neat story line: If a momentous event is at hand, one that the insurgency would surely oppose, it seemed inevitable that there would be an increase in violence. But it wasn’t so.
At another dramatic moment—the Iraqi elections of January 2005—this dynamic appeared again. Video images were inspiring as Iraqis risked their lives to go to their polling places, then proudly displayed their thumbs bathed in purple ink to prove their participation. Naturally and fittingly, the airwaves were filled with stories about the triumph of the democratic process. Very quickly, however, an editorial pecking order was established. The process itself became the central, overriding story, while the results of the process—the outcome of the voting—became a secondary theme. It had been universally assumed that people would vote for their own ethnic group, which they did. It was predicted that some Sunnis would refuse to vote, either out of fear or a sense of disenfranchisement that the ballot box could not redress, and this also proved to be the case. As a result, Sunni candidates who represented about 20 percent of the population emerged with even less than that minority share.
These facts were reported, but the irony of the situation was largely ignored. The democratic process had institutionalized, even deepened, the ethnic divisions that by that time were becoming more obvious and lethal by the day. If both sides of this electoral coin had been analyzed in tandem, the story line would have been ambiguous and perplexing, certainly, but closer to the significance of the event. As it happened, the great preponderance of television news coverage focused on just one side of the equation: the irrefutable appeal of people voting freely for the first time in their lives.
Competing With the Internet
In war and in peace, television news has developed a highly structured approach to the way stories are told and, for that matter, in deciding which stories are told. The emergence of the Internet has created a new culture, and television news, fearful of becoming an anachronism, has rushed to be part of the process. At first blush, this makes sense, since the computer user and the television watcher have similar experiences: both look at electronic screens and both are exposed to two potent phenomena—instantaneity and limitlessness.
But the differences between the computer and the television are rarely examined. Using a computer is an “active” experience, since the user controls what is flashed on the screen, what length of time it remains there, and what comes next. Watching television is fundamentally a “passive” experience, assuming that the average viewer would prefer not to constantly switch among three or more news programs. The viewer cannot shape or control the experience without resorting to the “change channel” or on/off button.
Paradoxically, television is trying to remain relevant by appropriating the techniques of the computer, while ignoring its unique qualities. In so doing, television news is delegitimizing itself. It deepens the problem by insisting that all stories must have an arc—a beginning, a middle, and an end that is clear and, if possible, carries a touch of inevitability, as great stories often do.
The problem here lies in the difference between literature and journalism. Describing the requirements of drama, Chekhov famously observed that a gun seen on the mantelpiece in Act I had better be fired in Act III. No such requirement applies in the real world; the gun sometimes goes off, sometimes not. In its natural and commendable desire to present the news in a dramatic form, television conflates simplification with clarification, and in doing so it refuses to acknowledge a self-evident truth: that complexity and confusion are often intrinsic to the story being told.
Driven by ever-tougher economic imperatives, seduced by the digital marvels at its disposal, motivated by an enshrined notion of what an audience wants to see, and fearful that nuance and ambiguity will drive that audience away, television news is at war with itself. What it tells is too simple; what it shows is too complicated. Television journalists have debated and agonized about these questions for a long time. I recall one newsroom discussion many years ago in which a colleague concluded, to universal agreement, “Look, you can’t look down on the American people.” But that is exactly what has come to pass.
Marc Kusnetz, a former NBC News producer, is a freelance journalist and a consultant to Human Rights First.
This article originally appeared in Nieman Reports.