Here’s what the “post-fact” literature has right: the Internet allows us to see what other people actually think. This has turned out to be a huge disappointment. When anyone can say anything they like, we can’t even pretend most of us agree on the truth of most assertions any more.
The post-fact literature is built in part on nostalgia for the world before people who believe in Bigfoot showed up in the public sphere, for the days when Newsweek reflected a moderately liberal consensus without also providing a platform for orthographically challenged wingnuts to rant about the President. People who want those days back tell themselves (and anyone else who will listen) that they don’t want to impose their views on anybody. They just want agreement on the facts.
But what would that look like, an America where there was broad agreement on the facts? It would look like a public discussion limited to the beliefs held by straight, white, Christian men. If the views of the public at large didn’t hew to the views of that group, the result wouldn’t be agreement. It would be argument.
Argument, of course, is the human condition, but public argument is not. Indeed, in most places for most of history, publicly available statements have been either made or vetted by the ruling class, with the right of reply rendered impractical or illegal or both. Expansion of public speech, for both participants and topics, is generally won only after considerable struggle, and of course any such victory pollutes the sense of what constitutes truth from the previous era, a story that runs from Martin Luther through Ida Tarbell to Mario Savio, the drag queens outside Stonewall, and Julian Assange.
* * *
There’s no way to get Cronkite-like consensus without someone like Cronkite, and there’s no way to get someone like Cronkite in a world with an Internet; there will be no more men like him, because there will be no more jobs like his. To assume that this situation can be reversed, and that everyone else will voluntarily sign on to the beliefs of some culturally dominant group, is a fantasy. To assume that they should, or at least that they should hold their tongues when they don’t, is Napoleonic in its self-regard. Yet this is what the people who miss the clarity of the old days are longing for.
Seeing claims that the CIA staged the 9/11 attacks or that oil is an unlimited by-product of volcanism is enough to make the dear dead days of limited public speech seem like a paradise, but there are compensating virtues in our bumptious public sphere.
Consider three acts of mainstream media malfeasance unmasked by outsiders: Philip Elmer-DeWitt’s 1995 Time magazine cover story that relied on faked data; CBS News’s 2004 accusations against the President based on forged National Guard memos; and Jonah Lehrer’s 2012 recycling and plagiarism in work he did for the New Yorker and Wired. In all three cases, the ethical lapses were committed by mainstream journalists and unmasked by outsiders working on the Internet, but with very different responses from the institutions that initially published the erroneous material.
In Elmer-DeWitt’s case, he was given what seemed to be an explosive study on Internet pornography, one that was in fact largely faked, and that he and the Time staff did not vet carefully. This became the basis for a Time cover story, his first. But the conclusions he drew seemed fishy, and a distributed fact-checking effort formed in response, largely organized on the digital bulletin board system called Usenet. It quickly became apparent that the research was junk; that the researcher who had given the report to Elmer-DeWitt was an undergraduate who had faked the data; that the professors listed as sponsors had had little to do with it; and so on.
Elmer-DeWitt apologized forthrightly: “I don’t know how else to say it, so I’ll just repeat what I’ve said before. I screwed up. The cover story was my idea, I pushed for it, and it ran pretty much the way I wrote it. It was my mistake, and my mistake alone. I do hope other reporters will learn from it. I know I have.”
Almost no one saw this apology, however, because he only said it online; the correction run by Time sought to downplay, rather than apologize for, misleading its readers, even though the core facts reported in the story were faked: “It would be a shame, however, if the damaging flaws in [the] study obscured the larger and more important debate about hard-core porn on the Internet.”
In 1995, Time could count on there being very little overlap between its readership and the country’s Internet users, so Elmer-DeWitt’s ethical lapse and subsequent apology could be waved away with little fear that anyone else could dramatize the seriousness of the article’s failings.
Contrast the situation a decade later. In 2004, CBS News aired a “60 Minutes Wednesday” story about President Bush’s time in the National Guard. Like the Elmer-DeWitt story, the CBS story was based on faked documents; like the Elmer-DeWitt story, the forgery was discovered not by CBS itself or another professional media outlet, but by media outsiders working on the Internet; like the Elmer-DeWitt story, CBS spent most of its energy trying to minimize its lapse.
Unlike the Elmer-DeWitt story, however, the strategy didn’t work. Charles Johnson, blogging at Little Green Footballs, produced an animated graphic demonstrating that the nominally typewritten documents from the early 1970s had actually been produced using the default font in Microsoft Word. By 2004, Internet use had become so widespread that the Time magazine tactic of writing off Internet users as a cranky niche was ineffective; Johnson’s work was so widely discussed that CBS couldn’t ignore it. When it finally responded, CBS admitted that the documents were forged, that it had not checked their authenticity carefully enough, that its defense of the reporters involved had compounded the error, and that the lapse was serious enough to constitute a firing offense for the most senior people involved, including Mary Mapes; Dan Rather resigned after some delay.
A more recent example of this pattern, almost a decade after the National Guard memos, was the science writer Jonah Lehrer’s use of recycled, plagiarized, or fabricated material, including, most famously, invented quotes from Bob Dylan. Again, journalistic ethics were breached in mainstream publications — in Lehrer’s case, in writings for Wired and the New Yorker, and in his book “Imagine.” His lapses were uncovered not by anyone at Condé Nast, however. His most serious lapse was uncovered by Michael Moynihan, a writer and editor at Reason and Vice, who published his discovery of the Dylan fabrication in Tablet, an online-only magazine of Jewish life and culture. Moynihan’s revelations, the most damning of the criticisms Lehrer was then facing, precipitated Lehrer’s resignation from the New Yorker.
The Lehrer example completes a pattern we might call “after-the-fact checking”: visible public scrutiny of journalistic work after it is published. After-the-fact checking is not just knowledgeable insiders identifying journalistic lapses; that has always happened. The new pattern is those insiders being able to identify one another and collaborate on public complaint, and the concomitant weakening of traditional media’s strategies for minimizing the effects of such lapses.
The difference between Elmer-DeWitt and Lehrer isn’t that the latter’s lapses were worse; it’s that the ability to hide such lapses has shrunk. The nominal ethics of journalism remain as they were, but the mechanisms of observation and enforcement have been transformed as the public’s role in the landscape has moved from passive to active, and as the kind of self-scrutiny the press is accustomed to gives way to considerably more persistent and withering after-the-fact checking.
* * *
“Truth Lies Here” and related laments have correctly identified the changes in the landscape of public speech, but often misdiagnose their causes. We are indeed less willing to agree on what constitutes truth, but not because we have recently become pigheaded, naysaying zealots. We were always like that. It’s just that we didn’t know how many other people were like that as well. And, as Ben McConnell and Jackie Huba put it long ago, the Internet is a truth serum.
The current loss of consensus is a better reflection of the real beliefs of the American polity than the older centrism was. There are several names for what constitutes acceptable argument in a society — the Overton Window, the Sphere of Legitimate Controversy — but whatever label you use, the range of things people are willing to argue about has grown.
There seems to be less respect for consensus because there is less respect for consensus. This change is not good or bad per se — it has simply made agreement a scarcer commodity across all issues of public interest. The erosion of controls on public speech has enabled Birthers to make their accusations against the President public; it has also allowed newly emboldened groups — feminists, atheists, Muslims, Mormons — to press their issues in public, in opposition to traditional public beliefs, a process similar to gay rights activism post-Stonewall, but now faster and more national in scale.
There’s no going back. Journalists now have to operate in a world where no statement, however trivial, will be completely secure from public gainsaying. At the same time, public production of speech, not just consumption, means that the policing of ethical failures has passed out of the hands of the quasi-professional group of journalists employed by mainstream outlets, and has become another form of public argument.
This alters the public sphere in important ways.
The old days, when marginal opinions meant marginal availability, have given way to a world where all utterances, true or false, are a click away. Judgment about legitimate consensus is becoming a critical journalistic skill, one that traditional training and mores don’t prepare most practitioners for.
Journalists identify truth by looking for consensus among relevant actors. For the last two generations of journalism, the emphasis has been on the question of consensus; the question of who constituted a relevant actor was largely solved by scarcity. It was easy to find mainstream voices and hard to find marginal or heterodox ones. With that scarcity undone, all such consensus will be destroyed unless journalists start telling the audience which voices aren’t worth listening to as well.
A world where all utterances are putatively available makes “he said, she said” journalism an increasingly irresponsible form, less a way of balancing reasonable debate and more a way of evading the responsibility for informing the public. Seeking truth and reporting it is becoming less about finding consensus, of which there is simply less in the world, and more about publicly sorting the relevant actors from the irrelevant ones. Journalists can no longer fall back on “experts,” as if every professor or researcher were equally trustworthy.
This is destroying the nominally neutral position of many mainstream outlets. Consider, as an example, Arthur Brisbane’s constitutional inability, as public editor of The New York Times, to process the universal public disdain for his arguments against fact-checking politicians. His firm commitment to avoiding accusations of partisanship, even at the expense of accuracy, helped raise the visibility of the fact-checking movement pioneered by PolitiFact and its peers during the 2012 Presidential campaign. These fact-checking services have now become a new nexus of media power in the realm of political speech.
Yet Brisbane is onto something, though it may have more to do with self-preservation than with commitment to truth: a world where even mainstream news outlets tell their readers when politicians lie, or publicly assess various speakers’ relevance on any given issue, is a world where neither powerful public actors nor advertisers will be automatically willing to trust, or even cooperate with, the press.
Even as the erosion of consensus makes for an unavoidable increase in oppositional reporting, it also means journalists now face far more scrutiny from their audience than from their employers or peers. Trust in the press has fallen precipitously in the last generation, even as the press itself has increasingly taken on the trappings of a profession.
One possible explanation is that what pollsters and respondents characterized as “trust” was really scarcity — like the man with one watch, a public that got its news from a politically narrow range of outlets may have been more willing to regard those reinforced views as an accurate picture of the world. Since Watergate, however, increasingly partisan campaigning and governance, the resulting lack of shared outlook among existing news producers, and the spread of new, still more partisan producers may have made this sort of trust impossible.
There’s no going back here either. Each organization will have to convince its audience that it is trustworthy, without being able to rely on residual respect for any such entity as “the press.” Any commitment to ethics will involve not just being more responsive to outsiders’ post-hoc review, but being more willing to attack other outlets in public for ethical lapses, and more ready to defend their own internal policies publicly, rather than simply regarding ethical lapses as a matter for internal policing.
The philosophy of news ethics — tell the truth to the degree that you can, fess up when you get it wrong — doesn’t change in the switch from analog to digital. What changes, enormously, are the individual and organizational adaptations required to tell the truth without relying on scarcity, and to hew to ethical norms without the ability to use force.
This will make for a far more divisive public sphere, a process that is already under way. It’s tempting to divide these changes into Win-Loss columns to see whether this is a change for the better or the worse — Birthers bad, New Atheists good (re-label to taste) — but this sort of bookkeeping is a dead end. The effects of digital abundance are not trivially separable — the Birthers and the New Atheists used similar tools and techniques to enter the public sphere, as did the Tea Party and Occupy Wall Street. More importantly, the effects are not reversible. Even if we concluded that the collapse of moderate centrism as a neutral position was bad for the U.S., there would be no way to reverse the fortunes of the house organs for that philosophy.
Now, and from now on, journalists are going to be participants in a far more argumentative sphere than anything anyone alive has ever seen. The question for us is not whether we want this increase in argumentation — no one is asking us, and there is, in fact, no one who could ask us — but rather how we adapt ourselves to it as it unfolds. And the two tools we’re most practiced at using — scarcity of public speech, and force applied to defectors from mainstream consensus — are getting less viable every day.
This essay is part of a larger work on digital ethics to be published by Poynter and CQ Press. These ideas will be presented during a symposium in New York next week at the Paley Center for Media, in partnership with craigconnects, the Web-based initiative created by Craig Newmark. Free tickets are available. The event will also be live streamed on Poynter.org, Tuesday, Oct. 23.
Correction: This post originally referenced Mario Silva instead of Mario Savio.