With all of the false and misleading claims in last year’s U.S. elections, there was no shortage of facts or fact-checking — even at the state and local level. But finding those facts was often more of a challenge than it should have been.
Poor promotion and a lack of consistent labels and tags by some news media fact-checkers meant this reporting quickly disappeared — washed off homepages and social media feeds in the fast-moving flow of daily news coverage. That’s a big loss for the public, but also for the news companies that invested precious staff time to do this distinctive, in-depth journalism.
Over the past year, student researchers at the Duke Reporters’ Lab reviewed the work of 37 regional media outlets that fact-checked political claims during the election cycle that ended last November. Our most surprising finding was the significant differences in the ways those news organizations presented and organized their fact checks.
Consider the U.S. Senate race in Ohio, where fact-checkers dug into a claim that the Democratic candidate had wasted $250,000 renovating bathrooms at the governor’s mansion. One ad from the National Republican Senatorial Committee was animated to appear as if it were filmed from the bottom of a flushing commode. The ads hit their mark: two months later, Democrat Ted Strickland was still defending himself from those attacks on the campaign trail. Yet any Ohio voter with access to Google could see that this Republican ad blitz was misleading, assuming they thought to look for it.
The Cincinnati Enquirer had pointed out in August that the bathroom renovations were to a visitor’s center for tourists on the grounds of the mansion, not for the personal use of the former governor and his family, as the ads implied. WEWS-TV (Newsnet5), a Cleveland station that was one of PolitiFact’s state partners in 2016, came to the same conclusion: the NRSC ad “flushes the facts,” its report said, “so we rate it Pants on Fire!”
Searching specifically for “Ted Strickland toilet” quickly turned up both news organizations’ fact checks. That’s great, if you happened to be looking for a fact check of that particular claim. But for Ohioans trying to see which of the many claims in this ad-intensive race were true, the Enquirer’s fact checks were far harder to find than WEWS’s.
There was an election page on cincinnati.com, the Enquirer’s website, and a politics blog too. But we found no one-stop collection of all of the Enquirer’s fact checks. Using the site’s search tool to look for “fact check” turned up some election-related stories, including the one about Strickland’s toilets, but not others. Searching the site for “ad watch,” a label the Enquirer sometimes used, turned up some other fact checks, but not the Strickland one. And searching 10 pages of Google results for “fact check Ohio” turned up none of the newspaper’s election-year fact-checking.
Things were different at WEWS, one of several local TV outlets from the Scripps TV Station Group that served as state affiliates for PolitiFact during the 2016 campaign. Its fact checks were prominently promoted by the well-established, 10-year-old national fact-checking service, with all of the structural consistency and searchability benefits that come with that.
(Disclosure: PolitiFact’s founding editor, Bill Adair, is one of the co-directors of the Reporters’ Lab. He edited this report, but did not supervise this research, and the opinions stated here are the authors’, not Bill’s.)
Ohio was one of at least 21 states the Reporters’ Lab looked at where U.S. voters had a plentiful supply of homegrown, multimedia fact-checking produced by local news organizations. It also was among 10 states where more than one news organization was competing to check the facts. Collectively, these fact-checkers examined the accuracy of more than 1,800 claims by candidates, policymakers and other influential voices in the political process.
The result was a bounty of deep reporting in a variety of media formats for readers, viewers and listeners. But some fact-checking outlets seemed to bury this treasure.
Some state and local fact-checkers, like the Enquirer, did not create even the most basic landing page to collect all of their reporting in one place. And those that did build landing pages missed other opportunities to make the most of their fact checks’ unusually long shelf life. Few outlets offered multiple ways for their audience to navigate their fact-checking collections, such as presenting fact checks by race, topic, candidate or accuracy. And there was limited effort to cross-link or promote those stories when the claims resurfaced during the campaign, as they often did.
All that may sound like a lot of work to news organizations that only did a handful of fact checks during the two-year election cycle. But most of the fact-checkers we looked at were far more productive than that: on average, the independent local fact-checkers we studied produced nearly 30 fact checks each.
These days, Google and Facebook are at least as likely as a news site’s own pages to be the way any of us encounter all those fact checks. And since last year’s election, Google in particular has been making efforts to surface those kinds of articles, in part using data derived from a Reporters’ Lab project called Share the Facts. But news organizations that invest the time in this often difficult reporting can still do some basic rigging of their own to make sure their audiences find it when it’s most useful.
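One piece of that rigging is structured markup. The work around Share the Facts helped seed what is now schema.org’s ClaimReview format, which search engines read to recognize and label fact checks in their results. Here is a minimal sketch of what that markup might look like; the URL, outlet name, claim text and rating scale below are hypothetical placeholders, not any outlet’s real data:

```typescript
// A minimal sketch of schema.org ClaimReview markup for one fact check.
// Everything below (URL, names, claim text, rating scale) is a hypothetical
// placeholder for illustration, not a real page or any outlet's actual data.
const claimReview = {
  "@context": "https://schema.org",
  "@type": "ClaimReview",
  url: "https://example.com/fact-checks/bathroom-renovation-ad", // hypothetical
  datePublished: "2016-08-15",
  author: { "@type": "Organization", name: "Example Gazette" },
  claimReviewed:
    "The candidate wasted $250,000 renovating bathrooms at the governor's mansion.",
  itemReviewed: {
    "@type": "Claim",
    author: { "@type": "Organization", name: "Example Campaign Committee" },
  },
  reviewRating: {
    "@type": "Rating",
    ratingValue: 1, // where 1 is the worst score on this outlet's scale
    bestRating: 5,
    worstRating: 1,
    alternateName: "Misleading", // the human-readable verdict
  },
};

// Embedded in the article page as JSON-LD so crawlers can recognize it:
const jsonLd =
  `<script type="application/ld+json">${JSON.stringify(claimReview)}</script>`;
console.log(jsonLd);
```

A publishing system that already stores the verdict, the speaker and the claim as separate fields could emit a block like this automatically with every fact check.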
Our observations tell us that some fact-checkers are not using basic digital publishing practices to help voters find the facts — practices that may also be of use to the growing ranks of national fact-checkers in the United States and around the world.
Landing Pages
The Topeka Capital-Journal has been fact-checking since 2014, but you wouldn’t know it during last year’s election — not even if you went to its state government page, which is strictly a chronological collection of articles. To find 2016 examples of the Kansas Fact Meter on the Capital-Journal’s site, you’d need to use its search function — though its results are less than clear and often direct users to other archive pages on which an old Kansas Fact Meter story may appear. The site has an inactive landing page for its fact checks that was last updated in January 2016. Various site and web searches turned up at least seven more Fact Meter stories published since then.
Sure, two decades of metrics tell us that only highly motivated users regularly visit secondary or tertiary landing pages on news sites (for instance, archive pages for sub-topics, series and authors). And with an increasing share of clicks coming laterally from social media and search results, landing pages can seem like a downright old-fashioned way to get people to an ongoing series of stories, like fact checks.
But state and local politics is different. Many voters tune in to political debates and attacks far later than the candidates, their partisans and the journalists who cover them do. In 2014, for example, when there was no presidential race generating broad national interest in politics, exit polls from hard-fought Senate races in five states (Alaska, Georgia, Indiana, Kansas and Louisiana) found that more than a third of voters did not make their choice until some point in the final month of the campaign.
That makes collections of accumulated fact checks especially helpful to people trying to decide which candidate on the ballot is most (or least) worthy of their trust. Assuming, that is, that there is a collection of fact checks for those voters to find.
In Minnesota, voters had the benefit of two local media companies fact-checking political statements during the 2016 campaign. But that didn’t mean all of that work was easy to find.
When we Googled “fact check Minnesota” or “fact checking Minnesota,” for example, the first screen’s worth of search results included a link to a page of Minnesota-related articles from FactCheck.org, a national fact-checking operation based about 1,200 miles away in Philadelphia.
The first page of results also included links to a collection of “fact checks” on the blog of a Republican political operative from Minnesota, a 2015 PolitiFact item about the state’s governor, a 2014 news release from the state Republican Party and several national fact checks from the Associated Press that appeared on Minnesota Public Radio’s website.
Going one click further into the search results, we did find a link to a page collecting all of the “Reality Check” segments from WCCO-TV, the CBS affiliate in Minneapolis. WCCO has been airing Reality Check segments since 2012, so its digital team might reasonably hope to show up a bit higher in Google. But at least it did better than rival station KSTP-TV (5 Eyewitness News). That ABC affiliate in nearby St. Paul has been fact-checking just as long as its competitor, using a rating system it calls the “Truth Test.” But it has no one-stop collection of all its Truth Test stories, and none of the station’s reports showed up in more than 20 pages of Google results.
Of course, users who knew to include “Truth Test” or “Reality Check” in their search would have found either station’s stories much more quickly. And the “check” in Reality Check probably gave the CBS station an edge when we searched for “fact checks.” But so did the station’s landing page.
How do we know it was the landing page and not just the Truth Test’s “fact”-less and “check”-less name?
Well, we also tested another “Truth Test,” which happens to be the name of a long-standing fact-checking project at NBC affiliate KUSA-TV (9News) in Denver. The Colorado capital had three local TV stations doing fact checks in 2016, and all three had landing pages. When we ran a Google search for “fact check Colorado,” the KUSA Truth Test did well, appearing on the second page of results. And PolitiFact Colorado, which was operated by KMGH-TV (7News Denver), was the first link on the first page.
But the Colorado test is also a reminder that landing pages, while helpful, are not all it takes. The “Reality Check” segment from KCNC-TV (CBS4) has a perfectly fine landing page too, but it did not show up at all in our search results.
So what’s to be done beyond that?
A Tool, Not an Archive
The very term “archive” suggests a dank collection of moldering material. Some people may be willing to rummage through the crates and files, but they’re not for everyone. The same is true of most online news collections: reverse-chronological lists whose only interactive feature is, at best, a “load more” button at the end of the scroll.
PolitiFact started in 2007 with a different premise. Its inventors structured the site from the start to let readers zig and zag, following their curiosity to fact checks by topic, by rating and by person or organization. And because PolitiFact treats its fact checks as data, not just stories, its people and organization pages can automatically generate “scorecards” that are frequently cited and linked to in other media (sometimes critically). Yet few others have copied PolitiFact’s approach, even though it offers alternate doorways into the reporting for both new and returning readers.
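To illustrate why “fact checks as data” pays off, here is a minimal sketch of how a scorecard might be tallied from structured records. The record shape, field names and rating labels are our own assumptions for illustration, not PolitiFact’s actual system:

```typescript
// A minimal sketch of generating a "scorecard" from fact checks stored as
// structured data. The FactCheck shape and sample ratings are assumptions
// for illustration, not PolitiFact's actual system.
type Rating =
  | "True" | "Mostly True" | "Half True"
  | "Mostly False" | "False" | "Pants on Fire";

interface FactCheck {
  speaker: string; // the person or organization being checked
  topic: string;
  rating: Rating;
  url: string;
}

// Tally one speaker's ratings, e.g. to render a scorecard page.
function scorecard(checks: FactCheck[], speaker: string): Map<Rating, number> {
  const tally = new Map<Rating, number>();
  for (const check of checks) {
    if (check.speaker !== speaker) continue;
    tally.set(check.rating, (tally.get(check.rating) ?? 0) + 1);
  }
  return tally;
}
```

The same records can just as easily be filtered by topic, race or rating, which is what gives readers several doorways into the archive instead of one long chronological list.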
Among the state and local fact-checkers that are not PolitiFact affiliates, the Cedar Rapids Gazette in Iowa is one of the few that has learned from the larger site’s approach. The Gazette’s fact-checking collection does not have as many sorting features as PolitiFact and its state affiliates. But it is more dynamic than typical fact-checking pages. The Gazette’s collection displays a color-coded letter grade, from A (green) to F (red), for each fact check, as well as a helpful summary of its rating system and a clickable tally of the number of grades it has issued since last summer.
That explanation is worth a brief note here, too. Explaining your rating system may not do much to promote your fact-checking, but it is an important form of transparency, one encouraged by the Code of Principles of the International Fact-Checking Network, which is based at the Poynter Institute. Half of the state and local fact-checkers we looked at in the 2016 election cycle were PolitiFact affiliates whose templates included those explanations. But a majority of the rest did not.
Will such efforts turn a collection of fact checks into a traffic generator? Probably not. But they may give users who do land on a landing page a reason to stay and click for a while, and perhaps even to return.
Other steps fact-checkers can take to elevate and promote their work include:
Navigation and context: Landing pages don’t need to be fancy, with special layouts or logos. Simply tagging articles “fact check” in most digital publishing systems is enough to create a collection of fact checks. And those tags can also power basic navigation, such as links to additional fact checks at the bottom of an article (see the sketch after this item).
One potentially effective form of navigation and promotion is to link back to earlier fact checks in other stories when and where they are most relevant. As we’ve noted, fact checks have unusual staying power, in part because politicians often repeat talking points and lines of attack. Referring back to previous fact checks in context, in text (with links!) or on the air (mention that helpful landing page!), is a good way for a news organization to keep holding politicians accountable for repeated falsehoods, and to call attention to the fact that there is more fact-checking to be had. But we found relatively few examples of that. So despite their relatively long shelf life, most news organizations let fact checks go stale like an open box of crackers in the pantry.
The original purpose of the Share the Facts widget the Reporters’ Lab developed was to make it easy to embed relevant fact checks in context — like embedding a tweet or a YouTube clip. But it doesn’t take more than a simple link to be effective.
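As a sketch of how little machinery this takes, here is what tag-driven “more fact checks” navigation might look like. The article shape and the tag name are assumptions for illustration, not any particular CMS’s API:

```typescript
// A minimal sketch of tag-driven "more fact checks" navigation. The Article
// shape and the "fact check" tag are assumptions for illustration; most
// publishing systems expose something equivalent.
interface Article {
  title: string;
  url: string;
  tags: string[];
  published: Date;
}

// Return the most recent fact checks other than the current article,
// suitable for a "More fact checks" box at the bottom of a story or
// for populating a landing page feed.
function relatedFactChecks(
  all: Article[],
  current: Article,
  limit = 5,
): Article[] {
  return all
    .filter((a) => a.url !== current.url && a.tags.includes("fact check"))
    .sort((a, b) => b.published.getTime() - a.published.getTime())
    .slice(0, limit);
}
```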
Cross promotion: PolitiFact had as many as 18 state affiliates during the 2016 race. But readers, viewers and listeners of those news partners might not have known it, because many affiliates did not promote the connection. WEWS in Cleveland did, re-posting the text of its PolitiFact articles on its own website. The News & Observer in Raleigh took a different tack: rather than re-posting full articles, it wrote short, bloggy summaries of its fact checks and linked to PolitiFact North Carolina for the full story.
This kind of cross-promotion is important for all fact-checkers on all platforms. It tells readers, viewers and listeners that there’s more fact-checking to be found online.
Socializing: Social media is more than a way to drive traffic. It’s also a platform fact-checkers can use to make their reporting, past and present, part of the political conversation in their community. Social media also can reach specific audiences with thought-provoking teases tailored to overcome reflexive, partisan reactions to a blunt, conclusive headline. (Note: Thought and care are required here to balance the desire to reach broader swaths of the electorate against the need to avoid wording that can unintentionally spread misinformation.) And social channels offer ways for local media to connect a helpful, well-understood description like “fact check” (lowercase) to a more distinctive, “branded” label (“We fact-checked that new ad in our latest #RealityTest”).
Many of the news outlets we looked at did not have social channels dedicated specifically to fact-checking. Instead, they relied on general news and political feeds, such as @azcpolitics on Twitter from the Arizona Republic’s AZCentral.com. In other cases, individual reporters used personal feeds to promote their fact checks. That’s an effective approach in broadcast, where individual journalists often cultivate bigger social followings than their unseen, unheard colleagues who work in other formats.
Any of the above can work. The important thing is to be out there, talking about the facts.
X Marks the Spot
There’s an element of Digital 101 to all of the advice we offer here. We also know that some of our suggestions, while obvious, are still not routinely followed by many fact-checking news outlets. The reasons are many. Sometimes it’s the limitations of the newsroom’s publishing systems. At other news organizations, an election-season fact-checking effort may simply not rank high on the company’s long list of broader digital programming priorities and needs. At the local level, many news companies rely heavily on a handful of digital producers, at best. And depending on reporters and editors to consistently produce, label and tag their stories is a challenge as old as digital news. (“We’re going to automate all that … right? Right?”)
But dedicating limited reporting staff to fact-checking is a big investment — and that hard work should be presented and touted in ways that are worthy of its potential impact.
The growing number of state and local fact-checkers in the United States shows how this journalism can distinguish a news outlet’s election coverage. That’s especially true in highly competitive local media markets. The positions and reliability of elected officials and political candidates at that level are often big question marks for voters. And the durability of each individual fact check throughout a campaign cycle means this reporting can play a significant role in local politics.
Assuming, that is, people know it exists.
This report is based on contributions from Duke Reporters’ Lab student researchers Jamie Cohen, Julia Donheiser, Amanda Lewellyn, Lizzy Raben, Asa Royal, Hank Tucker and Sam Turken.