The number of fact-checking outlets around the world has grown to 188 in more than 60 countries amid global concerns about the spread of misinformation, according to the latest tally by the Duke Reporters’ Lab.
Since the last annual fact-checking census in February 2018, we’ve added 39 more outlets that actively assess claims from politicians and social media, a 26% increase. The new total is also more than four times the 44 fact-checkers we counted when we launched our global database and map in 2014.
Globally, the largest growth came in Asia, which went from 22 to 35 outlets in the past year. Nine of the 27 fact-checking outlets that launched since the start of 2018 were in Asia, including six in India. Latin American fact-checking also saw a growth spurt in that same period, with two new outlets in Costa Rica, and others in Mexico, Panama and Venezuela.
The actual worldwide total is likely much higher than our current tally. That’s because more than a half-dozen of the fact-checkers we’ve added to the database since the start of 2018 began as election-related partnerships that involved the collaboration of multiple organizations. And some of those election partners are discussing ways to continue or reactivate that work, either together or on their own.
Over the past 12 months, five separate multimedia partnerships enlisted more than 60 different fact-checking organizations and other news companies to help debunk claims and verify information for voters in Mexico, Brazil, Sweden, Nigeria and the Philippines. And the Poynter Institute’s International Fact-Checking Network assembled a separate team of 19 media outlets from 13 countries to consolidate and share their reporting during the run-up to last month’s elections for the European Parliament. Our database includes each of these partnerships, along with several others, but not each of the individual partners. And because they were intentionally short-run projects, three of these big partnerships appear among the 74 inactive projects we also document in our database.
Politics isn’t the only driver for fact-checkers. Many outlets in our database are concentrating efforts on viral hoaxes and other forms of online misinformation — often in coordination with the big digital platforms on which that misinformation spreads.
We also continue to see new topic-specific fact-checkers such as Metafact in Australia and Health Feedback in France, both of which launched in 2018 to focus on claims about health and medicine for a worldwide audience.
(Here’s how we decide which fact-checkers to include in the Reporters’ Lab database.)
Fact-Checkers by Continent Since February 2018
Africa: 4 to 9
Asia: 22 to 35
Australia: 3 to 5
Europe: 52 to 61
North America: 53 to 60
South America: 15 to 18
TRACKING THE GROWTH
As we noted above, elections are not the only draw for aspiring fact-checkers; many outlets are focusing on viral hoaxes and other online misinformation. And the big digital platforms on which that misinformation spreads are also providing incentives.
In one such effort, the Reporters’ Lab worked with Google and Schema.org to develop ClaimReview, an open-source tagging system for fact-checks. Google, Microsoft’s Bing, Facebook and YouTube use this system to help identify and showcase fact-checkers’ work in their news feeds and search results, a process that generates traffic and attention for the fact-checkers. It also provides data that is powering experiments in live, real-time fact-checks that can be delivered to users automatically. (Disclosure: Google and Facebook are funders of the Reporters’ Lab.)
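For readers curious about what that tagging looks like in practice, here is a rough sketch of a ClaimReview record built with the field names defined by the schema.org ClaimReview type. The outlet, URLs, speaker, claim and rating below are invented for illustration; fact-checkers typically publish this kind of record as JSON-LD on the page that carries the fact-check.

```python
import json

# A minimal, hypothetical ClaimReview record using schema.org field names.
# Every name, URL and date here is invented for illustration only.
claim_review = {
    "@context": "https://schema.org",
    "@type": "ClaimReview",
    "url": "https://example-factchecker.org/fact-checks/mayor-budget-claim",  # hypothetical fact-check page
    "datePublished": "2019-06-10",
    "author": {
        "@type": "Organization",
        "name": "Example Fact-Checker",  # hypothetical outlet
    },
    "claimReviewed": "The city budget doubled last year.",  # hypothetical claim being checked
    "itemReviewed": {
        "@type": "Claim",
        "author": {"@type": "Person", "name": "Jane Doe"},  # hypothetical speaker
        "datePublished": "2019-06-01",
    },
    "reviewRating": {
        "@type": "Rating",
        "ratingValue": 2,
        "bestRating": 5,
        "worstRating": 1,
        "alternateName": "Mostly False",  # the outlet's own rating label
    },
}

# Print the JSON-LD that would be embedded on the fact-check page
# (usually inside a <script type="application/ld+json"> tag) so search
# engines and platforms can find and display the fact-check.
print(json.dumps(claim_review, indent=2))
```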
Another driver: Facebook. It has recruited independent fact-checking partners around the world to help identify misinformation on its platforms. The social network began that effort in late 2016 with help from the Poynter Institute’s IFCN.
Meanwhile, YouTube gave fact-checking a boost in India when it started putting fact-checks at the top of its search results, which contributed to a surge of new outlets in that country. Now India has 11 entries in our database, six of which launched since our February 2018 census. And it’s likely there are others to add in the next few weeks.
KINDS OF FACT-CHECKERS
A bit more than half of the fact-checkers are part of a media company (106 of 188, or 56%). That percentage has been dropping over the past few years, mostly because of the changing business landscape for media companies in the United States. In our 2018 census, 87% of the U.S. fact-checkers were connected to a media company (41 out of 47). Now it’s 65% (39 out of 60). In other words, as the number of U.S. fact-checkers has grown, a smaller share of them have ties to media companies.
Among fact-checkers in the rest of the world, the media mix remains about half and half (67 out of 128, or 52% — very close to the 54% we saw in 2018).
The fact-checkers that are not part of a larger media organization include independent, standalone organizations, both for-profit and non-profit (the definitions of these legal and economic entities vary greatly from country to country). Some of these fact-checkers are subsidiary projects of bigger organizations that focus on civil society and political accountability. Others are affiliated with think tanks and academic institutions.
Among the recent additions is the journalism department at the University of the Philippines’ College of Mass Communication, which coordinated Tsek.ph, a political fact-checking partnership that also involved two other academic partners.
In the United States, the Duke Reporters’ Lab joined forces last year with PolitiFact’s North Carolina partner, The News & Observer in Raleigh, to report and freely distribute fact-checks to other media across the state. Two of PolitiFact’s other recent local news partners are affiliated with academic institutions, too: West Virginia University’s Reed College of Media and the University of Missouri’s journalism program. The Missouri School of Journalism has a similar link to KOMU-TV, a local NBC affiliate in Columbia whose investigations unit did some fact-checking of its own during the 2018 midterm elections.
RATINGS
About 70% of the fact-checkers (131 of 188) have well-defined rating systems for categorizing the claims they investigate, similar to what we’ve seen in past years.
As usual, we found many of the rating systems to be entertaining. One of our new favorites comes from Spondeo Media in Mexico, which launched in December. It supplements a basic, four-point, true-to-false scale with a mascot: NETO, a cartoon lie detector who smiles and jumps for joy at true claims but gets steamed over false ones. Another, India Today Fact Check, rates claims on a scale of one to three animated crows, along with a slogan in Hindi: “When you lie, the crow bites” (also the title of a popular movie: “Jhooth bole kauva kaate”).
We decided to time this year’s fact-checking census to correspond with the sixth annual Global Fact Summit, which begins next week in Cape Town, South Africa. About 250 attendees from nearly 60 countries are expected at this year’s gathering, yet another measure of fact-checking’s continued growth: that’s five times the number who attended the first Global Fact in London in 2014.
Joel Luther, Share the Facts Research and Outreach Coordinator at the Duke Reporters’ Lab, and former student researcher Daniela Flamini (now an intern at the IFCN) contributed to this report.
FOOTNOTE: ANOTHER WAY TO COUNT FACT-CHECKERS?
A challenge we face each time the Duke Reporters’ Lab conducts its annual fact-checking census is that the final tally depends so much on when we happen to discover these outlets. Our counting also depends on when fact-checkers come and go, especially short-term, election-focused projects that last several months. If a fact-checker was hard at work most of the year covering a campaign but closed up shop before we took our census, it still gets counted, but on our list of inactive projects.
That inactive list is an interesting trove of good ideas for other fact-checkers to mine. It also provides an entirely different way for us to tally fact-checkers: by counting all the projects that were active at some point during the year, not just the ones still active when we take our annual snapshot.
This approach might better showcase the year in fact-checking. It also would show that fact-checking has been growing even faster than we thought.
Here’s a chart that compares the number of fact-checkers we know were active in a given year, including those that ultimately closed down, with the subsequent census count for that year.
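To make the alternative tally concrete, here is a minimal sketch of how such a count could work, assuming each project in the database has a recorded launch date and, if it has closed, an end date. The project names and dates are hypothetical.

```python
from datetime import date

# Hypothetical database entries: each project has a start date and,
# if it has shut down, an end date (None means still active).
projects = [
    {"name": "Example Election Check", "start": date(2018, 3, 1), "end": date(2018, 11, 15)},
    {"name": "Example Health Check", "start": date(2017, 6, 1), "end": None},
]

def active_in_year(project, year):
    """Return True if the project was active at any point in the given calendar year."""
    year_start, year_end = date(year, 1, 1), date(year, 12, 31)
    started_by_year_end = project["start"] <= year_end
    ended_after_year_start = project["end"] is None or project["end"] >= year_start
    return started_by_year_end and ended_after_year_start

# Count every project that overlapped 2018 at all, not just those
# still running when the census snapshot was taken.
count_2018 = sum(active_in_year(p, 2018) for p in projects)
print(f"Fact-checkers active at some point in 2018: {count_2018}")
```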
There are still reasons for the Reporters’ Lab to keep counting fact-checkers the way we have since 2014. For one, academic researchers and the experiments we and others want to run with real-world fact-checkers depend on current lists and counts of serious, active fact-checking projects.
And yet it’s still great to see how fast fact-checking is growing, even faster than we sometimes thought.
(The small print for anyone who’s fact-checking me: The adjusted numbers shown here count any fact-checker in our database that was active at some point during the given year. Most of our census reports were meant to count the previous year’s activity; for example, our February 2018 census appears in this chart as our count of 2017 fact-checkers, even though some of those 2017 fact-checkers were already listed as inactive by the time that census was published. The number shown for 2018 is the 16-month 2018-19 figure we are releasing in this report. You also might note that some other numbers here are slightly off from data we’ve previously shared. The main reason is that this proposed form of counting depends on knowing the dates each project began and ended, and in a handful of cases, we do not.)