Misinformation has long found a home on Facebook.
In 2018, Facebook referrals were a major driver of traffic for fake and hyperpartisan news sites. Last year, researchers found that about one-quarter of Americans read fake news and one-quarter read fact checks, and that those two groups don't overlap much. That finding was echoed in a study published last week, which found that older Americans disproportionately share both fake news stories and fact checks on Facebook.
We know that misinformation gets plenty of engagement on Facebook. But how does the reach of corresponding fact checks compare?
As Facebook's partnership with fact-checkers enters its third year, measuring its impact has become increasingly important. The program has been around since late 2016 and enables fact-checkers to reduce the reach of false stories, photos and videos in News Feed. But fact-checkers told Poynter last month that they're still unsure of how effective the arrangement is. (Disclosure: Being a signatory of the International Fact-Checking Network's code of principles is a necessary condition for joining the project.)
So starting this week, Poynter will analyze five of the top-performing fact checks on Facebook each week to see how their reach compares to the hoaxes they debunked. And while the numbers show a few successes since last Tuesday, they also reveal that fact checks are still falling short of containing misinformation on the platform.
Below are the top fact checks of the week, ranked by how many likes, comments and shares they got on Facebook, according to data from the audience metrics tool BuzzSumo. None of them address spoken statements, because those aren't tied to a specific URL, image or video that fact-checkers can flag (e.g., "Donald Trump falsely says there's 'never' been so many border apprehensions"). Read more about our methodology here.
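The comparison behind this series boils down to simple arithmetic on engagement counts. As a rough sketch (the function name and data labels below are our own illustration, not Poynter's or BuzzSumo's tooling), it amounts to:

```python
# Illustrative sketch: compare a fact check's engagement (likes +
# comments + shares, as reported by a tool like BuzzSumo) to the
# engagement of the hoax it debunked. Figures come from this article.

def reach_ratio(fact_engagements: int, fake_engagements: int) -> float:
    """Fact-check engagements as a multiple of the hoax's engagements."""
    return fact_engagements / fake_engagements

checks = {
    "PolitiFact (63,000 murders claim)": (16_400, 188),
    "Snopes (Iran $150 billion claim)": (4_500, 188_900),
    "Teyit (minced meat video)": (1_900, 19_800),
    "Lupa (Lula's son's plane)": (1_100, 1_400),
    "AFP (stress optical illusion)": (948, 191_500),
}

for name, (fact, fake) in checks.items():
    ratio = reach_ratio(fact, fake)
    outcome = "outperformed" if ratio > 1 else "trailed"
    print(f"{name}: fact check {outcome} the hoax ({ratio:.2f}x)")
```

By this measure, only the PolitiFact entry below "won" its matchup; the other four fact checks reached a fraction of their hoax's audience.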
1. ‘Claim about 63,000 Americans being killed by illegal immigrants is still wrong’
Fact: 16.4K engagements
Fake: 188 engagements
PolitiFact hit it out of the park with this fact check, amassing tens of thousands of engagements on Facebook — even though the original fake only collected 188.
The article debunked a Facebook post from a user claiming that 63,000 people had been murdered by undocumented immigrants since 2001. The post was accompanied by a photo of the 9/11 terrorist attacks, during which 2,996 people died.
PolitiFact flagged the Facebook post as part of its partnership with the company, according to its fact check. Under that partnership, flagged posts are supposed to show users a warning before they share them, but Poynter was still able to share the false post without any warning.
2. ‘Did Schumer and Pelosi Help Obama Hand $150 Billion to an ‘Enemy of the US’?’
Fact: 4.5K engagements
Fake: 188.9K engagements
It’s been a bad few weeks for the reach of Snopes’ fact checks relative to the hoaxes they debunked. But this one takes the cake.
In a text post published in late December, a Facebook user claimed that Democratic lawmakers Chuck Schumer and Nancy Pelosi helped former president Barack Obama give Iran $150 billion but have refused to grant Trump $5 billion for a wall along the U.S.-Mexico border. Snopes debunked that post Jan. 8 — after it had already amassed thousands of engagements.
In its fact check, Snopes reported that Obama never gave Iran $150 billion; the government simply unfroze Iranian assets as part of the 2015 nuclear deal. Trump himself also cited the number in a tweet in mid-December. Snopes wrote that $150 billion is a high estimate of the assets that Iran got back after the deal, not new cash paid to the government in American dollars.
Snopes' fact check didn't say whether it had debunked the false post as part of the Facebook project. When asked, CEO David Mikkelson told Poynter that Snopes had not submitted the debunk to the platform's fact-checking system. That explains why, despite the debunk, Poynter was still able to share the post on Facebook without receiving a warning.
3. ‘The claim that the minced meat sold at BİM contains additives’
Fact: 1.9K engagements
Fake: 19.8K engagements
A viral false video about minced meat got more than 10 times the engagement of the fact check debunking it this week.
Turkish fact-checker Teyit debunked the false post last Wednesday. Published by a page with more than 5,000 likes, the post claimed a retail company was adding color additives to its minced meat. The nearly four-minute video shows someone handling a package of meat and dunking it in a glass of water to rinse off the alleged additive. Other users even tried the same process for themselves.
But in its fact check, Teyit reported that the meat changed color not because it had additives, but precisely because it was meat. According to the story, the meat’s myoglobin protein structure is rapidly soluble in water, making it turn a whitish color. Additionally, Teyit met with the Ministry of Agriculture and Forestry’s Department of Food Control Laboratories, which said that the color change was not from food coloring.
Teyit debunked the fake viral meat video as part of its fact-checking partnership with Facebook. Poynter was not able to share the post without receiving a warning about the debunk, but Facebook did not append the fact check in a related articles section, as it does for links.
4. ‘It is false that Lula’s son has a $50 million plane paid ‘with the people’s money’’
Fact: 1.1K engagements
Fake: 1.4K engagements
Months after a presidential election that divided the country, Brazilian Facebook continues to be marred by misinformation.
Brazilian fact-checker Agência Lupa debunked a viral post Thursday that claimed the son of former president Luiz Inácio Lula da Silva had a $50 million airplane that was paid for with taxes. A hyperpartisan page posted the image on Facebook earlier that day.
In its fact check, Lupa reported that the false claim about Lula's son's plane has been circulating on the internet since at least 2013. The image posted on Facebook shows a 1983 Gulfstream III, which, according to public ownership records, was never owned by a Brazilian. Wells Fargo owns the plane, Lupa wrote.
Lupa flagged the false post as part of its partnership with Facebook, which decreased the post's future reach. Even so, the post got more engagements than the fact check, and Poynter was still able to share it without a warning.
5. ‘No, this optical illusion was not created by a Japanese neurologist to evaluate your stress’
Fact: 948 engagements
Fake: 191.5K engagements
This fact check from the Agence France-Presse was dwarfed by the viral Facebook post it debunked.
The post, which a user first published in November, claimed that an optical illusion was created by a Japanese neurologist to determine whether a person is stressed. According to the post, if you look at the image and see it moving, you're stressed; if it's still, you're calm.
That's bogus, according to the AFP's fact check — the image wasn't created by a neurologist at all. By doing a reverse image search, AFP traced it to a Ukrainian designer who published it in 2016. AFP found the photo on Shutterstock, a popular stock photo service, and the designer told AFP that he created the optical illusion in Adobe Illustrator and that it was one of his best-selling images.
AFP reported that the false post has been shared in several countries since the original post in November, including the United States, the Philippines, Spain, Turkey and France. The AFP's debunk appears below the original Facebook post and, when Poynter tried to share it, a warning message appeared.