Last month, I reported that health misinformation, particularly anti-vaccine conspiracies, is rampant on Facebook worldwide. The problem isn’t confined to one country or platform.
A little more than a week later, the company outlined a plan to curb anti-vaccine content. Facebook announced that groups and pages sharing anti-vaccine misinformation would be excluded from its recommendation algorithm, though it will not remove those groups and pages altogether.
The results of that plan remain to be seen. And in the meantime, bogus medical cures — which Facebook hasn’t taken specific action against — are proliferating.
According to BuzzSumo, an audience metrics tool, hoaxes claiming to cure specific medical ailments are getting massive reach on Facebook. These false claims are posted in a variety of formats, but can be as simple as a text post from a regular user. And because they’re often “zombie claims” — misinformation that doesn’t die out after it’s been debunked — they can continue being shared for years after they were first published.
Below is a chart with other top fact checks since last Tuesday in order of how many likes, comments and shares they got on Facebook, according to data from BuzzSumo and CrowdTangle. Read more about our methodology here.
On March 15, Full Fact debunked a false Facebook post that had more than 60,000 engagements as of publication. In it, a user claimed that stabbing victims should use tampons to stop the bleeding and save their lives.
That claim, which Full Fact said their readers had asked them about, is false. First aid experts told the fact-checker there’s no evidence it would work — and it might even do more harm than good.
But the post, which was just a status update from a regular user rather than a page, still racked up about 55 times more Facebook engagements than the fact check. That’s despite Full Fact’s partnership with Facebook, which enables fact-checkers to decrease the reach of false posts in the News Feed. (Disclosure: Being a signatory of the International Fact-Checking Network’s code of principles is a necessary condition for joining the project.)
And it’s not the only bogus medical rumor to get more reach than a fact check on Facebook.
Last month, Chequeado debunked a viral video that falsely claimed pricking someone’s fingers and ears while they’re having a stroke can save their life. The fact-checker reported that health experts told it there is no scientific basis for the claim, and it flagged the story as false as part of its partnership with Facebook.
That video had been circulating online since at least 2003, when Snopes published a fact check about it. The Spanish fact-checking site Maldito Bulo also debunked the hoax. But the video still racked up more than 500,000 engagements on Facebook — about 165 times more than Chequeado’s debunk.
These kinds of health hoaxes are dangerous. They promote bogus cures in an information environment where health misinformation is known to go viral — often in countries where access to treatment is scarce. The consequences can be dire.
On Facebook, health misinformation is king, and it’s a global problem. Bogus medical cures in particular present a number of challenges for fact-checkers.
First, they often linger on the internet for years in spite of being repeatedly debunked. Chequeado’s story about stroke cures is a good example, as are similar fact checks about everything from using cayenne pepper to stop bleeding to bogus HIV cures in Africa. Since they’re frequently not tied to a specific news event and give users an action item, bogus medical cures have a longer shelf life than most hoaxes.
Second, fact-checking claims about medicine, even legitimate ones, can be tricky. As Africa Check noted in its guide to debunking health misinformation, academic qualifications can be fudged fairly easily online, and several fake academic journals claim to publish real research. Then there’s the fact that a lack of conclusive proof for a treatment doesn’t mean it’s ineffective.
Last week, we reported that two new fact-checking projects are trying to address some of these problems head-on. By crowdsourcing responses from certified scientists, both Metafact and HealthFeedback aim to answer readers’ specific questions about health. That work is then published in a fact-checking format.
That approach could help address specific questions that Facebook users have about purported medical cures. But absent any organized collaboration with tech platforms, it’s unlikely that outlets like Metafact or HealthFeedback will be able to scale their work to the volume of bogus medical cures online. And it’s clear that simply banning recommendations for anti-vaccine content won’t cut down on all the different kinds of health misinformation out there.
Interesting and discouraging. Social media is everyone talking at the same time. The challenge is to learn who is real and who is just noise. That, by the way, is a challenge journalists also face when publishing health care information sourced from “experts.” Electronic and print media have for years been prone to zipping out “latest news” from an “authority,” a researcher, or a clinical study summary. Just on the subject of coffee, chocolate, and fats in food, any week features exciting and often changing insights into the advantages or dangers of one or the other. Deciding what is news versus what is merely “exciting” is a judgment that requires maturity and analysis. Too often writers and editors apply neither. Experts are often driving hidden agendas, researchers are famously in error or publish bad information, and medical journals now report that up to 50% of published papers are incorrect, not peer reviewed as indicated, and part of an industry where professionals publish to maintain or elevate their status. The work is harder than ever for journalism – but that is no excuse to continue writing inflammatory and dramatic health articles that too often are utter nonsense.
Don’t sweat the social media world – it’s a madhouse. Worry about what you write. If you can’t confirm news, it should not run. Is that so hard?