In the early days of the pandemic, a Facebook page for Earthley promised an easy way to avoid getting COVID-19. The wellness company promoted a vitamin D cream and an elderberry elixir to strengthen the body’s immune response and “help fight off the possibility of you or your family getting the coronavirus.”
The post appeared to violate Facebook’s policy for false claims of coronavirus treatments and cures. It was serious enough that the Federal Trade Commission mentioned it in a warning letter amid the agency’s crackdown on coronavirus scams.
Nearly one year later, Earthley is still spreading false information about COVID-19 and the vaccines — despite Facebook’s new rules against that content.
The page has published out-of-context news stories about vaccine side effects and skirted moderation by misspelling the word “vaccine” in posts questioning whether vaccination is safe. We reached out to Earthley for a comment, but we haven’t heard back.
“It doesn’t take an advanced, opaque algorithm like Facebook likes to use to predict this behavior,” said John Gregory, deputy health editor at NewsGuard, a firm that tracks online misinformation. “The pages we’ve seen spreading these myths … they are pages that have been spreading health misinformation in most cases for years.”
Facebook’s strategy for tamping down misinformation about vaccines is to single out certain categories of anti-vaccine claims, rather than taking a broad approach. The result is that many types of false or exaggerated claims against vaccines are allowed to remain on the platform.
As Facebook faces criticism from both parties in Congress, it is reluctant to take additional action that could be seen as a violation of free speech.
“People often say things that aren’t verifiably true, but that speak to their lived experiences, and I think we have to be careful restricting that,” CEO Mark Zuckerberg said during a March 25 House hearing.
What Facebook promised
From the outset of the pandemic, Facebook promised a tough approach to coronavirus misinformation — one that didn’t solely rely on findings from its third-party fact-checking partners, including PolitiFact.
Facebook issued a broad prohibition in January 2020 on false claims or conspiracy theories about the coronavirus “that could cause harm to people who believe them.”
As vaccinations began in the U.S. in December, the company announced an additional ban on coronavirus vaccine claims “that have been debunked by public health experts.” Almost two months later, the company pledged to take down more specific claims. The full list includes posts claiming the vaccines “kill or seriously harm people” or contain “anything not on the vaccine ingredient list.”
Facebook says it has removed millions of posts that violate those policies since December, “including 2 million since February alone,” spokesperson Kevin McAlister told PolitiFact.
The company uses artificial intelligence to find false claims about the coronavirus and track copies of those claims when they are shared, wrote Guy Rosen, Facebook’s vice president of integrity, in a March 22 blog post. “As a result, we’ve removed more than 12 million pieces of content about COVID-19 and vaccines.”
But Americans who logged on to Facebook and Instagram, its sister platform, over the past few months may still have seen posts that appear to violate those terms. The posts include:
- The COVID-19 vaccine could lead to prion diseases, Alzheimer’s, ALS and other neurodegenerative diseases. (Pants on Fire)
- “If you take the vaccine, you’ll be enrolled in a pharmacovigilance tracking system.” (False)
- There are nanoparticles in the COVID-19 vaccine that will help people “locate you” via 5G networks. (Pants on Fire)
PolitiFact fact-checked those posts as part of its partnership with Facebook to combat false news and misinformation. When we rate a post as false or misleading, Facebook reduces its reach in the News Feed and alerts users who shared it. Unless the posts violate other policies, they remain on the platform. (Read more about our partnership with Facebook.)
PolitiFact is one of 10 news organizations in the U.S. alone that fact-check false and misleading claims on Facebook, a program that began in December 2016. Facebook’s moderation policies, including those that prohibit misinformation about COVID-19 vaccines, are independent of its fact-checking program.
Facebook’s rules against COVID-19 vaccine misinformation were created to address the most extreme cases of misinformation: claims that the vaccine kills you, causes autism or infertility, changes your DNA or “turns you into a monkey.” After we sent Facebook the posts we found about a “pharmacovigilance tracking system” and 5G nanoparticles, the company removed them for violating its policies against COVID-19 and vaccine misinformation.
But that kind of enforcement has been uneven.
“Our enforcement isn’t perfect, which is why we’re always improving it while also working with outside experts to make sure that our policies remain in the right place,” McAlister said.
Policy creates gray area
Some of the most popular anti-vaccine posts are first-person testimonials about purported side effects.
In December, when vaccines from Pfizer-BioNTech and Moderna were approved for emergency use, videos emerged on Facebook claiming to show a nurse passing out after getting a shot (even though she gets dizzy with any kind of pain) and a nurse saying she developed Bell’s Palsy (even though there was no record of such a reaction in Tennessee). Other users have made unproven claims that the vaccines caused them to shake and convulse. (Public health officials say there’s no evidence to suggest such a connection.)
Those viral videos were addressed through Facebook’s fact-checking partnership with PolitiFact. The effect of a negative rating is a demotion in News Feed and, in some cases, a warning label. Although Facebook doesn’t allow posts that falsely claim the COVID-19 vaccines “kill or seriously harm people,” first-person testimonials are allowed to stay on the platform because they are “personal experiences or anecdotes.”
“The goal of this policy is to reduce health harm to people, while also allowing people to discuss, debate and share their personal experiences, opinions and news related to the COVID-19 pandemic,” Facebook says.
Experts worry about bad actors abusing the system.
“Being comprehensive about what does violate and what doesn’t violate (Facebook’s policy) is a strategy, but it can be something that’s easily worked around,” said Kolina Koltai, a postdoctoral fellow at the University of Washington who studies anti-vaccine misinformation. “Other platforms have very simple policies that are very stringent about their moderation.”
One example is Pinterest, which bans “anti-vaccination advice” and “misinformation about public health or safety emergencies.” The company doesn’t have a set list of anti-vaccine claims that are against the rules, so Pinterest can be flexible about which posts it removes.
Other social media companies’ anti-misinformation policies fall somewhere between Facebook’s and Pinterest’s. Twitter prohibits tweets that “advance harmful false or misleading narratives about COVID-19 vaccinations” and has said it removed 8,400 posts since December. Over the past six months, YouTube said it has removed more than 30,000 videos with misleading or false claims about the vaccines.
Those numbers are a fraction of the millions of posts Facebook says it has removed for violating its COVID-19 misinformation rules. In his blog post, Rosen said that Facebook has “over 35,000 people working on these challenges,” an apparent reference to the number of employees who work on the platform’s “safety and security,” work that also covers hate speech and harassment.
But with about 2.8 billion monthly users, Facebook is the most-used social media platform in the world and the biggest vector for misinformation about vaccines.
An analysis from First Draft, a nonprofit that studies online misinformation, found that at least 3,200 posts containing claims explicitly banned by Facebook were published on the platform between February and March, amassing thousands of likes, shares and comments.
“The approach Facebook really should take is looking at the history of these sources and these pages — and using that in order to inform their policies, rather than the kind of whack-a-mole approach later on,” Gregory said. “If the pandemic has proven anything, it’s that people who promoted misinformation before are going to promote misinformation about this.”
‘Super-spreaders’ are few, but power many false claims
Multiple independent reports have found that a small but well-followed group of misinformers is responsible for much of the anti-vaccine content that Facebook users see.
Facebook took action to stop some of these accounts, but not all.
In November, NewsGuard published a report that listed “super-spreaders” of misinformation about COVID-19 vaccine development. Of the 14 English-language pages that the company cataloged, five are now gone from Facebook, Gregory said. (Facebook confirmed to PolitiFact that two of the pages, WorldTruth.tv and Energy Therapy, were removed for violating its policies against coronavirus vaccine misinformation.)
Among the NewsGuard super-spreaders still on the platform are the Truth About Cancer and Dr. Christiane Northrup, each with hundreds of thousands of followers. The Truth About Cancer has published articles saying “the vaccine is the pandemic” and that the Pfizer vaccine is “untested and dangerous.” Northrup, a Maine gynecologist, called coronavirus vaccines “experimental weapons against humanity” in a March 14 Facebook video.
“This thing was invented to get humanity injected,” she said of COVID-19, just before telling her followers to watch a video that claims the vaccines cause autoimmune diseases and death. (They don’t.)
On March 24, the Center for Countering Digital Hate, a nonprofit that has criticized technology companies’ misinformation policies, published its own report on vaccine misinformation super-spreaders. It found that “just 12 individuals and their organizations are responsible for the bulk of anti-vaxx content shared or posted on Facebook and Twitter.” The list includes Robert F. Kennedy Jr., Joseph Mercola, and Ty and Charlene Bollinger, the couple that runs the Truth About Cancer.
During the House hearing on disinformation and extremism, Rep. Mike Doyle, D-Pa., cited the report to criticize the way that tech platforms address misinformation about vaccines.
“If you think the vaccines work, why have your companies allowed accounts that repeatedly offend your vaccine misinformation policies to remain up?” Doyle asked. “You’re exposing tens of millions of users to this every day.”
When asked if Facebook would remove the 12 super-spreaders, Zuckerberg demurred.
“Congressman, I would need to look at the, and have our team look at the exact examples to make sure they violate our policies. But we have a policy in place around this,” he said.
This article was originally published by PolitiFact, which is part of the Poynter Institute. It is republished here with permission.