The big lie was planned well in advance. Even before he was elected president in 2016, Donald Trump was saying that U.S. elections were rigged. In 2020, months before voting began, Trump said the only way he’d lose would be because of voter fraud. After Election Day, he claimed fraud over and over again, alleging irregularities and illegalities that simply never happened.
We’ve entered an intense period where false words are inspiring violent deeds, as we saw when Trump supporters stormed the U.S. Capitol, following his and others’ claims that the election result they wanted was stolen.
A “stolen” election that wasn’t actually stolen may be the most consequential of the phony claims that fact-checkers like me have spent years debunking. But it’s not the ugliest.
The QAnon conspiracy theory alleges child abuse and harm on an unimaginable scale. Its adherents believe that people like Joe Biden, Barack Obama and George Clooney are Satan-worshiping child molesters who evade arrest because of their celebrity. It’s a crazy, made-up conspiracy. But if you honestly believed it to be true, you’d feel compelled to take action, right?
Under the cover of political debate and discussion, we’ve moved into a dangerous space in U.S. politics, where online lies are motivating dangerous, real-world action. As editor-in-chief of PolitiFact, I’ve been documenting and correcting political lies since 2007. As they’ve festered and spread, I’ve come to believe that we can’t dismiss fabricated conspiracy theories as just talk. There needs to be a response to the problem of misinformation, and it needs to come from every sector of society. But most especially, it demands responses from technology companies and government.
Because the First Amendment to the U.S. Constitution protects most free expression and forbids government censorship, it’s tempting to throw up our hands and say there’s nothing that can be done. But that is shortsighted. We’re at a turning point where we all need to think creatively for new rules of the road for how we communicate, especially on the internet.
As a fact-checker, I have three suggestions for actions that could stop lies without censorship.
First, the social media companies — especially YouTube, Twitter and Facebook — have to be more consistent in their penalties for spreading misinformation, and that must include politicians and candidates.
The tech companies need to give their current policies on misinformation a thorough scrub from top to bottom, because they clearly can and should do more. They’ve shown us that in recent weeks by suspending Trump’s Twitter account, removing his other online postings, and taking down QAnon groups at large scale. Their outlook needs to put preventing real-world harms first, with a more realistic sense that potential harms are imminent. There must be more tangible penalties, in the form of reduced influence, for creating and sharing lies.
PolitiFact has worked with Facebook since 2016 on fact-checking misinformation on its platform. The program does a lot of good: It slows the spread of misinformation by downgrading content, and it gets fact-checking in front of people who don’t know they should be looking for it. But Facebook needs to rethink exemptions for political candidates and elected officials. Clearly some of these people are the ones who are most effective at spreading false information.
Twitter recently published a policy that forbids manipulating or interfering in elections or other civic processes, with escalating penalties. That’s great, but Twitter has a track record of only taking enforcement action when everyone is paying attention. Enforcement of its policies over the years has been wildly inconsistent. And while YouTube says it takes action internally to remove content, it’s impossible to quantify its efforts from the outside. YouTube’s policies, like Twitter’s, appear to be deployed unevenly and mainly after public outcry.
Second, regulation of the tech companies by the federal government should not be taboo. We need legislation or detailed regulations from a federal agency. Once upon a time, broadcasters were subject to a fairness doctrine, a requirement to present public controversies in a way that was fair, equitable and balanced. That policy was later considered overreach and discarded, but it showed that government is fully capable of exerting its influence on an information ecosystem.
Why not balance the protections of Section 230(c) of the Communications Decency Act (which gives internet companies legal immunity from speech on their platforms) with a requirement that they have published, specific misinformation policies with proof of consistent enforcement? Such rules can be created in a nonpartisan and fair way — though if one side breaks the rules more, they’ll obviously face more penalties. We should all be OK with that.
Finally, let’s get serious about addressing threats of violence and menacing intimidation. Threats are closely related to lies and conspiracy theories. They’re attempts to smother corrections of false narratives, and sometimes the threats work.
These threats happen because misinformers don’t have facts on their side, so their response to challenges is often a threat of violence. My team has received death threats (“Your next time on Facebook will be on Facebook Live, begging for your life”), threats of harm (“If I’m pushed I will find your physical address, and then s— can get real fun!”), and vaguely ominous predictions (“You are done. And the world will know it very, very soon”). And it’s not just fact-checkers who get threats; it’s anyone who challenges a conspiracy theory these days: journalists, politicians, clergy and recently, even election workers.
New, sensible laws against cyberstalking and cyberharassment across state lines are desperately needed. If you threaten people with a Gmail, Hotmail or Yahoo account, shouldn’t you lose your account? We need more civil remedies that can happen fast. Phone threats need remedies, too. If law enforcement stepped up, and phone and tech companies had practical and effective approaches for checking or even removing accounts used to threaten people, we’d see a reduction in threats.
All the approaches I’m suggesting need to be thoughtful and proportionate to the problems we’re facing. We don’t want cures that are worse than the disease. Improvements might require more time and money. But I believe and have seen that there are ways to reduce political lies without political censorship, to balance people’s First Amendment rights with protections for democratic discourse. We should approach information policy in a nonpartisan way, where differences of opinions are respected, but differences of fact are not.
If we want to live in a democracy where we’re capable of reasoned debate and decision making, every American needs to think about solutions. Today, we can all take responsibility for the quality of information we read and consume, and we can all take responsibility when we stay silent in the face of lies. If we don’t take action, we’ll see more events like the riot at the U.S. Capitol, and next time, they will be worse.
This article was originally published Feb. 2, 2021.