May 29, 2019

If you were on Twitter last week, there’s a good chance that you saw the media debunk an altered video of Nancy Pelosi — and criticize Silicon Valley’s response to it.

On Thursday, a hyperpartisan Facebook page posted the video, which appeared to depict the U.S. Speaker of the House of Representatives slurring her words at a Center for American Progress event. The Washington Post then published a story about the video, but it was too late: It had racked up more than 2 million views as of publication.

That’s bad enough. But then Rudy Giuliani, Donald Trump’s personal attorney, shared it (though he later deleted it). Trump also shared a version of the falsehood, tweeting a selectively edited video montage that made it look like Pelosi was stammering.

But it’s bogus — the video Giuliani shared was manually slowed down and edited to make Pelosi’s speech sound warped and distorted. At least five fact-checking sites that partner with Facebook to reduce the reach of misinformation debunked the hoax. (Disclosure: Being a signatory of Poynter’s International Fact-Checking Network code of principles is a necessary condition for joining the project.)

While salacious, that kind of misinformation is pretty common online.

Fact-checkers worldwide regularly debunk videos that are lightly edited or simply taken out of context to advance a false claim on social media. It happens in the United States (no, these videos don’t prove Hillary Clinton has Parkinson’s disease) just as it does in any other country with a large Facebook user base.

Aside from its virality, the Pelosi video was no different. But American journalists treated it as if it were a litmus test for the current state of online political information — a treatment that lacks the nuance required to better understand the problem.

During a Friday interview with Monika Bickert, Facebook’s vice president for product policy and counterterrorism, CNN’s Anderson Cooper repeatedly asked why the company decided to leave the altered video up instead of removing it. Bickert cited Facebook’s public community standards, which don’t provide for the removal of content just because it’s false.

But Cooper pressed on.

“I understand it’s a big business to get into of trying to figure out what’s true or not, but you’re making money by being in the news business. If you can’t do it well, shouldn’t you just get out of the news business?” he asked.

And Cooper wasn’t the only one to question why Facebook and Twitter left the altered Pelosi video up while YouTube removed it.

“Facebook’s refusal to pull a doctored video of House Speaker Nancy Pelosi highlights how the internet now threatens truth instead of spreading it,” a columnist wrote for USA Today.

“A business like Facebook doesn’t believe in fakes. For it, a video is real so long as it’s content,” The Atlantic wrote.

Pelosi’s colleagues also went after Facebook, tweeting that the company can and should do more to eliminate the false post.

Anyone covering misinformation on Facebook knows why it left the bogus Pelosi video up: False posts are not against the company’s community standards. Only false information that also violates another rule, such as those against hate speech, incitement to violence or terrorism, would meet the standard for removal.

Instead, Facebook relies on more than 50 independent fact-checking organizations worldwide to find, review and rate the truthfulness of questionable posts on the platform. Once a fact-checker has deemed a post to be false, Facebook opts to reduce its reach in News Feed, append related fact checks below it and notify users who try to share it.

[Graphic: How Facebook deals with misinformation]

That policy has been in place since the creation of Facebook’s fact-checking partnership in December 2016. And it isn’t arbitrary — the company said at the time that it didn’t want to remove content simply because it’s false, which would open it up to allegations of censorship.

“(The fact-checkers) can dispute an article and link to their explanation and then provide context on Facebook so people and the community can decide for themselves whether they want to trust an article or share it,” Adam Mosseri, former vice president for product management for News Feed, told BuzzFeed News when the partnership launched.

Having journalists question the impact and logic of this process is important — it keeps Facebook on its toes and defending its anti-misinformation policies, which even its fact-checking partners have said they wish were more transparent. And there’s a legitimate debate to be had over whether Facebook’s community standards should even be the be-all and end-all for fighting misinformation on the platform.

“Facebook’s community standards are not regulations. They are not laws,” The New Yorker wrote on Tuesday. “They are arbitrary and fuzzy guidelines developed by employees of a private company that are then open to interpretation by people paid by that company, and enforced — or not — by other employees of that company.”

But that debate needs more nuance.

Several outlets have suggested that Facebook is a media company, a view many other journalists share. By choosing which content to leave up and which to take down, the tech giant already makes some editorial decisions. If it were considered a media company, Facebook would inherently bear more responsibility for the veracity of content on its platform.

And, as my former boss Alexios Mantzarlis pointed out on Twitter this week, the company could then develop a kind of rapid-response team that downranks hoaxes whose engagement outpaces the capacity of Facebook’s fact-checking partners.

But is Facebook actually a media company, or just a platform on which media are shared? Is it a newspaper or a newsstand? Is it the public square or the town crier?

Those questions still need to be definitively answered, ideally by academics in concert with regulators. (The Post reported that the Pelosi debacle could increase lawmakers’ scrutiny of the platforms.) Any solution needs to involve multiple stakeholders, including users, nonprofits and the targets of misinformation, not just news outlets and Silicon Valley.

“The take-it-down brigade might consider developing an alternate set of Facebook community standards for public consideration,” Casey Newton wrote in his newsletter for The Verge on Tuesday. “I have no doubt that there are better ways to draw the boundaries here — to swiftly purge malicious propaganda, while promoting what is plainly art. But someone has to draw those boundaries, and defend them.”

Before that time comes (if it does at all), journalists should refrain from presupposing that the onus is on Facebook to remove false content in all circumstances. In an era when dictators worldwide use the term “fake news” to systematically undermine the press, there needs to be a more nuanced debate about whether one of the largest companies in the world should be removing content that it deems to be false without input from third parties.

Otherwise, journalists risk inadvertently feeding the partisan machine that creates misinformation like the doctored Pelosi video in the first place.

“It’s easy to imagine creating a rule like ‘Take down false examples of hate speech,’ but far harder to come up with a rule that requires taking down the Pelosi video but not other forms of mockery, satire, or dissent,” Angela Chen wrote for the MIT Technology Review on Tuesday.

Franklin Foer, a staff writer for The Atlantic, said it best on PBS NewsHour on Monday.

“We want (Facebook) to come down on the side of reality and of truth, but we don’t want them tipping — using their power in order to influence political outcomes, because I think that that’s — that’s too much responsibility to have in one corporation,” he said.
