YouTube has come under fire over the past few weeks for enabling the spread of anti-vaccine conspiracies on the platform. Even the United States government has pressured the technology company to do better.
Now, the company is taking steps to surface fact checks alongside questionable content.
BuzzFeed News first reported on Thursday that the technology company had started testing a feature in India that automatically displays “information panels” when users search for topics that are “prone to misinformation.” Those panels display fact checks from “eligible publishers.”
“As part of our ongoing efforts to build a better news experience on YouTube, we are expanding our information panels to bring fact checks from eligible publishers to YouTube,” said a YouTube spokesperson in an email to Poynter. “We are launching this feature in India and plan to roll this out in more countries as time goes on.”
So how does it work?
BuzzFeed reported that the information panels will only show up on search pages — not individual videos. The platform will surface the panels when a user’s query seeks information about the accuracy of a claim. Content with misinformation may still show up in results, but it will be contextualized with any matching fact checks at the top.
YouTube has been using panels to contextualize videos since at least July, when it started pulling information from Wikipedia to add more context about the creators of certain videos. But how will the platform identify which fact checks to surface?
YouTube told Poynter that the company is using the Schema.org ClaimReview markup to identify fact checks related to specific kinds of searches. Parent company Google has been using ClaimReview, essentially a few lines of code that fact-checkers like Snopes add to their articles, to highlight fact checks in search since 2017. The code serves as a kind of stamp that makes it easier for Google to identify fact checks.
Not just anyone can use ClaimReview. Google has published rules on who can include the code in their articles, which include requirements such as “discrete, addressable claims and checks must be easily identified in the body of fact-check articles” and “readers should be able to understand what was checked and what conclusions were reached.”
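To make the mechanics concrete, here is a minimal sketch of what that markup looks like: the JSON-LD object a publisher would embed in a fact-check page. The field names come from the public Schema.org ClaimReview vocabulary, but the publisher, URLs, claim and verdict shown are hypothetical placeholders, not drawn from any fact-checker mentioned in this story.

```python
import json

# Minimal sketch of Schema.org ClaimReview markup for a hypothetical fact check.
# Field names follow the public ClaimReview vocabulary; all values below are
# illustrative placeholders, not a real article.
claim_review = {
    "@context": "https://schema.org",
    "@type": "ClaimReview",
    "url": "https://example-factchecker.org/fact-checks/sample-claim",
    "claimReviewed": "Example claim that circulated on social media",
    "itemReviewed": {
        "@type": "Claim",
        "author": {"@type": "Organization", "name": "Example source of the claim"},
        "datePublished": "2019-03-01",
    },
    "author": {"@type": "Organization", "name": "Example Fact-Checker"},
    "datePublished": "2019-03-05",
    "reviewRating": {
        "@type": "Rating",
        "ratingValue": 1,          # position on the publisher's own rating scale
        "bestRating": 5,
        "worstRating": 1,
        "alternateName": "False",  # the verdict readers see, e.g. "False"
    },
}

# Publishers typically embed this JSON-LD in a <script type="application/ld+json">
# tag on the fact-check page so crawlers can detect it.
print(json.dumps(claim_review, indent=2))
```

Broadly, it's the `claimReviewed` text, the rating and the publisher named in the markup that let a search feature pair a user's query about a claim with a matching fact check.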
That’s different from what BuzzFeed characterized as YouTube’s “verified fact-checking partners.” Rather than striking individual partnerships, YouTube is essentially employing the same technology Google already uses to surface fact checks in search results.
It’s also a more hands-off approach to misinformation than that taken by Facebook, which launched a program in December 2016 to partner individually with fact-checking organizations around the world. That initiative relies on fact-checkers to manually debunk discrete posts in a custom dashboard on the site, thereby decreasing false posts’ reach in the News Feed. (Disclosure: Being a signatory of the International Fact-Checking Network’s code of principles is a necessary condition for joining the project.)
Both Google’s and Facebook’s approaches to misinformation have been widely cited as efforts to surface content from fact-checkers. And while the former has not been as heavily scrutinized as the latter, it hasn’t been flawless.
In January, an online furor began after Google search results erroneously appended a Washington Post fact check to a story from The Daily Caller. The fact check, which was displayed in Google’s “Knowledge Panel” feature for The Daily Caller — similar to YouTube’s information panels — debunked a statement that was not made verbatim in the outlet’s story.
Google previously told Poynter that the pairing was a mistake, caused in part by ongoing bugs in the Knowledge Panel feature, which drew on the ratio of fact checks to the coverage on a specific news site. The company later suspended the feature.
Still, it’s promising to see another tech platform take concrete steps toward surfacing third-party fact checks.
“YouTube has been operating in a black box with no means for anyone sampling their videos to know if it’s true or false. This is the first serious attempt they have made to incorporate fact checks of any kind,” said Jency Jacob, managing editor of Boom Live, an Indian fact-checking project whose Hindi work will be surfaced in YouTube’s information panels, in a WhatsApp message. “We will have to see how it works.”
Right now, YouTube’s fact-checking feature is confined to English and Hindi and will only be visible to a limited number of users in India, where elections will be held in April and May. Misinformation has plagued the country over the past few weeks as its ongoing conflict with Pakistan escalates.
YouTube said it plans to expand the fact-checking feature to other countries in 2019, but it declined to clarify a timeline.