Facebook is working with more than 60 researchers around the world to study how misinformation and other problematic content spread on the platform.
In a blog post published Monday, Facebook announced the winners of a research grant from Social Science One and the Social Science Research Council, a collaboration that bridges the gap between academics and the private sector. Those researchers, who hail from 30 institutions in 11 countries, will get access to a broad swath of Facebook data that will reveal how content is shared on the platform.
“Over the past several months, we’ve begun building a first-of-its-kind data sharing infrastructure to provide researchers access to Facebook data in a secure manner that protects people’s privacy,” wrote Elliot Schrage, Facebook’s vice president for special projects, and Chaya Nayak, strategic initiatives manager, in Monday’s blog post.
As part of that infrastructure, winning researchers, who were chosen by an “international body of peer reviewers,” will be given three tools that could help bolster their research about Facebook.
First is the CrowdTangle application programming interface (API), which will reveal public Facebook and Instagram posts in a data-friendly format. Journalists and researchers use the Facebook-owned tool to measure how pages, groups and specific pieces of content are performing compared to others on the platform.
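For researchers getting started, pulling posts from CrowdTangle typically means a token-authenticated REST call. The sketch below is a minimal Python example, assuming a dashboard-scoped API token and the /posts endpoint from CrowdTangle's public documentation; the token value is a placeholder, and parameter and field names may vary by API version.

```python
import requests

# Placeholder for a dashboard-scoped CrowdTangle API token.
CT_TOKEN = "YOUR_CROWDTANGLE_API_TOKEN"

# The /posts endpoint returns recent public posts from the lists tied to the
# token's dashboard; parameter names mirror CrowdTangle's public docs but
# should be treated as illustrative.
resp = requests.get(
    "https://api.crowdtangle.com/posts",
    params={
        "token": CT_TOKEN,
        "count": 20,                 # number of posts to return
        "sortBy": "overperforming",  # rank posts against their expected performance
    },
    timeout=30,
)
resp.raise_for_status()

for post in resp.json().get("result", {}).get("posts", []):
    stats = post.get("statistics", {}).get("actual", {})
    print(post.get("date"), post.get("platform"), stats.get("shareCount"))
```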
Second is Facebook’s advertising library API. Using that tool, researchers will be able to see ads “related to politics or issues on Facebook in the U.S., U.K., Brazil, India, Ukraine, Israel and the EU.”
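The ad library is exposed through Facebook's Graph API as the ads_archive edge. The sketch below is a minimal Python example of querying it; the access token is a placeholder, and the API version, parameters and fields are assumptions based on the Graph API documentation current at the time.

```python
import requests

# Placeholder for a Graph API user access token with Ad Library access.
FB_TOKEN = "YOUR_GRAPH_API_ACCESS_TOKEN"

# Query the ads_archive edge for political and issue ads; parameter and
# field names follow Facebook's documentation for the edge but may change
# between API versions.
resp = requests.get(
    "https://graph.facebook.com/v3.2/ads_archive",
    params={
        "access_token": FB_TOKEN,
        "ad_type": "POLITICAL_AND_ISSUE_ADS",
        "ad_reached_countries": "['US']",   # one of the supported countries
        "search_terms": "election",
        "fields": "page_name,funding_entity,ad_delivery_start_time,ad_snapshot_url",
        "limit": 25,
    },
    timeout=30,
)
resp.raise_for_status()

for ad in resp.json().get("data", []):
    print(ad.get("page_name"), ad.get("funding_entity"), ad.get("ad_delivery_start_time"))
```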
Finally, grantees will have access to Facebook’s URL dataset, which aggregates and anonymizes data about specific URLs that users have shared. The dataset includes all URLs that have been publicly shared by at least 100 unique users, along with engagement data and details such as the top country where each link was shared and how Facebook’s fact-checking partners have rated it.
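Because the URL dataset is queried inside Facebook's secure research environment rather than downloaded, there is no public schema to show. The Python sketch below uses purely hypothetical records and field names, just to illustrate the kind of aggregated, anonymized attributes described above.

```python
# Purely hypothetical records illustrating the kind of aggregated, anonymized
# fields described above (URL, top country, engagement, fact-checker rating);
# the real dataset's schema may differ entirely.
url_rows = [
    {"url": "https://example.com/story-1", "top_country": "US",
     "public_shares": 4200, "fact_check_rating": "false"},
    {"url": "https://example.com/story-2", "top_country": "BR",
     "public_shares": 950, "fact_check_rating": None},  # never fact-checked
]

# Count how many widely shared links carry each fact-check rating.
rating_counts = {}
for row in url_rows:
    rating = row["fact_check_rating"] or "unrated"
    rating_counts[rating] = rating_counts.get(rating, 0) + 1

print(rating_counts)  # e.g. {'false': 1, 'unrated': 1}
```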
In short: Researchers are getting a peek under the hood to see how Facebook really works. And that’s pretty rare.
In the past, most research about Facebook has been qualitative, relying on surveys or unrepresentative samples to draw conclusions. Unlike Twitter, which is frequently the subject of misinformation research, Facebook has fairly restrictive APIs, meaning it keeps an enormous amount of data secret.
And it’s not just the public that doesn’t have much access to Facebook data. Since launching its partnership with fact-checking sites in December 2016, Facebook has shared precious little public data about how those efforts have limited the spread of misinformation. (Disclosure: Being a signatory of the International Fact-Checking Network’s code of principles is a necessary condition for joining the project.)
Facebook’s partnership with Social Science One, which was originally announced last April, is the company’s answer to those transparency concerns. In December, Tessa Lyons, a product manager in charge of Facebook’s anti-misinformation efforts, told Poynter that she expected the company’s partnership with academics — which functions independently of the company and is funded by a slew of foundations — to reveal more about how misinformation spreads.
And taking a look at the projects that won the Social Science One grant, she might be right.
Of the winning abstracts, five specifically address the role of false and misleading content on Facebook. Their aims run the gamut from measuring why people share fake news stories to examining how misinformation affects voters in Chile and Germany, but all of them plan to use Facebook’s data to study the mechanics of misinformation.
However, most of those projects aren’t specifically studying fact-checking, even though the URL dataset contains information that could support such work.
Of the five researchers Poynter reached out to, only one responded to say that fact-checking falls within the scope of their Social Science One project. But for Sebastián Valenzuela, a visiting professor at the University of Wisconsin-Madison, studying how fact checks affect misinformation on Facebook is still tough, even with the data-sharing tools.
“It’s a bit more tricky for our project because the information on whether the shared link on Facebook was sent or not to a third-party fact-checker (which is the easiest way of measuring whether fact checks affected fake news sharing) is not available for Chile,” said Valenzuela, the lead researcher for one of the winning abstracts, in an email to Poynter.
Valenzuela said he’s working around that by teaming up with El Polígrafo, a Chilean fact-checking organization, to manually analyze how fact checks compare to specific pieces of misinformation on Facebook. That’s not dissimilar to what Poynter does every week in its Fact vs. Fake column.
Then there’s the fact that Facebook’s ad library API might not be all that useful. In an analysis published Monday, Mozilla found that the tool fails to meet the standards for a functional API set out by more than 60 researchers.
“The API doesn’t provide necessary data,” Mozilla wrote in a statement. “And it is designed in ways that hinders the important work of researchers, who inform the public and policymakers about the nature and consequences of misinformation.”
It will probably be a while before any of the Social Science One grantees publish their findings. In the meantime, here are the titles of all the winning abstracts.
“Measuring the Effects of Peer Sharing on Fake and Polarized News Consumption”
“How Hyperlink Sharing on Facebook Influences Civic Engagement and Elections in Taiwan”
“Understanding Problematic Sharing Behavior on Facebook”
“Mapping Disinformation Campaigns across Platforms: The German General Election”
“The Role of Facebook in Legislative Campaigns in Chile (2017)”
“Characterizing Mainstream and Nonmainstream Online News Sources in Social Media”
“The Demographics of the Sharing of Hyperpartisan News in Brazil”
“SHARENEWS: Predicting the Shareworthiness of ‘Real’ and ‘Fake’ News in Europe”