Meta has asked its Oversight Board whether its existing policy on the removal of COVID-19 misinformation — “introduced in extraordinary circumstances at the onset of the pandemic” — should remain in place or be relaxed in favor of less restrictive measures. The move, announced Tuesday, is part of the company’s ongoing efforts to strike a balance between free speech and public safety.
The 23-member Oversight Board, which helps decide on Meta’s content policy and enforcement, regularly selects, deliberates on, and posts updates about its decisions on individual policy cases. It has been likened to Facebook’s Supreme Court. Meta describes the board as “an independent body that people can appeal to if they disagree with decisions we made about content on Facebook or Instagram.”
The tech giant cited life “returning to normal” in countries with high vaccination rates as one reason for its request to the Oversight Board. But it also acknowledged that the course of the pandemic varies worldwide and that any policy changes should take into account regions still dealing with lower vaccination rates.
“Now that the COVID-19 situation has evolved, we’re seeking the Oversight Board’s opinion on whether we should change the way we address this type of misinformation through other means, like labeling or demoting it,” Meta wrote in a statement. “Resolving the inherent tensions between free expression and safety isn’t easy, especially when confronted with unprecedented and fast-moving challenges, as we have been in the pandemic. That’s why we are seeking the advice of the Oversight Board in this case.”
The same day, the Oversight Board announced on Twitter that it is accepting Meta’s request. It is asking for public input on the criteria for lifting emergency interventions going forward, and on the use of algorithms to label or remove content.
Verified signatories to the International Fact-Checking Network said that they would be “keeping a close watch” on developments.
“Many countries, including Ghana, have been experiencing surges in the number of active COVID-19 cases, and the number of fatalities continues to rise,” said Rabiu Alhassan, the managing editor of the fact-checking outlet GhanaFact. “Even more worrying is the fact that we are struggling to get a significant number of the population vaccinated because of high hesitancy, which is largely due to the barrage of misinformation and conspiracy theories they were exposed to on social media. So, in the interest of public safety, internet companies should continue to sanitize their platforms.”
Meta gave the Oversight Board four policy options to consider: “continuing to remove misinformation;” “reducing the distribution of misinformation;” “sending misinformation to fact-checkers to check, demote and label;” and “adding labels linking to trusted info.”
Meta’s current policy is to remove about 80 different claim types related to COVID-19 and vaccines. Claims that vaccines cause autism or sudden infant death syndrome, or that they are generally ineffective, are among the current examples of removable content. Other claims worthy of deletion, per the policy, are asserting that wearing a face mask “does not help prevent the spread of COVID-19,” or that “wearing a face mask can make the wearer physically ill.”
Facebook has removed 25 million pieces of content containing such claims during the pandemic so far, according to the company.
“As the pandemic has evolved, the time is right for us to seek input from the Oversight Board about our measures to address COVID-19 misinformation, including whether those introduced in the early days of an extraordinary global crisis remains [sic] the right approach for the months and years ahead,” the statement from Meta reads.