The Organization of American States (OAS) has studied the phenomenon of misinformation during election cycles in Mexico, Colombia and Brazil to create a list of best practices for powerful actors across the Americas. In a new report that has yet to be publicly released, the OAS’s panel of experts advises political parties, intermediary companies, journalists and universities on how they should prepare for the threat of false news throughout election periods. The suggested actions address how the informational infrastructure available to citizens can be improved, from making political campaigns more transparent to rethinking how citizens engage with digital technology.
The OAS comprises 35 member states, including Canada and the United States.
The IFCN was able to obtain the latest draft of the report, “Disinformation in Electoral Contexts: A guide to address the spread of misinformation based on Inter-American standards.” Below are summaries of the OAS’s suggestions for all non-government actors who are directly or indirectly involved with the phenomenon of misinformation. This is the second of a two-part series. You can read the guidelines for the executive, judicial and legislative branches here.
Best practices for political parties
- Avoid negative campaigns that make use of false information. Political parties act as intermediaries between citizens and their representatives, and in no context is this relationship more important than during electoral campaigns.
- It is essential for political parties to abstain from promoting themselves through misinformation campaigns, whether deliberately or inadvertently. Principal political actors should not contribute to the diffusion of false information, and activists and other practitioners should also reject these kinds of practices.
- Make the electoral campaign transparent. This will allow citizens to adequately distinguish content that is part of a political campaign from content that is not.
Best practices for intermediary companies
- Make transparent the criteria used to moderate content on platforms. Even though these companies have no legal obligation to do so, they occupy a special place in the free exchange of information on the internet.
- Deepen transparency in political advertising, especially during election periods. It is essential that intermediary companies operate with transparency, as the phenomenon of misinformation is more effective the more it is hidden.
- Collaborate with independent researchers. Because misinformation is a complex phenomenon, it is necessary that companies that provide services to political campaigns collaborate with researchers to better understand how mis/disinformation work.
- Collaborate with electoral authorities. Collaboration between electoral authorities and the companies that provide Internet services will allow for the establishment of efficient channels of dialogue that facilitate legitimate action against misinformation. Therefore, it is necessary for companies to continue their efforts to work with electoral authorities, as long as they are legitimate authorities that operate democratic electoral processes that have not been challenged by the international community.
- Support quality journalism. Within the actions of self-regulation already being undertaken by global companies, those of support for independent and high-quality journalism stand out, as this strengthens an open and robust public debate. It is desirable for these efforts to continue and expand in the region.
- Review the search algorithms that users rely on to access content. In general, companies that offer Internet services want their users to be able to access relevant information. Part of this is achieved through algorithms that, based on what users are interested in, recommend similar content under the premise that this information is “relevant” to them. It is important that companies not build these algorithms solely for commercial reasons. Due to the role these companies play in shaping public debate, it is crucial that they adopt positive approaches aimed at promoting high-quality information. This is less risky than removing content, and can be more effective in combating disturbing phenomena like misinformation. Given that algorithms largely determine the information people “see” or access on platforms, it is advisable to increase transparency regarding the criteria companies use to build these mechanisms.
- Review policies on bots and automated publishing tools. These automated tools, which operate without direct human control, have been identified as factors that help spread false information. While bots are not problematic in themselves, when they operate as part of a disinformation campaign, they may be moderated by platforms. It is therefore recommended that platforms continue to work on identifying problematic uses of these kinds of technological tools.
Best practices for telecommunications companies
- Review “zero rating agreements” to combat misinformation. Over the last few years in Latin America, zero rating agreements have become common: contracts in which certain social networks, platforms or messaging systems agree with telecommunications companies that their services will not count toward the “data” users consume in their mobile telephone plans. This allows users with the most affordable data plans to access these services even after they exceed the data quota they’ve paid for.
- These agreements violate the principle of net neutrality under the premise that they allow expanded Internet access, albeit a very limited version of it. This is problematic in regards to mis/disinformation campaigns, as those who receive false information through social networks or private messaging services are unable to verify whether that information is true or not because they do not have access to the Internet in its entirety. It is advisable that telecommunications companies expand these zero rating agreements to allow users to verify the information they access via these services.
Best practices for media companies and journalists
- Strengthen high-quality journalism and avoid political polarization. It has been proven that the media are relevant actors in the phenomenon of misinformation. At times, their interventions have allowed mis/disinformation to spread more rapidly; in other cases, they have been effective in circulating verified information in response to false information. In either case, political polarization affects the media and, according to studies conducted in the United States, the polarization of the media can itself give rise to mis/disinformation campaigns.
- It is important that the media and journalists remember the role they have to fulfill in a democratic society, as “watchdogs” of public actors and privileged conduits of public debate. This imposes special obligations, such as those of contributing to awareness efforts on the existence of the phenomenon of misinformation and producing independent and professional journalistic content.
Best practices for fact-checkers
- Unify definitions of misinformation and strengthen regional networks. Verification agencies, which have grown exponentially in recent years within the region, play an important role in combating misinformation. By verifying public discourse, they offer a service that can help citizens navigate a complex public debate that is often tainted with false information.
- It is important that verification agencies use precise definitions and terms in their work. In this sense, it is important to base qualifications and judgments on previously established definitions, and not to apply labels of falsity to material that does not strictly merit them. Doing otherwise could produce undesired effects, such as unintentionally increasing the dissemination of false information or contributing to the discrediting of professional media, both of which occur especially in highly polarized political contexts.
Best practices for companies that do business with personal data
- Respect existing legal frameworks, and participate in the conversation about misinformation. As noted before, one of the elements that drives mis/disinformation is the transformation of the advertising market driven by the Internet. Thanks to the use of personal data that users share with various services, the industry has managed to develop tools to reach recipients with highly targeted messages, based on precise profiles of preferences, age groups, income, and so on.
- Therefore, it is important that companies engaged in digital advertising and the design of advertising campaigns, both commercial and political, participate in the discussion around mis/disinformation in the countries where they operate. Although it seems obvious, it is important to insist that these companies abide by existing legal frameworks on data protection and, for example, not use personal databases outside the cases in which the law allows.
Best practices for universities and research institutions
- Expand empirical research on mis/disinformation. We still know very little about the extent of misinformation, its scope, causes and effects. In this sense, it is essential that the academic world deepen its research on mis/disinformation, a particularly pressing need in Latin America, where research is scarce. These investigations should have a solid empirical basis, focus on specific events in the region and be carried out in a comparative manner. In order to accomplish this, it is essential to expand the collaborative networks between universities and research centers in the countries of the region.