June 20, 2019

Peter Cunliffe-Jones is the founder of the fact-checking organization Africa Check. He delivered the keynote address at Global Fact 6, the annual meeting of the International Fact-Checking Network. Below is an edited version of his remarks.

I first became interested in the harm done by misinformation because of a false rumor about vaccines that emerged, not online – in a WhatsApp group or a hidden space on the dark web – but in a Nigerian mosque or mosques. It spread to local newspapers, was picked up by a prominent local politician, was reported as fact by national papers and, when the false claims went unchallenged, saw him create bad policy — a vaccine ban — in his state of Kano in the north of the country.

Misinformation is often described as spreading like a virus.

Two years after those events of 2002, I saw how it caused the spread of a real virus.

Following the vaccine ban, polio surged in Nigeria and across the Sahel. I’d by then left Lagos and was on a work trip to Jakarta for the AFP news agency.

There I learned that polio had been wiped out in Indonesia. But then an Indonesian pilgrim on the hajj met someone with the Nigerian strain of the virus and brought it back. 7,000 miles from Kano, on the other side of the world, people had become victims of polio again. In a very real sense, they — and the sufferers in Nigeria and the Sahel — were victims of misinformation.

In May 2016, more false rumors about vaccines spread, this time on Facebook in Kenya, Cameroon and Senegal.

My Africa Check colleagues in Dakar and Johannesburg investigated. It turned out that, this time, the origin of the rumors was a 1994 medical study in India that had been misread to suggest the tetanus vaccine contains an anti-fertility agent. Reports about this, which failed to debunk the false claim, alarmed a medical doctor at a Catholic mission hospital in Tanzania, who passed on the claim as fact to colleagues at a Catholic medical convention.

Since then, the false theory has been put forward repeatedly by the Catholic Doctors Association in Kenya. In 2016 it popped up on the other side of the continent in Senegal and Cameroon. In 2017, the claims were repeated by a candidate running for Kenyan president.

What is the result? Well, causation or correlation, Kenya is one of just 16 countries in the world that have not yet eliminated maternal and neonatal tetanus. It’s estimated that, on average, one Kenyan child dies from tetanus every day.

How do we end the harm that that sort of misinformation causes? Rumors that spread in on- and off-line communities, and are turned into bad practice, or — if they make it to politicians — into bad policy?

What Bill Adair says is true. Fact-checking does keep growing.

But look at our budgets, our staffing, our resources, and you have to ask, how can we tackle this sort of misinformation – effectively – when we are still so small? Can we do it alone?

Most academic work on fact-checking has been focused, to date, on the question of whether presenting the public with corrective information – a fact-check – will get them to update a false view.

There’s a good reason for that. It’s how most of us work.

And despite all the gloomy “post truth” headlines of 2016, there is growing evidence that doing this, in the right format, and repeating it, does work, for a while at least, in helping people to update their views.

But what if the goal of fact-checking was to stop the false claims being made in the first place?

How do you make that happen? How do you tackle the “supply side”? In what circumstances would that work, and when would it not?

From the work that Africa Check and others do, I believe there are ways to achieve this, in certain circumstances.

But what you need to do is different from, or rather additional to, traditional journalistic fact-checking.

It involves reaching out to the people who make the false claims, seeking corrections, discussing why you believe the claim is wrong and why their voice matters. It involves presentations, public policy work, and sometimes uncomfortable collaborations.

It certainly won’t work everywhere or every time. But in the right circumstances, I believe evidence shows us fact-checkers’ work can have an impact on what politicians, public figures and institutions say, improving the way public communication happens.

Sometimes, of course, the false claims are deliberate – and there is no way they’ll be withdrawn.

This is one of the first articles I ever wrote as a journalist. It was 1991. (Believe it or not, it was written on a typewriter. Cut and paste? I used actual scissors and glue.) It was an interview with men employed by the British Army in the 1970s to spread false information about Republicans in Northern Ireland.

This conference is taking place here, today, in South Africa. And I want us to take a moment to think about how disinformation has been used here – in the past and more recently – as well as in our own countries – to shore up political regimes against their critics. This is one of the reasons this matters.

In the 1970s, the government here provided secret funding for propaganda wars at home and abroad – setting up publications here and attempting to buy established foreign newspapers to improve the press that apartheid received overseas.

Eventually, news of the program leaked, and caused a scandal, but nothing changed. In 1982, when security forces killed an activist called Ruth First with a parcel bomb, a propaganda unit, STRATCOM, planted a report in the media that she had been killed by her activist husband.

The goal was both to absolve the state of responsibility for a killing, and to tarnish the reputation of its critics. Old news.

After passing through immigration here, you may have seen in the airport bookshops a series of titles referring to what today’s South Africans call “State Capture.” It is a story of power and corruption centered on business cronies of former President Jacob Zuma.

Zuma’s allies were under pressure from investigative journalists and the judiciary, so they hired a British PR firm to run a disinformation campaign. They sought to avoid scrutiny by attacking their critics as “agents of white monopoly capital.” Some tactics don’t change.

Tackling disinformation of this sort is, of course, different again from tackling the sort of rumor mill I talked about earlier, or the perhaps unintended false claims that politicians and others can be persuaded to correct or withdraw.

For a start, exposing it requires forensic skills: establishing who the perpetrators are, what the links between them are, where the funding comes from, and what drives and enables it.

Do we all, here, have the time, skills and resources to do this? If we don’t, who does?

I have been talking, so far, about different types of misinformation.

Over the course of today and tomorrow, we are going to hear about many exciting projects trying to reduce the harm misinformation can cause.

To succeed, it is important that we — and they — think about who the misinformation affects and why.

Let’s think about health misinformation again. There are examples of health misinformation that affect the information recipient – the consumer – directly.

Fake health cures promoted online are just one example. Let me tell you of an optometrist in Nigeria I spoke to recently, who was treating a man who had believed a hoax that he could cure his conjunctivitis by bathing his eyes in diluted battery acid. The patient, who was reduced to trying this sort of “cure” because he couldn’t afford a medical visit, is now partially blind and that isn’t going to change. This is direct recipient harm.

Then there are types of health misinformation – false claims about the prevalence of certain diseases for example – where the damage to society comes mainly if the claims are believed by policymakers.

If we tackle both types of health misinformation the same way, using the same tools, the same approaches, will we be as effective as we need to be? I don’t think so.

And of course, there are many more types of misinformation to tackle.

There’s political misinformation. Already this year, from India to Africa and beyond, fact-checkers have exposed a series of political smears and attempts to undermine the electoral system.

The political impact of this misinformation is worrisome, if as yet unclear.

It is clear that misinformation can cause economic harm. Anyone who comes from Brexit Britain, as I do, knows that uncertainty is anathema to business.

There is the harm that another type of misinformation can do to social cohesion – leading, or apparently so, to attacks on individuals and even civil unrest. We have all, I imagine, read about lynch mob murders in India, said to be sparked by misinformation on WhatsApp. We have seen similar divisive, dangerous content in Africa, too.

Taking things to another level, misinformation can spark and sustain a war.

I remember when I was a reporter in Bosnia-Herzegovina in the 1990s, talking to a Bosnian Serb militia leader, who later ended up in The Hague, about why he felt it was worth fighting. Talking about my translator, he said the Serbs were fighting because in cities like Tuzla or Sarajevo, run by a Muslim-majority government, she would not be able to dress as she pleased in a western style. They had to fight for Serb freedom.

This wasn’t true. My colleague, though a Bosnian Serb, lived happily in Tuzla with no restrictions on what she could and couldn’t wear. But he believed and spread the propaganda, and this sustained the war effort.

When (former IFCN director) Alexios Mantzarlis (remember him?) first talked with me six months ago about this event, we discussed who should be keynote speaker.

I suggested reaching out to Trevor Noah. Alexios reached out.

He’s busy. I’m sorry.

When Baybars approached me, a few weeks ago, I said, “No, Baybars, get Trevor Noah. He’d be a whole lot funnier.”

When that failed, I started thinking about what to say, searching around for a metaphor for what I think fact-checking is, or needs to be.

I already mentioned one analogy. Fighting misinformation is like fighting a virus. Fact-checking is a vaccine.

Another comparison I heard last week is with gardening: Fact-checking is pulling up the weeds in the information space.

I see the sense in both analogies but decided I needed another one.

You see, my contention is that the reason there is no silver bullet – no single vaccine for the virus, no weed-killer that will, alone, make the garden beautiful – is that misinformation is not one problem. It is many.

It’s a problem of:

  • False information
  • A problem of lack of access to, and trust in, reliable information
  • And a problem of how we understand our world

And as if that was not bad enough, false information causes harm in different ways in different contexts.

And this is why, to help me with this talk, I ended up Googling the phrase: “How to battle an octopus.”

The first image I came up with was Super Mario. The second was an octopus and a shark.

Based on my research, there is good news and bad news.

The good news: in the gaming world Super Mario has it sorted. Octopus defeated.

The bad news: in the real world, octopuses are formidable creatures. Right after that second image came a video of an octopus eating a shark.

But if you will bear with my analogy for a minute, how do you battle an octopus and not lose – like that shark?

To my mind, a multi-tentacular problem needs a multi-tentacular solution.

First, we do need to keep on fact-checking, but smarter.

We need to keep on publishing fact-checks, in all sorts of different formats. But we need to think more than we have so far about who we send what fact-checks to, how and why. If the misinformation is the sort that harms the recipient, and if it has had a mass audience, we need to reach that audience. If it is more niche, we need to take a different, more targeted route.

Before we publish our fact-checks, we need to think about them from the perspective of the person who believes the false claim.

If we want to be heard, we need to either acknowledge a kernel of truth (if there is one) in a false claim, or, if not, acknowledge the reason why someone might believe something to be true.

Second, we do need to tackle the supply at source.

Where there are networks or systems that produce misinformation, consciously or not, we need to expose that. If we don’t have the resources to do this, we need to work with those that do.

Where misinformation comes from politicians and other public figures we can identify, we need to reach out and explain why we think they are wrong. I know this does not work everywhere and with every politician. There are some notable outliers. But the evidence I see suggests that in many countries, it is possible through our work to persuade some politicians to change the way they communicate for the better.

We need to reach out to the media, through training and awards programs – to help the media do its job better — and reach out to the platforms, through our work with them.

And, getting smarter still, we need to multiply the reach of our fact-checks to Internet scale, automating our current game of “whack-a-mole” – as will be explained better than I can this week.

Third: We need to make it easier to find reliable information.

Misinformation flourishes when reliable information is scarce or mistrusted – which is often the case, not just in Africa, Asia or Latin America, but in Europe and North America, too.

To do this, we need to work with people who hold data – statistics agencies, institutes – and with academic experts and others who can help us to verify it, and point the public to these sources.

We need to explain why some data is flawed while other data is sound.

Fourth: We need to help the public get better at assessing good and bad information themselves – news literacy.

Can we do this all by ourselves? How much more effective will we be if we develop curricula and then work with teachers and schools to take this to a national scale?

In short, for a many-sided problem, we need a many-sided solution. And that means, as still-small, scrappy organizations, we cannot do it all on our own.

The number of people in the room for the Africa Facts 2 meeting yesterday thrills me more than I can say. But even if we double our numbers again, we will still be small compared to the challenges we face. So, where are the solutions?

Well, one solution, let’s be clear, is funding. For organizations seeking to tackle a serious social problem, our work is woefully underfunded. We need – we all need – proper funding to be able to operate as effectively as we can.

That’s essential. It is also not enough.

In France, Brazil, Nigeria, Indonesia, Argentina and more, specialist fact-checkers have come together in recent months with mainstream media on projects to collaboratively “cross-check” elections.

(Here a big shout out is due to First Draft).

But I think we need to do more. If I may, I’ll give you one example.

If all goes to plan, Africa Check will launch a project in Nigeria this year to tackle health misinformation – by working not only with other fact-checkers and the media to identify the health misinformation circulating in the country and publish reports. That is collaboration as we have done it before.

But also by bringing together others – the health ministry, the medical association, the Nigerian Centre for Disease Control, community groups – to go through the examples of health misinformation found and agree on the 10 that, if unchecked, would cause the most harm to public health.

These partners would then take this alert back to their communities – translating it into whatever language and format is appropriate for them. This is a sort of reach we as fact-checkers, working alone, could never hope to match.

Will this particular project work? We will see. If it works in health, will it work in every field? I imagine not. But will some form of collaboration work with organizations and communities that have skills, reach and resources we don’t? I think we have to try.

  • Reaching out to the people we fact-check, to stop bad claims at source
  • Working with those who hold reliable data, to make it more accessible
  • Developing curricula and engaging with schools to help young people, at a national level

Already, I see many of us are doing parts of this. One shark versus an octopus? I put my money on the octopus. Many sharks – and our chances become better than even.
