SARAJEVO, Bosnia and Herzegovina — Nobel Peace Prize laureate Maria Ressa opened GlobalFact 11 with words of encouragement to fact-checkers all over the world and appeals to the tech companies sponsoring the event.
In a June 26 conversation with AFP Global News Director Phil Chetwynd, Ressa spoke about her experience of promoting fact-checking in the Philippines through collaboration, fighting back against attacks and building “deliberative” technology.
While she thanked tech companies Meta, TikTok and YouTube for funding fact-checking, she also urged them to “do something” to protect democracy and prevent genocide.
Here is the text of Maria Ressa’s keynote conversation with Chetwynd, edited for brevity and clarity.
Phil Chetwynd: I was gonna start by just saying, how are you? I mean, you’re out on bail.
Maria Ressa: Yes, things can get better. So, first of all, please, we were walking over here, and I was just saying, I wanted you to know how what you do — it sounds really mundane to say you’re a fact-checker, but in a world of exponential lies, you literally are the first line of defense for everyone. And I think what we will find out is that this moment, this moment in time, when we look back, we are going to look at this as the moment of either the beginning of victory or the beginning of the end.
And that’s — so how am I? I’m chasing my shadow. I decided I’m not going to sleep all of 2024 — you know what that’s like — until we know. (Because) 2024 is the tipping point year for democracy globally. We are electing illiberal leaders democratically, so it is choice. And part of that is because of the way we consume news, the way we get facts.
Sorry, I’ll go to my cases. Things can get better. This is the best part, right? Under (former Philippine president) Duterte, our first casualty in the battle for facts was exactly how many people had died in the six-year brutal drug war. If you talk to the police, they say as little as 2,000, but as of December 2018, our Commission on Human Rights said nearly 30,000. 2018, right? It’s 2024.
Anyway, so from Duterte to Marcos: we had an election in May 2022, where fact-checkers were out in force, and we literally built a four-layer pyramid with our partners. Some of you are here.
And guess what? We moved from hell to purgatory. It’s not perfect yet, because nothing is perfect.
But my cases? From a high of 11 criminal charges, criminal cases that were ongoing at the same time, which meant I spent more time with lawyers than journalists, to two. There are two left, and one of them can still shut down Rappler, but the other one, yeah, I can go to jail for seven years. But you know what? Seven years (compared) to 103 years (which she was facing at one point). So things can get better.
Chetwynd: Maria, I want to sort of think how we got here.
Ressa: Sure.
Chetwynd: I was saying to Maria yesterday, I first met Maria in 2013, I think, and I went down to visit her offices in Manila. She had just founded Rappler at that time. You know, incredible optimism about how to harness social media for storytelling and great journalism. And I remember going back to my team and going, I’ve seen the future. And so we have to remember that, don’t we? Before we got here, there was a moment of optimism. So what happened? What was your journey?
Ressa: We drank the Kool-Aid, right? We at Rappler, I mean, I came from managing the largest news organization in the Philippines, which has now lost its franchise to operate. So it’s part of the regression of democracy. Going from running a 1,000-journalist news organization to starting with 12 people was both a shock and extremely invigorating.
At that time, in 2012, I saw Facebook, and I saw Twitter (YouTube wasn’t really a factor then) as enablers of democracy, as empowering platforms. But then what happened? The business model shifted. Sheryl Sandberg joined Facebook and … commoditized the people and the content. So we watched what happened to news when the incentive structure of the distribution platforms changed. And this is where you’ve heard me say this a lot, but please look at the structure of the technology that connects us by design.
In 2013 we were using Twitter for disaster risk reduction, right? We built a tech platform for something called Project Agos, where our civic engagement arm was working with the Office of Civil Defense, and all the news groups were using the same hashtag, because the Philippines gets an average of 20 typhoons every year. In real time, for real people, the geotagging of Twitter helped rescue people, right? Helped save lives. But that was before all the crap came in.
And when did that happen? By 2018, MIT had a study that said lies spread six times faster on social media. They figured out through A/B testing that you can keep people scrolling longer if you feed them lies, and then if you incite fear, anger and hate — this is data we found in the Philippines in 2017 — if you push fear, anger and hate, i.e., the worst of humanity, then it spreads even faster.
That’s the outrage economy that we are all living in today. I’ve gone as far as calling it pumping toxic sludge directly into our nervous system. And then there is the work you do, that we do at Rappler: the fact-checking. People will tell you it doesn’t matter if you fact-check, because people don’t care. But guess what? You are the only anchor. Facts are the only anchor in our shared reality.
And that’s what’s happened since 2018: it’s gotten even worse. And we’ve seen this, right? In 2016 I was in Mountain View, and I was telling the people there, journalists at that point, what was happening to us in the Philippines, where I was getting an average of 90 hate messages per hour. Per hour. I’ve never lived through anything like this. And I said, you know what’s happening to us? It’s coming for you, because these are global platforms. We just happen to be the petri dish right now.
But Slovakia in Europe is the petri dish. Georgia is a petri dish, right? What’s the petri dish? Anywhere you have a significant social media penetration rate. You saw Canada go from 37% to 85% social media penetration. And that’s when things changed. You can almost track the rise of the lies to that. But sorry, let me finish. So where are we today?
Chetwynd: That’s what I was gonna come to, today. If I track my Maria tracker of the way you frame what’s happening now, it’s pretty dystopian, right? I think several phrases always stick out, which are worth repeating: that we’re standing on the rubble of the world that was, and that an atom bomb has exploded in the information ecosystem. Can you unpack those two phrases? Because that’s where we are in your mind at this point, right?
Ressa: So, the atom bomb silently exploding in our information ecosystem. I think I said that in 2021, at the Nobel lecture, right? And guess what? It got worse. So how can it get even worse?
You’re not gonna walk out depressed, because we all have to be energized. The battle is now, right, and it is a battle for facts, which means you really are important. Why do I say that? Especially post-COVID, even before COVID, what we saw was that, because lies spread faster — literally, you know Stranger Things, that Netflix series where they have to go into a world that is literally called “the upside down”? And the upside down kind of looks like our world, except that it’s gooey and slimy and yucky and it’s full of monsters. I think that’s the world we live in today. And what do I mean by this atom bomb exploding?
The power structures are completely different. You’re seeing it play out; news journalists will know this, Phil. We saw Putin visit North Korea twice. We saw Modi. I mean, let’s just talk about the way the world is shifting geopolitically; that is directly connected to the vote, right? V-Dem in Sweden said that, as of January this year, 71% of the world is now under authoritarian rule. 71%. And this year is the tipping point. Why? Because half the world is gonna vote.
You don’t have to worry only about the machines, the infrastructure, which is what, say, in the United States, (the Cybersecurity and Infrastructure Security Agency) does. You have to worry about the people who are going to vote, because social media has become a behavior modification system where, if you pound a lie a million times, it becomes a fact. You cannot tell the difference between fact and fiction, and that’s before generative AI. So, the atom bomb exploding.
Let’s start with this. I actually told this to the Pope: isn’t it against the Ten Commandments for people to lie? And he listened. He was listening very closely. But think about it, right? Who is being rewarded with distribution on social media? The liars, the people who lie. Right? Because six times faster. And I would wager that when Elon Musk bought Twitter and turned it into X, it got even worse.
And then, here’s the other part that’s shocking to me. And I know — thank you, Meta, TikTok, for funding fact-checking — but, frankly, you just wanted distance from actually doing the work yourself.
(Applause.) We’re frenemies.
Chetwynd: You sort of nicked my next question, because I was coming to responsibility. Right, responsibility. And certainly in this room, and in society in general, there is our awkward interaction with the tech platforms, which are, as you say, our funders.
Ressa: Thank you for funding us.
Chetwynd: At the same time, certainly in your mind, some of the language you’ve used around Mark Zuckerberg’s personal responsibility for the things that you’ve seen in the Philippines and elsewhere is strong. So how do we reconcile that? Rappler remains a member of the third-party fact-checking project. I shouldn’t think Meta has any kind of client or partner that gets quite the amount of shit that it gets from you. So, how does that work? Why are you still there?
Ressa: So first, not to be facetious, but truly, I stepped away from the editorial arm of Rappler after I kept getting all of the lawsuits and the attacks on social media, because being a journalist means you’re handcuffed, right? You’re not going to curse. I’m not going to curse at the people who are lying, at the power. Never underestimate the power of arrogance and ignorance, especially if the algorithms favor them, right? So starting in 2020, I moved away. Meta does not deal with me in terms of our fact-checking program. You have Gemma Mendoza, who is here, who is their partner.
I try very hard not to let what I say impact Rappler. Please, Meta, don’t let it impact Rappler. Or TikTok. I mean, I do have a story about TikTok that shows you something about how the world has changed.
But look, I wrote a book, How to Stand Up to a Dictator. It was just translated into Georgian, and it’s so interesting: the translation is one of the most aggressive. It’s How to Defeat a Dictator. I said how to stand up; we weren’t sure yet we were gonna win. But Georgians want it defeated, right? So that was interesting. But in my book, I talked about two dictators, Mark Zuckerberg and Duterte. And frankly, Rodrigo Duterte is out of office, but the bigger dictator is Mark Zuckerberg, and part of it is because he’s not an elected official.
And yet he, along with Elon Musk — what has Elon Musk done to Twitter? I can’t even call it X yet. I was on the stage right after him in Cannes, and he has a vision for news and facts that talks about the wisdom of crowds. But I did challenge that, because the wisdom of crowds requires trust, which is completely off. There are five things; you know, The Wisdom of Crowds is a book by James Surowiecki.
I think X now only has two of the five: the people are there, right? So what next? You can’t have the wisdom of crowds. What you’re seeing on social media is a mob, and a mob creates a chilling effect. A mob changes reality. So, to the sponsors, I will appeal to you the way (the Association of Southeast Asian Nations) appealed to the dictators, and we know that had limited success, but I really hope you look at the world today and look at constructive engagement, right? You are the only ones with the power and the money to literally do something right now to protect democracy, to prevent genocide.
And I’ll go back to 2018, which is when, in Myanmar, in my part of the world, both the U.N. and Meta sent investigative teams, and they both came out with the same result: that Facebook had enabled genocide. Why has nothing been done? Every day of inaction causes harm. You don’t have to take responsibility for it. We will. Because we want it to change.
Chetwynd: Maria, I want to come back to this.
Ressa: Sorry, sorry, not to put Facebook on the spot. But I’m sure we will speak; there’s a lot of things we say privately. We really are frenemies. But please, again, to go back to that moment: the window is closing, and what I’m worried about is that you say you don’t have the incentive to fix it. We’re not asking you to fix it. We’re asking you to just put in place the little measures that you put in place in 2020, when you turned the news ecosystem quality towards facts, and it worked.
And when you blocked invitations in Facebook groups, right? That’s a critical one. After the 2020 elections in the United States, they realized they had a break-glass measure: they put friction in the mix. They made every person joining a group have to be admitted by the administrator. And that little thing was actually very useful, because when they took it off — and they took it off shortly after the 2020 U.S. elections — it helped the growth of Stop the Steal, a new Facebook group; it grew by 360,000 a day, right? So please, do those little measures now, when it matters.
Chetwynd: I wanna come back to this question of integrity of facts, which again is at the heart of where we’re at. Sometimes it’s a sort of surreal conversation. It’s something that most people in this audience battle with every day. (Maria: Thank you.) It’s something that’s often not appreciated, I think, by mainstream media; it’s something that’s often politicized. Hence the almost surreal use of the word censorship when you’re just trying to get to the facts. It seems almost hopeless at this point in time, with the competing narratives. How do we get towards some sort of integrity of facts?
Ressa: So first of all, we have to embrace that everything is political, right? And for journalists — AFP has journalists in Gaza — it becomes like this: we cannot say what we see and what we mean now without a chilling effect.
So folks, welcome to our world in the Philippines under Duterte, right? Americans have an idea that they have free speech. But, you know, I gave the Harvard commencement speech, and unless you’re willing to be attacked by both sides, or any side, you don’t speak.
So that is alarming. Information integrity. A big-picture, macro one: thank you, Brazil. Brazil is the leader of the G20 Dialogues, and it has put information integrity on the G20 agenda. And we had G20 Dialogues in Brazil itself.
Why is Brazil so critical on this? Because, like the Philippines, they moved from hell to purgatory. They went from Bolsonaro to Lula, right? But the interesting thing here is that information integrity, the core of what we do as fact-checkers, changes reality. And let’s go to this one idea that is embedded in every tech company, especially ad tech, where they say: look, personalization is the be-all and end-all. We will personalize everything for you. When this came out on Twitter in 2013, 2014, I was like, really, personalization? Well, okay, yeah. In 2014, if you clicked on a sneaker, you’d be followed by sneakers for two weeks.
But today, the tech has gotten so good that personalization means every single person in this audience can have their own personalized reality, right? You can have your own reality, with your belief systems, with your cognitive biases kicked in. You’re not gonna listen to facts in your own personalized reality. You know, if all of us in this room had our own personalized reality, this room would be called an insane asylum.
Really, we cannot have our own personalized realities. I’ve said this over and over: without facts, you can’t have truth. Without truth, you can’t have trust. Without these three, we have no shared reality. You can’t begin to solve any problem. We cannot have journalism or democracy, or solve, oh yeah, that big thing we’re all facing, climate change. It’s existential. So if we don’t do something right now, the world will tip towards, and I’ll use Madeleine Albright’s word, fascism. When it moves to fascism, would you trust your dictator to be better than the chaotic rule of a democracy? Democracies aren’t perfect. They never have been, but in the world’s experiments with systems of governance, they have given us all a voice. Now, technology — we haven’t talked— Sorry. Let me just throw in generative AI, right?
Chetwynd: I was about to come to that. This is your dystopian vision. It’s terrible to say Maria has a dystopian vision, because you’re one of the most positive people you’ll ever meet. But the point here is, before we’ve really looked at what generative AI brings to the table in terms of disinformation and the information ecosystem, it feels like we’re standing on a precipice. How are we, the fact-checking community, the journalism community, supposed to interact with the AI companies, knowing that we really haven’t resolved the issues we have with the tech platforms previously? You’ve just enunciated the issues that we still have. What are we supposed to do?
Ressa: So, it’s gonna get worse. We already know this, right? We’ve already seen generative AI. There are several generative AI versions of me that sound like me, that look like me. I’m selling crypto that was seeded by a Russian scam network, and I’m trying to figure out whether that’s connected to politics. So, if you see me selling crypto, don’t believe it. But, you know, some CEOs and CFOs in the Philippines called me, asking me about the crypto.
So, what is it? Cory Doctorow coined the term the “enshittification” of the internet. It’s in full swing. As of January this year — this is an academic paper, and I’ll tweet it, I’ll X it — 57.1% of the internet is now low-quality content. So if the lies don’t push you out of the public information ecosystem, if the hate doesn’t push you out, the enshittification will. And that’s part of what we are going to deal with today, right? You need to be the foot soldiers, to keep checking the facts.
Because here’s what we did in the Philippines for our May 2022 elections. Google News Initiative was our partner, Meedan was our platform, and we took 150 different organizations and built a four-layer, whole-of-society approach for facts. Sixteen news organizations fact-checking. The second layer we called the mesh, the distribution layer, because our biggest problem was gonna be distribution. The distribution layer was 116 different civil society groups and NGOs. Human rights groups were in, artists joined us, the church joined us, business finally joined us. We need business in the group. That layer is critical; I’ll come back to it.
The third layer was academics, so Meedan helped take all of the data, which we processed for the academics. Every week, they did a webinar where they showed: these are the lies being spread, these are the candidates who are benefiting or being attacked.
The fourth layer is the critical one: legal groups, left, right and center. In three months, they filed more than 21 cases to protect the four-layer pyramid.
When we came out with this in the Philippines — and again, I have to thank our partners in doing this — the solicitor general of the Philippines immediately filed a case at the Supreme Court calling fact-checking prior restraint. They were trying to make fact-checking illegal. It took months, but the Supreme Court threw it out. And you know what? Somebody’s gonna have to stand up and take the blow. So I happily did, right? I mean, what’s one more case at that point? Why is that important? Because the fact-checking was the foundation.
The fact-checking was the foundation of reality. Sorry, let me answer your question. What are we gonna walk into? What I hope to do now is— before I answer that, I see all the other—
Let me be equal opportunity with the weak points and the appeals, because I appealed to Meta. Let me talk to YouTube also, because they’re also a sponsor; I’m just looking at the sponsors. Again, thank you for everything you’ve done in the past, but you can do more. YouTube was fantastic on Ukraine, right? They pivoted very quickly and took down the lies. Are they as good on Gaza?
And for YouTube and TikTok, let me give an example. I think my NDA has already expired. During the 2022 elections in the Philippines, we came out, like many of you do, with spreadsheets of thousands of liars creating disinformation networks, and their videos, because video is it, right? And we gave these thousands — thousands — to both TikTok and YouTube. TikTok took 98% down in 36 hours. I’m not completely on your side, TikTok; you have a lot of work to do also. YouTube took a longer time before they did. And when I brought it up to YouTube, I just said, TikTok already took them down. And they were like, yeah, but they’re China. Which — I know ByteDance is also partly an American group.
But again, please, there is no room to duck. I said this to Rappler when we got the shutdown case. This is the moment for every person in every one of the tech companies, and everyone here. Give me two seconds.
My journalist friends get upset when I get emotional. Journalists have no feelings. I think this is the moment, and this is what I told Rappler when we got the shutdown order: this is gonna be the moment where, a decade from now, you’re gonna wanna look up and know you did everything you could. Everything. Because the world will be transformed.
Chetwynd: Let’s move to something calmer. We’ve spoken here a lot about the ecosystem, the moment we’re at. You’ve been incredibly eloquent about the moment we’re at, the need for action. I’m gonna bring in the issue of regulation. You’ve been talking for a long time about Section 230 of the Communications Decency Act in the States. You worked a lot on helping to get the Digital Services Act. The discussions on AI legislation at the moment — we’re running behind, as it were. Is this something that we can feel confident about? The DSA surely at this point is a test, right? The legislation is in place. See what happened in Gaza; just take one look at X during Gaza. The moment is now, right? What’s gonna happen now? What’s the follow-up?
Ressa: So I think, first, there are so many layers in this. You have to understand, I know that we fact-checkers were created by the tech companies, so I appeal to the tech companies to continue the funding for fact-checkers. There’s a big debate among journalists, and this was a debate I had in Georgia: should journalists take money from tech? If you’re running a news organization, that’s not a debate at all. It’s academics who aren’t in the battle itself. … If you think about it, fact-checking is part of journalism, but the distribution system has separated them, right? So we need to make sure the facts are there, that the world is anchored in facts. That’s the first.
The world still thinks that everything is about content. Our biggest battle in the EU was really to separate content from data: the transparency of data, the algorithms and the tech design itself. While you are in the trenches putting out the facts, you have to tell people that this is, I think, more important than the content itself, because I don’t care if my crazy neighbor says something crazy. It isn’t censorship, and it isn’t about free speech being stifled. It’s about what gets the widest distribution.
My crazy neighbor, who believes ducks can talk, should not be the front page. And often what happens now is that the most outrageous lies, the most incendiary content, are the ones that get the front page. And when I say front page in today’s day and age, I mean the widest distribution. So we go back to the design of the companies. You can make a fraction less money to preserve the reality of the world, right? And I say it as a business owner and a business head who has actually spent a lot more in legal fees to protect the public information ecosystem. That’s the second. Beyond the actual work that we do: where are the news groups in terms of collective action, both toward governments and toward our public? The Writers Guild of America, after generative AI came out, acted collectively. We news heads are not acting collectively on anything, because I think we still have the vestigial tail; we still think we have power. We’re the first in battle.
We had a really nice discussion this morning, because in the end, it’s about the business model. It is about power and money. We are continuing to do our jobs as journalists, but it’s kind of like the dam is already breaking, and we’re putting our fingers in the dam to hold it up. We have to find a sustainable business model. We have to stop the breakdown of trust, which the tech companies, the platforms, continue to accelerate because they can make more money. So how do you fix it? Legislation.
Chetwynd: I was gonna say, where ultimately is the money gonna come from, for media, for journalists, at these points of crisis? We see the collapse in referral traffic from Google, from Meta, to news sites.
Ressa: Yeah, we haven’t talked about– I think digital news will die in less than a year.
Chetwynd: So where is that money gonna come from?
Ressa: We’re doing knee-jerk reactions. I think about the information ecosystem as a polluted river. So here’s this river, and what we tend to do is take a glass of water from the river, clean it up, and then throw it back into the river. We have to go upstream to the factory of lies that is polluting it, and shut that down.
So the answer to your question, in the medium to long term, is education, right? It’s education on AI, from elementary schools all the way to policymakers. It is education in what journalism is. So: education. Medium-term: legislation. The legislation, the AI Act, doesn’t take effect for another two years. And yet we are Pavlov’s dogs. And you know, if you are from the Global South, if you are a woman or LGBTQ+ in the Global South, we’ve been colonized twice now. Christopher Wylie, the Cambridge Analytica whistleblower, was the first to say this. He said, colonialism didn’t die. It moved online.
And that code that came to us in the Global South didn’t have our culture, our language, our customs — it didn’t have any of that — and yet it was an overlay on how we lived. So, colonization. But the second one is that we’re the ones actually in those factories cleaning it up for the tech companies the first time; and the second time, on generative AI, it’s the Global South that is cleaning up the tech so that you can use it in the Global North.
But sorry: medium-term, legislation. Right now, the legislation, the DSA, the DMA — the companies just leapfrogged over it. And yes, it’s making some small headway, but it isn’t enough. And that’s part of the reason we need the companies with us. It’s part of the reason I’m here. Our task is actually to make sure the public knows and understands that your business decisions for profit — surveillance for profit — bring with them harms that governments are gonna have to deal with, that the people are gonna have to deal with. So there’s that; we will continue doing that in the medium term.
In the short term, it is just us, which is why we did this four-layer pyramid. It’s an influencer marketing campaign for facts; think about it like that, right? But it’s gotten worse. So what else are we going to do? Collaborate, collaborate, collaborate. This is a brave new world, worse than 1984, because we’re just in the process of turning into the upside down.
I always prepare for worst-case scenarios. And I keep thinking, okay, post-2024, what if democracy dies? What will we do? The battle doesn’t end. It just becomes harder, and it lasts for decades. That’s why, while the window is still open today, our tech partners: join us in this. Because it will not leave you and your children alone. It will come for you. This is our collective battle. So in the short term, if you have a cell phone, you’re with us. The question is, are you gonna fight the battle smartly? Are you going to fight it collaboratively? Are you going to build a better world for your children?
Chetwynd: Maria, I think a lot of people in the audience — you know, fact-checkers are often really in the line of fire. Even within AFP, the people who work on fact-checking can get way more harassment than regular journalists. Things have become banal: the death threats, whatever they may be. You’ve been in the eye of that storm for a long time. What advice can you give to this community, many of whom are suffering? How do you get through it? How have you managed the levels of hate and threat that you’ve had to deal with now for a decade?
Ressa: So first: no, you are not alone. When I was getting the worst of it — you know, we had a senator who once said she eats death threats for breakfast. We get death threats all the time, and in 2016, in the first six months, when we were getting attacked (it was largely on Facebook at that point), we had to increase security six times in a year and a half.
Let me say this: what you are feeling, what you are going through — you’re not alone. And if we collaborate and bring that together, then we can do better. That’s the first. I think the second thing is that it is normal to be afraid. It’s normal to be afraid. But that’s also another reason why collaboration becomes critical. Having said that, it matters who the people in the room with you are when you make a critical decision.
Courage spreads. And part of what we found at Rappler — I had a core team of the four founders, and we knew we were under attack. I knew I could go to jail. I knew. I was wearing a bulletproof vest when I was in Manila.
And yet, it’s gallows humor. Sorry, I won’t even use gallows. You find a way out of it by holding the line. And what did we do? We decided that only one of the four of us can be afraid at a time. You have to pass the fear, right?
Chetwynd: Literally how does that work?
Ressa: Like, literally, there’s a squeegee and you pass the fear, right? And part of it is because you’re leading a team. You don’t want the fear cascading. And the other part is, if you’re a leader, anytime you’re not clear, it’s harder for your team; their fear grows. So pass the fear. Have a team of trusted people. And what gives you courage? Your shared values.
Because this is the moment, right? Like, literally, we decided that when we look back a decade from now, we're gonna know we did everything we could. So pass the fear. Arm your people with knowledge. Know that they're going to be attacked. And what we did at the beginning was something called the swarm. And frankly, if we did this together, we would be far more effective. If you're the one under attack, it doesn't help for you to defend yourself, because it's exponential, right? Like, what am I gonna do?
Like, at one point I went to Facebook and said, you know, I'm getting 90 hate messages per hour, and they were like, yeah, but you're a public figure, you should respond. There aren't more than 24 hours in a day. Ninety hate messages per hour; I couldn't even physically respond. So these are some of the things. What we did is the person being attacked sits back, and everyone else swarms. Just that alone.
We didn't do it with bots, by the way. That was the other thing. Someone told me, you know, Maria, with everything you're doing in tech, you could have used bots. I was like, it's deceptive. It's unethical. I don't know how to say that any louder. Sorry. So standards and ethics are both our mantra and our armor. So what else can we do? We're not punching bags. Please, no learned helplessness, because that's the other thing that happens.
Understand that as the tech companies are redoing, rebuilding the world, we can too. We rebuild it. We build it to be what we want, right? And this world as toxic sludge injected into your bloodstream? That's not what I want, right?
So last, I'll just say this on the big picture, because don't get lost in the nuts and bolts. There are three big things we need to keep in mind, and these were the demands in the 10-Point Action Plan that Dmitry Muratov and I rolled out in April 2022. The first: stop surveillance-for-profit, surveillance capitalism. Consumer Reports just said that every account on Facebook has at least 2,300 separate data points. 2,300. And then these companies take our data and build a “model” of us that knows us better than we know ourselves. Get rid of the word model, because these euphemisms take it away from what it really is. Use the word clone. They cloned us. And then artificial intelligence, for the first time, takes all of our clones, and that becomes the mother lode database for microtargeting. Microtargeting is not the old advertising. Microtargeting is matching your weakest moment to a message and selling it to a company or a country. So stop surveillance-for-profit.
The second: stop coded bias. Because if you are a woman, LGBTQ+, if you are brown or Black, if you are marginalized in the real world, you are marginalized even more online. That's the second point. Gendered disinformation, these recidivist networks that attack women, is literally pounding women into silence, and beyond journalists and activists, many women politicians are opting out of public life. Maia Sandu in Moldova is under intense information warfare. Russian. In Georgia, you have women who are being attacked by both Russian disinformation and Chinese disinformation, right? The playbook is going around, all right. So stop coded bias.
Third, because these two things lead to information warfare, which is pounding people into silence. The third: journalism as an antidote to tyranny. Because not everyone is created equal on social media. What happened to the experts? What happened to the standards and ethics? You know, it's funny, because Elon Musk was talking about the wisdom of the crowds, but that turned into a mob. I think what we're living in now is the cult of the amateur. Let's elevate voices as journalists, right? Anyway, sorry, I'll shut up. I can talk forever, but please just know you're critical, because you have to tell your family and friends. We need to move from the virtual world to the real world. We need to talk to our partners and every one of the tech companies and appeal to their consciences.
Chetwynd: Maria, we could go on. I think I got through about a third of my questions, but we could go on. I just want to finish by saying something that always comes back to me when I speak to you: as you keep saying, it can get better. I love your hell-to-purgatory thing. I've never been to purgatory, but I figure it's better than hell. So that's incredibly inspiring. We've got about seven or eight minutes, and I think there's a microphone somewhere, I'm told.
Ressa: Sorry, can I add one last thing? But please get the microphone out for the questions, because I'd love to hear and answer your questions. The last thing is that, you know, big tech is also helping us build, right? Because in the end, the solution for all of us is not just content, it's tech, so we need to build deliberative technology. And again, kudos to the companies that have helped us. We are now building a chat app on the Matrix protocol. The Matrix protocol is an end-to-end encrypted, decentralized system that is used by France and Germany. God help them in their elections.
But we connected it to the login of a news organization, so you get rid of all of the, you know, DNS servers, all of that stuff that Mastodon will ask for. We rolled this out in December last year, around Christmastime, and we are getting ready for the 2025 elections in the Philippines. Please take a look at it, because it is safe.
I'm making the bet that the enshittification of the internet is gonna be so bad that people who want information for their vote, who want information about where the flooding is happening, who want information, are gonna come to our app in the Philippines. We're rolling it out to anyone who is interested. I think it's that kind of moment, and I thank again the tech companies who have supported us through this. We're frenemies.