Recently, I tried a little experiment on social media. I set up two fake Twitter accounts. On one, I followed a bunch of left-leaning accounts. On the other, a bunch of right-leaning accounts.
Within just 10 minutes, the “For You” feed on both accounts was filled with nothing but political content. Here’s the catch: My left-following account saw only left-leaning content, and my right-following account saw only right-leaning content.
I had created two echo chambers in just minutes. An echo chamber is a place where you find only posts, videos, memes or anything else that supports your beliefs. One caveat: My two test accounts created political echo chambers, but this can happen with any type of information: religious, scientific, cultural … the list goes on.
Here’s how it happened and how to avoid it.
What is an algorithm?
The posts you see on your social media feeds are fed to you via algorithms: computer processes that figure out what you may want to see or read.
An algorithm will “serve up stuff it predicts you’re going to love or clutter your feed with stuff you hate to try and provoke a reaction,” according to this article from the CBC.
The goal is to keep you on the site longer. This isn’t to say that all algorithms are bad. If you’re really into Minecraft — and so is your echo chamber — you’re going to see the posts most relevant to your interests: posts about Minecraft. And there’s nothing wrong with that.
The problem is that misinformation thrives in echo chambers. If you only see posts that you agree with, you’re less likely to be critical of false or misleading videos or memes. This is called confirmation bias. When all the posts you see reinforce your point of view, you will not encounter opposing opinions, and you may end up spreading false information.
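The feedback loop described above can be sketched in a few lines of Python. This is a toy illustration, not Twitter’s (or any platform’s) actual code; the scoring rule, the topics and the numbers are all invented for the example. It simply ranks posts by how often the user has engaged with each topic before, so one early like is enough to tip the whole feed.

```python
# Toy sketch of an engagement-based recommender (all names and numbers
# are invented for illustration). Posts whose topic the user has liked
# before score higher, so they crowd out everything else over time.

def recommend(posts, liked_topics, top_n=3):
    """Rank posts so topics the user already engages with come first."""
    scored = []
    for post in posts:
        # Score = how many times the user has liked this topic before.
        score = liked_topics.get(post["topic"], 0)
        scored.append((score, post))
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [post for _, post in scored[:top_n]]

# A mixed pool of posts from two "sides."
posts = [
    {"id": 1, "topic": "left"},
    {"id": 2, "topic": "right"},
    {"id": 3, "topic": "left"},
    {"id": 4, "topic": "right"},
    {"id": 5, "topic": "left"},
]

liked = {"left": 1}  # one initial follow/like tips the scale
for _ in range(3):   # simulate three feed refreshes
    feed = recommend(posts, liked)
    for post in feed:
        # The user engages with the familiar content they are shown,
        # which feeds back into the next round of scoring.
        if post["topic"] == "left":
            liked["left"] = liked.get("left", 0) + 1

print([p["topic"] for p in feed])
```

After only a few refreshes, every recommended post comes from the same side: the system never had to “know” the user’s politics, it just amplified the first signal it got. Real recommender systems are far more complex, but the reinforcing loop works the same way.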
And sometimes that can be dangerous.
Robert Stanford is one scary example. He pleaded guilty to assaulting Capitol police officers during the Jan. 6 Capitol riots. His defense? That he got sucked into an online echo chamber of conspiracy theories that messed with his head, influencing him to join the riot.
The experiment
Let’s take a closer look at my experiment. The two brand new Twitter accounts were linked to two new emails and named test 1 and test 2. Test 1 followed a host of left-wing users such as President Joe Biden, Vice President Kamala Harris and Nancy Pelosi. Test 2 followed right-wing accounts such as Donald Trump, Marjorie Taylor Greene and a conservative political influencer named “Catturd.” Both followed Elon Musk as well.
When new users join Twitter, they are recommended a host of accounts to follow. It’s not uncommon for new users to start by following politicians or accounts that align with their political views.
These accounts were created shortly after the Nashville school shooting, and both pages were flooded with content about that on the first day.
But each side had very different takes, with content on the left-wing account lobbying for gun restrictions and content on the right-wing account speculating about the shooter’s gender identity.
The next day, Trump’s indictment was announced and that became the subject of most tweets on both accounts.
The left-leaning account was flooded with memes celebrating the indictment, while the right-wing account was flooded with accounts supporting Trump and warning that the arrest may be a part of a larger leftist agenda.
Clearly, the tweets I was seeing on the left side of the political spectrum were completely different from those on the right. It was as if the two sides were living in totally different worlds — as if everyone on Twitter were just talking to themselves and not really listening to anyone else.
How to avoid echo chambers
So how do we avoid echo chambers?
First, diversify the news and entertainment sites that you follow to get a broad range of perspectives. Follow left-leaning and right-leaning sites. Read news stories, analyses and opinion pieces.
Critical ignoring can also help. If you don’t interact with posts that aim to gain traction through controversy, the algorithm will be less likely to put them onto your feed, making an echo chamber less likely to form.
When you encounter political, conspiratorial or scientific information, fact-check it before sharing by asking yourself these three questions:
- Who is behind the information?
- What is the evidence?
- What do other sources say?
Finally, this Wired article has one tip that I really like: Like everything, so the algorithm cannot detect a pattern and tailor your feed into an echo chamber. This can be especially effective for new accounts.
Here’s the thing to remember: Consuming the same viewpoints and content over and over can distort your perspective on reality, making critical thinking difficult. While spending time on social media, be careful not to get sucked into an echo chamber.
NOTE TO TEACHERS: This article is featured in a free, one-hour lesson plan that teaches students what an echo chamber is and how to avoid them. The lesson is available through PBS LearningMedia, and includes a lesson summary and a handout, among other resources.