November 5, 2024

Last week, a video went viral claiming to show Haitian immigrants voting more than once in Georgia. That Frankenstein’s monster of recent misinformation narratives turned out to be the work of Russian operatives.

Welcome to Election Day 2024, which feels eerily similar to Election Day 2020. “Zombie claims” about voter fraud, noncitizens voting and nefarious ballot dumping by elections workers have already flooded social media. Just take a glimpse — I implore you not to linger — at X’s Election Integrity Community and you’ll find a dizzying feed of conspiracy theories and targeted harassment of elections officials and postal workers.

But, even with the threat of artificial intelligence-generated images, audio and video, the same simple media literacy tips that helped identify misinformation in previous elections still work today. In fact, if you’re a teacher reading this, Poynter’s digital media literacy initiative MediaWise just released a curriculum to help students separate fact from fiction.

Check your emotions and embrace uncertainty

It might take days for the final results of the election to be tallied, which also means days filled with uncertainty and anxiety.

Disinformation often spreads by triggering strong emotional responses.

Watch out for common manipulation techniques that are designed to prey on your emotions and the uncertainty of the election:

  • Emotional or inflammatory language: Claims about “dead voters” employ this tactic to frighten you into engaging with misinformation
  • Cherry-picking: Posts about voting machines “flipping votes” or otherwise malfunctioning may contain a kernel of truth, but such glitches are rare and usually have simple explanations
  • Scapegoating: Allegations of noncitizen voting schemes aim to cast unwarranted blame on a group of people

Take a breath, step away from social media, and check fact-checking websites, official election websites and trusted news sources for the facts.

Use lateral reading to verify social media posts

If you come across a video on X claiming that a voting machine won’t let a Kentucky voter cast a ballot for Donald Trump, check out who posted it. Are they an expert? A journalist in Kentucky?

Leave X (timeless advice), open a new tab and use a search engine to find out more about the user, in this case an internet personality named Nick Sortor. The most recent news involving Sortor was about a misleading video he posted about actor and wrestler Dwayne “The Rock” Johnson. And, according to his X bio, Sortor is based in Washington, D.C. — far from Kentucky.

The key here is to open multiple tabs and read laterally about the person or organization behind the post.

Check the timestamp — context matters

Old photos and videos are frequently recycled to make false claims about current events. Before sharing anything about polling places, voting machines or election workers, check when the image or video was first posted.

Do a reverse image search using Google Lens or TinEye. Both will find matching images from around the web, which can help you pinpoint the original date, time and context behind a photo. Pro tip: For videos, take a screenshot and reverse image search a prominent frame.

This technique can also confirm whether a photo has been manipulated, as PolitiFact found in this fact check debunking an altered image claiming to show Kamala Harris and Sean “Diddy” Combs.

Read upstream and check out the original source

When content goes viral, it often gets filtered through multiple social media posts, each adding its own spin. That’s why reading upstream — finding the original source — is crucial.

A recent Threads post claimed Joe Rogan would not endorse Donald Trump after having the former president on his podcast. But, while a lot was discussed during the three-hour interview, Rogan never mentioned who he would be voting for — or who he wouldn’t.

You may also see videos of politicians or celebrities that have been clipped out of context. Make sure you’re watching the full original video before engaging with the post.

Watch for AI-generated content

While generative AI misinformation hasn’t been a major threat in elections around the world, there have been notable examples ahead of the U.S. election. Fake robocalls impersonating President Joe Biden in New Hampshire and AI-generated images targeting Harris come to mind.

Here’s what to look for when checking election content that might be AI-generated. But keep in mind that as the technology advances, these clues won’t always be reliable:

In images:

  • Warped text in campaign signs
  • Oddly placed shadows or lighting
  • Misaligned facial features and incorrect number of fingers
  • Irregular backgrounds or blurry edges

In audio and video:

  • Unnatural voice patterns or robotic undertones
  • Lips not matching speech
  • Strange body movements
  • Inconsistent audio quality
  • Abrupt changes in background noise

While platforms like TikTok and Meta now require labels on AI-generated election content, enforcement isn’t perfect. When you think something might have been created with AI:

  • Check if the post has an AI disclosure label
  • Search the image using Google Lens or TinEye and see if you end up on an AI forum or subreddit
  • Use lateral reading to investigate the user who shared it
  • Check the candidate or election official’s social channels
  • Consider whether the content is trying to provoke an emotional response

The most important thing to remember? You don’t have to share everything you see about elections on social media. When in doubt, verify first — or don’t share at all.

Alex Mahadevan is director of MediaWise, Poynter’s digital media literacy project that teaches people of all ages how to spot misinformation online.
