
Voters: Don’t be disempowered by disinformation

The most powerful way to support a cleaner, reality-based information ecosystem is not by adding yet more clutter and vitriol to it, but by learning how to slowly and deliberately engage online

Former president Donald Trump speaks to the media in New Hampshire, on January 23, 2024. Matt Rourke (AP)

2024 has been dubbed the “year of elections,” when over two billion people will have the opportunity to cast ballots in consequential contests around the world. Alarmist headlines abound. “Artificial Intelligence will power disinformation in 2024,” you have likely been told. Or, perhaps, a gloomier version: “Disinformation will be unstoppable during the year of elections.” Neither assertion is entirely false, but both deny voters any agency. And in an age in which lies are told for power or for profit, it is critical we protect and assert that agency as we head to the ballot box.

I have studied disinformation for the better part of a decade. My first job in Washington was working on democracy support programs in Russia and Belarus, watching as the Kremlin auditioned on its own citizens the tactics it would later employ in America. It explored them further in its first invasion of Ukraine. I served as an advisor to the Ukrainian Foreign Ministry in 2016-17, watching from Kyiv as my own country reeled in response to revelations that Moscow had interfered in our democratic process; my colleagues in Ukraine were not surprised. Since then, I’ve dedicated my work to bringing the lessons our allies learned the hard way to bear at home. One lesson that has always come through is that we must help people learn to navigate an increasingly polluted, confusing, and fast-paced information environment. I’m sad to say we haven’t made much progress on that front.

We too often jump to technical solutions to try to solve what are inherently human problems. Take, for example, the recent deepfake audio robocall of Joe Biden ahead of the New Hampshire primary election. Someone used artificial intelligence to generate fake audio of the U.S. president urging Democratic voters not to turn out for the state’s January party primary; doing so would help the Republicans, fake Biden told them. (Not only was Biden’s voice faked, the sentiment was false as well. In New Hampshire, separate ballots are drawn up for Democratic and Republican voters, so turning out for the Democratic primary would have no effect on the Republican one.) Within weeks of the robocall surfacing, the Federal Communications Commission banned the use of AI-generated voices in robocalls, an action rare in American policymaking for its speed and decisiveness.

But AI-generated audio, photos, and video still have plenty of other vectors into America’s information sphere this election. They might be messaged user-to-user, or shared in closed groups on Facebook, WhatsApp, or Telegram. And there, it will be much more difficult to track the provenance and distribution of such fakery, and more difficult still to crack down on it. That’s why it’s critical that individuals reject the passive information consumption that has become endemic to the digital era and begin to think critically about the context and content of the information they consume. In the case of the AI robocall, I don’t mean simply listening for the hallmarks of AI-generated audio, which are difficult for most people to detect. I mean thinking about the circumstances surrounding the call. Would the democracy-loving Joe Biden we know really urge voters to stay at home under any circumstances? Do the robocall’s allegations about “helping the Republicans” even make sense?

Beyond that specific incident, voters must consider how the information they’re consuming makes them feel. We know that social media feeds operate on emotion: the more enraging the content, the more engaging it is, and the more viral it is likely to become. So when we feel ourselves getting worked up about something we see online, we should step away from our devices. Take a walk. Calm down. If, after a few minutes, we’re still thinking about the content, there are a few simple things we can do to assess how to proceed. First, consider the source. Is the poster or author well-known? Is it an organization or an individual? If it’s an individual, does their account seem legitimate? (Was it created recently? Does it have friends or followers? Does it post in a way that seems human and organic?) Second, if you’re looking at newsworthy information, check whether other well-known outlets across the political spectrum are reporting it. Third, if you’re looking at an image, a reverse image search tool, which shows when an image first appeared online, can clue you in to whether it has been misattributed, edited in a misleading way, or perhaps even deepfaked.

This list of questions isn’t exhaustive or foolproof, but it will get you to do one important thing as you browse: slow down. Not only is today’s information environment polluted, it is fast-moving. We have seen reputable news organizations make glaring mistakes in their reporting and attribution in the incessant struggle for clicks and views, and we know that disinformers share alarming or sensational content for power or profit.

We needn’t play into that. This “year of elections,” the most powerful way to support a cleaner, reality-based information ecosystem is not by adding yet more clutter and vitriol to it, but by learning how to slowly and deliberately engage online, and by rewarding the politicians who approach their jobs and campaigns with the same ethos.

Nina Jankowicz is an expert on disinformation, democratization, and digital hate, and the Vice President, U.S., at the Centre for Information Resilience. She is the author of two books: How to Lose the Information War and How to Be a Woman Online.
