Voters: don't let misinformation take away your power

February 28, 2024


The most powerful way to support a cleaner, reality-based information ecosystem is not to add more disorder and vitriol, but to learn to slowly and deliberately engage with the Internet.


The year 2024 has been dubbed the “year of elections,” in which more than two billion people will have the opportunity to vote in races of great importance around the world.

Alarmist headlines abound.

You may have been told, “Artificial intelligence will supercharge misinformation in 2024.”

Or, perhaps, a darker version: “Disinformation will be unstoppable during the election year.”

Neither claim is entirely false, but both deny voters any agency.

And in a time of lying for power or profit, it is essential that we protect and assert that agency when we head to the polls.

I have been studying misinformation for almost a decade.

My first job in Washington was working on democracy support programs in Russia and Belarus, and I watched the Kremlin rehearse on its citizens the tactics it would later use in the United States.

It honed those tactics further during its first invasion of Ukraine.

I was an advisor to the Ukrainian Foreign Ministry in 2016-2017, and I watched from Kyiv as my own country shook in response to revelations that Moscow had interfered in our democratic process.

My colleagues in Ukraine were not surprised.

Since then, I have dedicated my work to bringing home to the United States the lessons our allies learned the hard way.

One lesson that has always persisted is that we must help people learn to navigate an increasingly contaminated, confusing and fast-paced information environment.

I am sad to say that we have not made much progress in this regard.

Too often we turn to technical solutions to solve intrinsically human problems.

Take the recent manipulated robocall impersonating Joe Biden ahead of the New Hampshire primary.

Someone used artificial intelligence (AI) to generate fake audio of the president of the United States urging Democratic voters not to turn out for the party's primary in the state this past January.

If they did, they would help the Republicans, the imposter Biden told them.

(Not only was Biden's voice fake, but so was the premise. In New Hampshire, Democrats and Republicans vote in separate primaries, so Democratic turnout would not affect the Republican race.)

A few weeks after the robocall came to light, the Federal Communications Commission banned the use of AI-generated voices in robocalls, an unusual action in American politics for its speed and forcefulness.

But AI-generated audio, photos and videos still have many other vectors through which to penetrate the United States' information sphere in this election.

They could be sent from user to user or in closed Facebook, WhatsApp or Telegram groups.

And there it will be much more difficult to trace the origin and distribution of these falsehoods, not to mention more difficult to crack down on them.

That is why it is essential that people reject the passive consumption of information that has become endemic to the digital age and begin to think critically about the context and content of the information they consume.

In the case of the AI-generated robocall, I'm not just talking about listening for the characteristics of AI-generated voice files, which are difficult for most people to detect.

I also mean thinking about the circumstances surrounding the call.

Would the democracy-loving Joe Biden we know really urge voters to stay home under any circumstances?

Do the robocall's claims about “helping Republicans” even make sense?

Beyond that specific incident, voters should consider how the information they consume makes them feel.

We know that social media news plays on our emotions: the more infuriating the content, the more engaging it is and the more likely it is to go viral.

So when we feel upset by something we see on the internet, we should step away from our devices.

Let's take a walk.

Let's calm down.

If, after a few minutes, we are still thinking about the content, there are some simple things we can do to evaluate how to proceed.

First, let's consider the source.

Is the uploader or the author known?

Is it an organization or an individual?

If it is an individual, does their account appear legitimate?

Was it created recently?

Does it have friends or followers?

Does it post in a way that feels human and organic?

Second, if we are seeing newsworthy information, let's check whether other well-known media outlets across the political spectrum are reporting it.

Third, if it is an image, a reverse image search tool, which tells us when an image was first published on the internet, can give us a clue as to whether it has been misattributed, is deceptive or has even been manipulated by AI.

This list of questions is neither exhaustive nor foolproof, but it will help you do something important while browsing: slow down.

Today's information environment is not only polluted, it moves quickly.

We have seen prestigious media outlets fumble their reporting and attributions in the incessant fight for clicks and views, and we know that disinformers share alarming or sensational content for power or profit.

We don't have to play along.

In this “election year,” the most powerful way to support a cleaner, reality-based information ecosystem is not by adding more disorder and vitriol, but by learning to engage slowly and deliberately with the internet, and by rewarding politicians who approach their work and campaigns with the same ethic.

Nina Jankowicz is an expert on disinformation, democratization and digital hate, and vice president for the United States of the Center for Information Resilience. She is also the author of two influential books: one on misinformation (How to Lose the Information War) and another on misogynistic online harassment (How to Be a Woman Online).


Source: elparis

