
Disinformation is an unprecedented threat to the 2024 elections and the US is less prepared than ever

January 26, 2024

Highlights: Disinformation is an unprecedented threat to the 2024 elections and the US is less prepared than ever. The US presidential election comes at a time when circumstances are ideal for the spread of falsehoods, propaganda and conspiracy theories. A growing number of voters have proven susceptible to the misinformation spread by former President Donald Trump and his allies. The lie that the 2020 election was "stolen" has proven surprisingly effective with Republicans, with consequences such as distrust in future elections. Solutions to the enormity of that threat are piecemeal and distant: reviving local news and creating information literacy programs.


The US presidential election comes at a time when circumstances are ideal for the spread of falsehoods, propaganda and conspiracy theories.


By Brandy Zadrozny, NBC News

Disinformation presents an unprecedented threat to democracy in the United States in 2024, according to researchers, technology experts and political scientists. 

As the presidential election approaches, experts warn that a convergence of events at home and abroad, in traditional and social media, and amid an environment of growing authoritarianism, deep distrust, and political and social unrest, makes the dangers of propaganda, falsehoods and conspiracy theories more serious than ever.

The US presidential election takes place in a historic year in which billions of people will vote in elections in more than 50 other countries, including in Europe, India, Mexico and South Africa.

And it also comes at a time of ideal circumstances for misinformation and the people who spread it.

The lie that the 2020 election was "stolen" has proven surprisingly effective with Republicans, with consequences such as distrust in future elections. (Chelsea Stahl / NBC News; Redux; Getty Images)

A growing number of voters have proven susceptible to the misinformation spread by former President Donald Trump and his allies; artificial intelligence technology has become ubiquitous; social media companies have sharply scaled back efforts to curb misinformation on their platforms; and attacks on the work and reputation of academics who track disinformation have chilled research on the topic.

“On the one hand, this should look like January 2020,” said Claire Wardle, co-director of the Information Futures Lab at Brown University, who studies disinformation and elections, referring to the presidential contenders from four years ago.

“But after a pandemic, an insurrection, and a strengthening of the belief that the election was stolen, as well as congressional investigations into those of us who work in this field, it feels completely different.”

[Trump defies the judge again by testifying in the trial for defaming the writer E. Jean Carroll]

Research suggests that misinformation has little direct effect on voter decisions, but its spread by political elites, especially national candidates, can influence how people decide on issues.

It can also lend false support to claims whose conclusions threaten democracy or public health, as when people are persuaded to take up arms against Congress or to refuse vaccination.

Solutions to the enormity of that threat are piecemeal and distant: reviving local news, creating information literacy programs, and passing meaningful legislation around social media, among others. 

Fixing the information environment around the election involves more than just “tackling misinformation,” Wardle said.

“And the political violence and the aftermath of January 6 showed us what is at stake.”

Prepared for misinformation

The most likely Republican presidential candidate is also the former president, whose time in the White House was marked by lies spread in a failed effort to remain in power, falsehoods to which Trump continues to cling.

Disinformation in the service of the lie that the election was “stolen” – spread through a network of television, radio and online media – has been astonishingly effective among Republicans.

One consequence is distrust in future elections.

At the same time, disinformation spreaders have suffered real consequences.

Lies about the coronavirus and the elections have cost prominent doctors and news anchors their jobs.

Civil courts have awarded millions of dollars to victims of disinformation.

Hundreds of federal criminal convictions have stemmed from the January 6, 2021, insurrection. And people who allegedly participated in a plot to overturn President Joe Biden's victory, including state GOP officials, lawyers and Trump himself, face criminal charges.

It remains to be seen whether networks like Fox News or individuals like Rudy Giuliani will be as willing to promote disinformation in 2024 in the face of such consequences.

But some predictable actors and newcomers to the right-wing media have already signaled their willingness to contribute.

“Right-wing media is seeing a demand for content that is pro-Trump and leans toward conspiracy theories,” said AJ Bauer, an assistant professor of journalism at the University of Alabama who studies conservative media. 

In addition to national websites known for misinformation, new “hyperpartisan” local news organizations could also be a factor, Bauer said, with claims acting as fuel for larger national conspiracy theories. 

“These outlets could look for examples of voter fraud or intimidation at a very local level, even if they are not real,” Bauer said.

What is at stake

It is not just voters who can affect the next election, but also small influence groups made up of state legislators, election officials and election workers motivated by misinformation.

“Election denialism and misinformation coming from the far right was clearly on display at the federal level” with the 2020 election, explained Christina Baal-Owens, executive director of Public Wise, a nonpartisan voting rights organization that tracks local election officials who have questioned the legitimacy of the 2020 election. “What was less clear was a threat hiding in plain sight, a movement working at the local level.”

[Peter Navarro, former Trump advisor, sentenced to 4 months in prison for hindering the investigation into the assault on the Capitol]

Public Wise has counted more than 200 people who attended, financed or organized the January 6 insurrection attempt and went on to win office in 2022. In Arizona alone, more than half of voters are represented by state legislators who openly deny the validity of the 2020 election.

“We are looking at a well-organized movement that is working to influence elections across the country,” Baal-Owens explained.

“They have the ability to determine how people vote, how votes are counted and whether they are certified or not.” 

Trump supporters in Manchester, New Hampshire, on January 20, 2024. (Getty Images)

The attack on the Capitol was the most visible example of political extremism spilling over into real-world violence.

But 2020 was also marked by violence, or the threat of it, at state capitols and protests against coronavirus shutdown measures, a trend that experts fear will continue. 

“We are monitoring voter 'vigilantism',” said Joan Donovan, an assistant professor of Journalism and Emerging Media Studies at Boston University, who studies political violence.

“People organizing on Telegram channels and showing up at the polls with weapons” in states that allow it, Donovan said, was a tactic of activists who said they were fighting voter fraud during the 2020 presidential election and the midterm elections.

“I think that's going to be the next wave,” Donovan said. 

Old lies, new technology 

Over the weekend, far-right political activist and Trump ally Laura Loomer planted a conspiracy theory about the Iowa caucus recount. Loomer's complicated claim about corruption mirrored previous unfounded rumors floating around in 2020. 

The falsehoods may remain the same for now, but the technology used to create the propaganda has improved.

Advances in artificial intelligence, from chatbots to audio and video generators, have made easy-to-use media manipulation tools available to the public.

A survey by the World Economic Forum identified disinformation and the use of artificial intelligence for this purpose as the main global risk in the next two years, ahead of climate change and war. 

Scammers have had success with so-called deepfakes, especially in making artificial intelligence-generated videos of celebrities promoting products such as health supplements or cryptocurrencies.

Although campaigns are beginning to use artificial intelligence in ads and states are rushing to legislate around it, the much-hyped threat of this technology to elections has not yet materialized.

More often, cheap artificial intelligence is being used to create propaganda, especially by Trump loyalists.

Content from self-described “meme teams,” who act as volunteers, according to the Trump campaign, is already being shared by Trump on his social media platform, Truth Social.

These memes defame other candidates and their spouses, lawyers and judges involved in Trump's trials, journalists, and politicians and state election officials considered enemies of the Trump camp.

“Granted it's bungled and not credible in any way, shape or form, but it's just a matter of time until something works,” said Ben Decker, CEO of Memetica, a digital research company.

“The misinformation narratives, the meme wars, are back.

That content is going to clutter certain parts of the public forums.” 

The effect on the world at large is clear, Decker said: “Harassment of public officials, members of the media and civil society groups is going to proliferate.”

According to Laura Edelson, an associate professor at Northeastern University and co-director of Cybersecurity for Democracy, which studies political misinformation, a larger potential threat lies in generative artificial intelligence tools' ability to personalize misinformation, making it harder for social media platforms to moderate because it looks authentic.

[In Florida they want to ban social media for children, and in New York they call it a “toxin”]

“It's going to be a lot harder this cycle as people are laundering misinformation through generative AI tools,” Edelson said.

“Disinformation will be more effective in isolated communities and harder to detect. Platforms must create new tools.”

Instead, Edelson and others say the platforms are cutting moderation teams to the bone.

Since 2021, the largest social media companies have deprioritized efforts to prevent falsehoods from going viral, according to critics. 

Among them is Elon Musk's social network X, said Rose Lang-Maso, campaign director for Free Press, a digital civil rights organization.

“Without policies to moderate content and without enough content moderators actually doing the moderation, spreaders are more likely to escalate online and offline abuse,” Lang-Maso said.

“The platforms are really abdicating responsibility to users.”

Meta, YouTube and X have denied reports that they are ill-prepared to prevent the spread of electoral disinformation.

“Content that misleads voters about how to vote or encourages interference in the democratic process is prohibited on YouTube,” YouTube spokesperson Ivy Choi said in a statement to NBC News.

“We continue to invest heavily in the policies and systems that connect people to high-quality content, and our commitment to supporting the 2024 elections is strong.”

A Meta spokesperson declined to comment, but shared a press release about the company's plans for the 2024 election.

Who watches?

The first challenge in the fight against disinformation in the 2024 cycle could be identifying it. 

The social media space has become fragmented with the rise of alternatives such as Substack, Telegram, Threads and Rumble as viable spaces for political actors and extreme content.

And a pressure campaign by conservative activists may affect the number of trained eyes available to watch.

Republican politicians and activists responded to the wave of misinformation in 2020 by attacking the researchers, universities, tech companies and journalists who pointed it out.

Using social media campaigns, courts and congressional committees, far-right critics have leveled unfounded accusations that efforts to curb misinformation around the election and the pandemic were part of a plot to censor conservatives.

Some researchers said such partisan campaigns, which have included onerous requests for information and threats of reputational damage and legal action against institutions, have had a chilling effect on new research ahead of 2024.

The lack of transparency from social media companies poses an additional challenge for observers.

So-called black boxes surround the algorithms that serve content, and the inability to see what is happening on the platforms in real time has only gotten worse.

“We are flying blind,” said Mike Caulfield, a research scientist at the Center for an Informed Public at the University of Washington who studies election rumors.

According to Caulfield, the lag in detecting false narratives early gives misinformation a head start and could slow journalists' fact-checking and contextualizing efforts.

Risks to national security, safety and voting rights aside, the biggest threat from the next wave of misinformation could lie in widening partisan divisions and weakening public trust.

“The direct effect of misinformation may not be as high as we think,” said Joshua Tucker, co-director of the Center for Social Media and Politics at New York University, referring to voting preferences.

“But the indirect effect is that people lose confidence in journalism, they lose confidence that there is an objective truth out there and they believe that anything can be disinformation.”

Source: Telemundo
