
False image of Pentagon explosion briefly goes viral

2023-05-22T19:59:18.645Z

Highlights: A fake image showing an explosion at the Pentagon briefly went viral on Twitter on Monday. The image was apparently made with a generative AI program, software capable of producing text and images from a simple query in everyday language. It caused markets to dip slightly for a few minutes, with the S&P 500 losing 0.29% from Friday's close before recovering. An account linked to the QAnon conspiracy movement was among the first to relay the fake image, whose source is not known.


The fake photograph was allegedly made with a generative artificial intelligence program.


A fake image showing an explosion at the Pentagon briefly went viral on Twitter on Monday, causing a slight market slump for about ten minutes and reigniting the debate over the risks associated with artificial intelligence (AI).

The fake photograph, apparently made with a generative AI program (capable of producing text and images from a simple query in everyday language), forced the US Department of Defense to react.

Source unknown

"We can confirm that this is false information and that the Pentagon was not attacked today," a spokesman said.

Firefighters in the area where the building is located (Arlington, near Washington) also posted on Twitter that no explosion or incident had taken place at or near the Pentagon. The image appears to have caused markets to dip slightly for a few minutes, with the S&P 500 losing 0.29% from Friday's close before recovering.

"There was a drop related to this false information when the machines detected it," said Pat O'Hare of Briefing.com, referring to automated trading software programmed to react to social media posts. "But the fact that the drop remained measured relative to the content of this false information suggests that others also found it dubious," he told AFP. An account linked to the QAnon conspiracy movement was among the first to relay the fake image, whose source is not known.


The incident comes after several fake photographs produced with generative AI were widely shared to demonstrate the capabilities of the technology, such as images of the arrest of former US President Donald Trump or of the Pope in a down jacket. Software like DALL-E 2, Midjourney and Stable Diffusion allows hobbyists to create convincing fake images without needing to master editing software like Photoshop.

But while generative AI makes it easier to create fake content, the problem of its dissemination and virality - the most dangerous components of disinformation - falls to the platforms, experts regularly point out. "Users are using these tools to generate content more efficiently than before (...) but they still spread via social media," Sam Altman, the head of OpenAI (DALL-E, ChatGPT), said at a congressional hearing in mid-May.

Source: lefigaro
