The Limited Times


General outrage after the distribution of fake pornographic images of Taylor Swift

2024-01-26T20:27:39.158Z

Highlights: Fake pornographic images of Taylor Swift were widely shared on the social network X. One of the images was viewed more than 47 million times on the platform. X said it has “a zero tolerance policy” on the non-consensual publication of nude images, declaring that it was “concretely removing all identified images” of the singer and “taking appropriate measures against the accounts that posted them.” Representatives of the American singer have not yet commented.


Created using generative artificial intelligence, one of the images was viewed more than 47 million times on the social network X before being deleted.


The American political class and Taylor Swift's fans expressed their indignation on Friday after fake pornographic images of the singer, created using generative AI, were widely shared in recent days on X (formerly Twitter) and other platforms.

One of these images has been viewed more than 47 million times on the social network.

According to American media, the image remained on X for more than 17 hours before being deleted.

Fake pornographic images (“deepfakes”) of famous women, but also of many anonymous people, are nothing new.

But the development of generative artificial intelligence (AI) programs risks producing an uncontrollable flow of degrading content, according to many activists and regulators.

Raising the authorities' awareness of the problem

The fact that the images this time target Taylor Swift, the world's second most-listened-to artist on Spotify, could nevertheless help raise the authorities' awareness of the problem, given the indignation of her millions of fans.

“The only 'good thing' about this happening to Taylor Swift is that she is influential enough for a law to be passed to eliminate this. You are sick,” Danisha Carter, an influencer with an audience of several hundred thousand people on social networks, posted on X.

X is known for having less strict rules on nudity than Instagram or Facebook.

Apple and Google can exert some control over the content circulating on apps through the rules they impose on their mobile operating systems, but they have so far tolerated the situation on X.

In a press release, X said it has “a zero tolerance policy” on the non-consensual publication of nude images. The platform declared that it was “concretely removing all identified images” of the singer and “taking appropriate measures against the accounts that posted them.”

Representatives of the American singer have not yet commented.

“What happened to Taylor Swift is not new; women have been the target of fake images without their consent for years,” recalled Democratic Representative Yvette Clarke, who has backed legislation to combat the phenomenon. “With advances in AI, creating these images is simpler and cheaper.”

A study carried out in 2019 estimated that 96% of deepfake videos were pornographic in nature.

According to Wired magazine, 113,000 of these videos were uploaded to major porn sites during the first nine months of 2023.

Source: lefigaro
