The Limited Times


“You are sick”: fake porn images of Taylor Swift cause general outrage

2024-01-26T19:58:19.776Z

Highlights: One of these images has been viewed more than 47 million times on the social network. X said it had “a zero-tolerance policy” on the non-consensual publication of nude images. “What happened to Taylor Swift is not new; women have been the target of fake images without their consent for years,” noted Democratic Representative Yvette Clarke, who has backed legislation to fight the phenomenon. A 2019 study estimated that 96% of deepfake videos were pornographic in nature.



The American political class and Taylor Swift's fans voiced their outrage on Friday after fake pornographic images of the singer, created with generative AI, circulated widely in recent days on X (formerly Twitter) and other platforms.

One of these images has been viewed more than 47 million times on the social network.

According to American media, the image remained on X for more than 17 hours before being deleted.

Fake pornographic images (“deepfakes”) of famous women, which also target many anonymous people, are nothing new.

But the development of generative artificial intelligence (AI) programs risks producing an uncontrollable flow of degrading content, according to many activists and regulators.

Read also: Taylor Swift, the American pop star, named 2023 Person of the Year by Time magazine

The fact that such images this time target Taylor Swift, the world's second most-listened-to artist on Spotify, could however help raise the authorities' awareness of the problem, given the outrage of her millions of fans.

“The only positive thing about this happening to Taylor Swift is that she is influential enough to get a law passed to eliminate this. You are sick,” Danisha Carter, an influencer with an audience of several hundred thousand people on social networks, posted on X.

“What happened to Taylor Swift is nothing new”

X is known for having less strict rules on nudity than Instagram or Facebook.

Apple and Google can control the content circulating on apps through the rules they impose on their mobile operating systems, but they have tolerated this situation on X until now.

In a press release, X said it had “a zero-tolerance policy” on the non-consensual publication of nude images.

The platform said it was “actively removing all identified images” of the singer and “taking appropriate action against the accounts that posted them”.

Representatives of the American singer have not yet commented.

“What happened to Taylor Swift is not new; women have been the target of fake images without their consent for years,” noted Democratic Representative Yvette Clarke, who has backed legislation to fight the phenomenon.

“With advances in AI, creating these images is easier and cheaper.”


A 2019 study estimated that 96% of deepfake videos were pornographic in nature.

According to Wired magazine, 113,000 of these videos were uploaded to major porn sites during the first nine months of 2023.

Source: leparis

