
What is 'deepfake' pornography? Women are appearing in explicit videos without their consent, with few laws to protect them

2023-02-15T21:41:15.912Z


Digitally manipulated pornography using the images of Twitch stars without their consent has risen sharply. Only New York, Virginia, Georgia, and California have laws that specifically address this type of content.


By Bianca Britton, NBC News

AI-generated pornography featuring the faces of non-consenting women is increasingly widespread on the internet, and the problem is spilling over into the world of influencers and streamers.

In January, British streamer Sweet Anita, who has 1.9 million followers on Twitch, where she posts videos of her gameplay and interacts with her followers, was notified that a barrage of fake, sexually explicit videos featuring the faces of Twitch streamers was circulating online.

The first thing she thought was, "Wait, am I in this?"

She quickly googled her name along with the term deepfake, a word used to describe a highly realistic but fake, digitally manipulated video or image, a technique increasingly used, often without consent, for pornographic purposes.

Anita's initial search turned up several videos in which her face was edited onto another person's body.


“Obviously this has been going on for quite some time without my knowledge; I had no idea. For all I know, it could have been years,” said Anita, 32, who did not want to share her full name with NBC News, Noticias Telemundo's sister network, out of concern for her security and privacy offline.

Hany Farid, a professor of computer science at the University of California, Berkeley, says that deepfakes are a phenomenon that is "absolutely getting worse" as it becomes easier and easier to produce sophisticated, realistic videos through automated applications and websites.


The number of manipulated pornographic videos available online has seen a sharp rise, nearly doubling every year since 2018, according to research by livestreaming content analyst Genevieve Oh.

In 2018, only 1,897 videos had been uploaded to a well-known deepfake streaming site, but by 2022 that number had risen to more than 13,000, with over 16 million monthly views.

Celebrities used to be the main targets of deepfakes.

“Now suddenly the vulnerable people are the ones with very small footprints on the internet,” Farid says.

“The technology is getting so good that it can generate images from relatively small training footprints, not those hours and hours of video that we used to need.”


Anyone interested in creating deepfakes can quickly access a plethora of free and paid face-swapping apps in the Google Play and Apple App stores, which make it easy to upload a photo and edit it into an image or video in a matter of seconds.

Some major platforms, such as Reddit, Facebook, TikTok, and Twitter, have attempted to address the spread of deepfake porn with changes to their policies.

Although each of the platforms specifically prohibits the material, some have had trouble moderating it.

A Twitter search, for example, found doctored pornographic videos claiming to feature Twitch stars, along with hashtags promoting this content.  

In January, the proliferation of deepfake pornography caused an uproar online when a popular Twitch streamer with more than 300,000 followers admitted to paying for explicit material featuring AI-generated versions of his peers.

The number of deepfake porn videos available online has risen sharply, nearly doubling every year since 2018, according to research by livestreaming analyst Genevieve Oh. Justine Goode / NBC News / Getty Images

On January 30, in a tearful apology video that was re-shared on Twitter and garnered millions of views, Brandon Ewing, who goes by the Twitch username Atrioc, said he had clicked on an ad for deepfake pornography while browsing a popular porn website.

He said he then went on to subscribe to and pay for content on a different website featuring other female streamers out of "morbid curiosity."


In a longer statement posted to Twitter on February 1, Ewing addressed Twitch streamers Maya Higa and Pokimane, whose images briefly appeared in a tab of a website hosting manipulated pornography during one of his livestreams.

"Your names were dragged into it and you were sexualized against your will," he said.

"I am sorry that my actions have led to further exploitation of you and your body, and I am sorry that your experience is not uncommon."  

Ewing did not respond to a request for comment.

Pokimane also did not respond to a request for comment, but wrote in a January 31 tweet: “Stop sexualizing people without their consent. That's it, that's the tweet.”

Higa said she had no further comment beyond her January 31 Twitter statement, in which she wrote, in part, that the “situation makes me feel disgusting, vulnerable, nauseous and violated, and all of these feelings are far too familiar.”


The incident highlighted the growing prevalence of AI-generated non-consensual pornography and the ethical issues it creates.

There has been an "uptick" in websites "being willing, eager and monetizing hosting this stuff," Farid said.

QTCinderella, another Twitch streamer who learned that she had appeared in deepfakes online, said she found it especially hurtful because Ewing is a close friend.

“I think that's the most unfortunate thing: I didn't find out from Atrioc. I found out about it because it was being talked about on the internet,” said QTCinderella, 28, who also did not share her full name with NBC News to protect her privacy and security offline.

She said she quickly located the video content on a subscription website account and issued a takedown notice, but the videos continue to spread like "wildfire."

In the United States, while most states have laws prohibiting revenge pornography, only New York, Virginia, Georgia and California have laws specifically addressing deepfakes, according to the Cyber Civil Rights Initiative.

The United Kingdom, for its part, announced in November of last year that it planned to criminalize sharing this kind of explicit content without consent.

QTCinderella said the current legal framework is "disheartening".

“Every lawyer I've talked to has essentially concluded that we don't have a case. There is no way to sue the guy,” she said.

While much of the deepfake porn can appear amateurish and low-quality, Farid said he's now also seeing accounts that offer to create sophisticated custom manipulations of any woman for a small fee.

After seeing doctored videos of herself being sold on the internet, Anita said she felt numb, tired and dissociated.

“I'm being sold against my will. I did not consent to being sexualized,” she said.

QTCinderella said she experienced "body dysmorphia."

“When you see a porn star's body grafted so perfectly where yours should be, it's the most obvious comparison game you can have in your life. I cried and thought, 'My body will never look like that,'” she recounted.


Sophie Compton, who campaigns against intimate-image abuse with the organization My Image, My Choice, said women who are subjected to such abuse are "shamed or silenced" and feel that their experience is downplayed because there are few legal options available to those affected by deepfakes.

“We have to find a way to make these sites and their business model impossible,” Compton said.

Specific platforms hosting non-consensual sexual images should be held to account, rather than individual accounts and creators, according to Farid.

“If you really want to address this problem, you have to go upstream. That's where all the power is,” he noted.

Anita said she wants there to be "very visible consequences."

What worries her most about the future is that it is impossible to know who bought the fake videos.

"When I go to a meet with fans I might end up hugging and signing something for someone who's seen me get

deepfaked

... and I'd have no way of knowing they're consuming that," he said.

"To have my body bought against my will is very, very horrible," she lamented.

Source: telemundo
