The Limited Times


Telegram: more than 100,000 deepfake nude pictures of women shared publicly

2020-10-23T04:30:52.234Z


Researchers report on software that turns private photos into nude pictures. It is not known who is behind the program.



AI software works on iPhones and Android models (symbolic image)

Photo: Feng Li / Getty Images

Researchers have discovered a program on the messenger service Telegram that can turn ordinary photos into nude pictures with just a few clicks.

The so-called deepfakes are created using a type of artificial intelligence.

"By the end of July 2020, 104,852 such photos had been published and distributed in several Telegram channels," the experts from the IT analysis company Sensity write in their report. 

In order to generate a nude picture, the perpetrators send photos of their victims to a program via Telegram.

After a short time, they receive the photo back, manipulated into a nude picture.

According to the Sensity researchers, the program uses Generative Adversarial Network (GAN) technology, a form of artificial intelligence.

Self-learning neural networks are behind the program.

They are trained to recognize a clothed female body in a photo, infer its anatomy, and generate a nude body in its place.

In this case, the software was programmed in such a way that it can only recognize and manipulate female bodies.
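The adversarial principle behind such software can be sketched at toy scale. The following NumPy example is purely illustrative (it is not the program from the report): a linear generator learns to imitate a one-dimensional "real" data distribution, while a logistic discriminator is trained to tell real samples from generated ones, and each network's mistakes drive the other's updates.

```python
import numpy as np

rng = np.random.default_rng(0)

# Target "real" data: samples from a Gaussian the generator must imitate.
REAL_MEAN, REAL_STD = 4.0, 1.25

def sample_real(n):
    return rng.normal(REAL_MEAN, REAL_STD, n)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Generator: a linear map G(z) = a*z + b applied to standard-normal noise.
a, b = 1.0, 0.0
# Discriminator: logistic regression D(x) = sigmoid(w*x + c).
w, c = 0.1, 0.0

lr, batch = 0.05, 64
for step in range(2000):
    # --- Discriminator update: push D(real) -> 1, D(fake) -> 0 ---
    real = sample_real(batch)
    z = rng.standard_normal(batch)
    fake = a * z + b
    d_real = sigmoid(w * real + c)
    d_fake = sigmoid(w * fake + c)
    # Gradients of the binary cross-entropy loss w.r.t. w and c.
    grad_w = np.mean(-(1 - d_real) * real + d_fake * fake)
    grad_c = np.mean(-(1 - d_real) + d_fake)
    w -= lr * grad_w
    c -= lr * grad_c

    # --- Generator update: push D(fake) -> 1 (non-saturating loss) ---
    z = rng.standard_normal(batch)
    fake = a * z + b
    d_fake = sigmoid(w * fake + c)
    # dL/dfake for the loss -log D(fake), then chain rule into a and b.
    dl_dfake = -(1 - d_fake) * w
    a -= lr * np.mean(dl_dfake * z)
    b -= lr * np.mean(dl_dfake)

samples = a * rng.standard_normal(10_000) + b
print(f"generated mean={samples.mean():.2f}, std={samples.std():.2f}")
```

After training, the generated samples should cluster around the real distribution's mean: the generator has learned to produce data the discriminator can no longer reliably reject. Real deepfake systems apply the same adversarial idea to images, with deep convolutional networks in place of these one-parameter toy models.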

According to the technology magazine "The Verge", the quality of the images varies.

While some seem obviously manipulated, others look deceptively real.

Instructions with tips for better results can be found in the Telegram channels.

The more pictures, the cheaper

In the messenger service, the program was offered as a so-called bot.

Using Telegram's bot function, developers can integrate their own small programs as additional functions in chat windows.

Users who want to use such a program need only send the bot a chat message, and it responds automatically.
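The interaction described above runs over Telegram's public Bot API, in which a bot polls for incoming messages and posts replies via HTTPS requests. The sketch below only constructs such request URLs rather than sending them, so it needs no network access; the token and chat id are hypothetical placeholders.

```python
import urllib.parse

# Hypothetical bot token for illustration only (real tokens come from @BotFather).
TOKEN = "123456:ABC-DEF"
API_BASE = "https://api.telegram.org/bot{token}/{method}"

def api_url(method: str, **params) -> str:
    """Build a Telegram Bot API request URL (constructed, not sent)."""
    url = API_BASE.format(token=TOKEN, method=method)
    if params:
        url += "?" + urllib.parse.urlencode(params)
    return url

# A bot polls for incoming chat messages with the getUpdates method ...
poll = api_url("getUpdates", offset=0, timeout=30)

# ... and replies automatically, here by echoing the text back via sendMessage.
def handle_update(update: dict) -> str:
    msg = update.get("message", {})
    return api_url("sendMessage",
                   chat_id=msg.get("chat", {}).get("id"),
                   text="received: " + msg.get("text", ""))

example = {"message": {"chat": {"id": 42}, "text": "hello"}}
print(handle_update(example))
```

`getUpdates` and `sendMessage` are standard Bot API methods; a production bot would send these requests in a loop and dispatch each update to a handler, which is what makes fully automated services like the one in the report possible inside a chat window.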

It is not known who is behind the current deepfake program.

Notably, the program is offered only in English and Russian.

According to the Sensity report, a large proportion of the images in the public channels were shared by users from countries of the former Soviet Union.

According to the report, pictures are mainly uploaded by private individuals.

The photos come from private collections or from social media accounts.

As with conventional image-editing programs, there is a free version, recognizable by a watermark on the image.

But there is also a premium version that delivers images faster and without watermarks.

The price per image is around 30 cents, and the developers of the program offer a volume discount.

Victims can be blackmailed with deepfakes

The Sensity researchers assume that the number of victims is significantly higher than the more than 100,000 shared nude pictures suggest.

This is because the generated nude pictures are not always shared in public channels and are therefore not always traceable.

Deepfake software has been circulating on the internet since 2017.

The researchers from Sensity warn that the images could be used to publicly humiliate victims or be circulated in their personal circles.

It also happens that perpetrators blackmail their victims "by threatening to publish the photos," the researchers write.

Image manipulation through deepfake programs is repeatedly debated as a future threat to democracies.

The concern is that manipulated recordings can make it much more difficult to unmask false information.

The report by the Sensity researchers shows, however, that the technology has so far been used primarily against private individuals.

The Telegram channels in which the images were distributed can no longer be found.

Sensity suspects that they were deleted by the operators.

Telegram did not respond to several inquiries from the researchers, nor to a request for comment from SPIEGEL.


Source: spiegel
