Microsoft has updated Designer, an artificial intelligence tool that generates images from text prompts.
The platform was allegedly used to create sexually explicit images of Taylor Swift that went viral before being blocked by the platform X. One of them garnered over 47 million views.
The app now refuses prompts containing sexually explicit terms, blocking the generation of similar images, even of non-famous people.
The site 404 Media, which broke the news of the Redmond company's update, reported that the AI-generated nude images of Taylor Swift came from the 4chan forum and from a Telegram channel where people used Designer to create AI images of celebrities.
Even before the images of Swift spread on social media, Designer blocked prompts such as "Taylor Swift naked", but users of the Telegram channel and 4chan discovered they could circumvent the protections by misspelling the name and using terms that were merely sexually suggestive.
404 Media was able to verify that these loopholes have been closed with a recent update.
“We are investigating the reports and taking appropriate action to address them,” a Microsoft spokesperson said shortly after the deepfakes were shared.
“Our code of conduct prohibits the use of our tools for the creation of adult or non-consensual intimate content, and any repeated attempts to produce content that goes against our policies may result in loss of access to the service. We have teams working on this kind of monitoring, in line with our responsible artificial intelligence principles, including content filtering and abuse detection, to create a safer environment for users,” the Redmond company concluded.
Reproduction reserved © Copyright ANSA