Microsoft has updated Designer, the artificial intelligence tool that generates images from text prompts and was reportedly used to create the sexually explicit images of Taylor Swift that went viral before being blocked by the X platform.

After the update, the app no longer accepts sexually explicit terms for generating such images, even of people who are not famous. According to the site 404 Media, the AI-generated nude images of Taylor Swift originated on the 4chan forum and in a Telegram channel where users employed Designer to produce AI images of celebrities.

Even before Swift's images spread on social media, Designer blocked prompts such as "Taylor Swift naked", but users were able to bypass the protections by misspelling the name and using terms that were merely sexually suggestive. These shortcomings have reportedly been fixed with the update.

"We are investigating these reports and are taking appropriate action to address them," a Microsoft spokesperson said shortly after the deepfakes were shared. "Our code of conduct prohibits the use of our tools to create intimate or non-consensual adult content, and any repeated attempts to produce content that violates our policies may result in loss of access to the service. We have teams working on exactly this kind of monitoring, in line with our responsible AI principles, including content filtering and abuse detection, to create a safer environment for users," the Redmond company concluded.

(Unioneonline/D)

© All rights reserved