Credit: screenshot | YouTube | Taylor Swift
Searches for Taylor Swift were blocked on X (formerly Twitter) for several days after fake pornographic images of the pop star, generated by artificial intelligence (AI), were discovered on the platform.
A “deepfake” — an AI-created fake image — of the American superstar was viewed 47 million times on X before being deleted. According to US media, the post remained active on the platform for approximately 17 hours.
Criticism from the artist’s fans and numerous public figures prompted the platform to block all searches related to Taylor Swift for several days.
The administration of President Joe Biden even said it was “alarmed” by the case. “Unfortunately, we know too often that lax enforcement has a disproportionate impact on women and girls, who are the main targets of online harassment,” lamented White House spokeswoman Karine Jean-Pierre.
Fake pornographic images of celebrities are nothing new, but activists and regulators fear AI could create an uncontrollable flood of problematic content. And the fact that such images this time target Taylor Swift — the most-listened-to artist in the world on Spotify in 2023 — could push industry players to act more firmly.
According to analysts, X is one of the largest platforms for pornographic content in the world, with rules much more flexible than those of Facebook or Instagram, owned by the Meta group.
Last week, however, X pointed to its “strict” ban on “non-consensual nude images,” asserting that it applies a “zero tolerance” policy to this type of content.