
The internet is filled with ‘deepfake’ Taylor Swift porn, highlighting the dangers of AI for women

Elon Musk’s social media platform X, where the images have been viewed millions of times, said it is working to eliminate all false content. A U.S. survey shows 74% of users who have engaged with deepfake pornography did not report feeling guilty

Singer Taylor Swift upon her arrival at the Golden Globes ceremony in January. ALLISON DINNER (EFE)
Clara Angela Brascia

The internet has been filled with sexually explicit images of one of the most influential people of the moment: the singer Taylor Swift. Once again, as so often lately, these were not real photos, but nudes created with artificial intelligence, or deepfake porn. One post from a prominent account on the social media platform X accumulated over 47 million views, 24,000 reposts and hundreds of thousands of “likes” before the account was suspended for violating the company’s policies. But the images circulated for nearly a day, more than enough time to reignite concerns about the proliferation of fake AI-generated pornography that targets women, and about the challenge of stopping its spread: Swift is reportedly considering whether to sue the website that first posted the images, according to the Daily Mail.

Blocking the original account on X was not enough to settle the matter either, as other users continued to spread the images; some of these accounts remain active. The term “Taylor Swift AI” became a trend in some regions, so much so that X itself posted a reminder of its “zero-tolerance policy” towards this type of content. “Our teams are actively removing all identified images and taking appropriate actions against the accounts responsible for posting them. We are closely monitoring the situation to ensure that any further violations are addressed immediately and the content is removed. We are committed to maintaining a safe and respectful environment for all users,” the post says.

It is not the first time this has happened, both to celebrities and to anonymous women. A few months ago, a group of schoolgirls in Spain were victimized when images manipulated with an artificial intelligence application began to circulate in their hometown, in a case that sparked public interest in the problems that can arise from this technology. It also happened to the singer Rosalía, who denounced the publication of a deepfake that was shared by the singer JC Reyes. In Mexico, the dissemination of hundreds of thousands of manipulated images of students caused a massive protest at the National Polytechnic Institute.

In the case of Taylor Swift, the technology news site 404 Media suggests that the images may have originated in “a specific Telegram group dedicated to abusive images of women” and been generated with a free text-to-image AI tool offered by Microsoft. The first user to spread the deepfakes on X goes by @Zvbear and made the account private after the singer’s legion of fans, known as Swifties, mobilized against it, using hashtags to give visibility to real clips of Swift and push down the fakes.

According to some American media outlets, the man behind @Zvbear is a 27-year-old Somali citizen residing in Canada, known for posting risqué content on platforms such as 4chan, X and Reddit. The images have also been disseminated on a website expressly dedicated to nudes of famous women, which in some cases features explicit scenes from films in which they appeared. The volume of deepfake porn now circulating online has already affected celebrities such as Meghan Markle, Megan Fox and Kate Upton.

The “pornification” of women using artificial intelligence has skyrocketed in recent months. According to a recent study on deepfakes carried out in the United States by Home Security Heroes, 48% of men acknowledged having seen deepfake pornography at least once. And “74% of users who have engaged with deepfake pornography did not report feeling guilty about their consumption,” suggesting that deepfake content has become an accepted and normalized part of adult entertainment. In addition, 20% of the participants in the survey had considered learning how to create pornography with artificial intelligence, to use it either with celebrities or with people they know.
