X (formerly Twitter) has limited searches for Taylor Swift for all users following the circulation of sexually explicit AI-generated images of the singer this week. The platform temporarily suspended searches for “Taylor Swift” on Saturday to prevent further dissemination of the images. In a statement posted Friday, X’s @Safety account told users of its zero-tolerance policy and proactive measures to remove non-consensual nudity, although Swift was not mentioned by name. The team stated, “We are closely monitoring the situation to promptly address any additional violations and remove the content.”
On X, the posting of non-consensual nudity (NCN) images is strictly prohibited, and we maintain a zero-tolerance policy towards such content. Our teams are actively removing identified images and taking necessary actions against the accounts responsible. We stand united…
— Safety (@Safety) January 26, 2024
The White House and various state officials have taken note of the disturbing images. White House press secretary Karine Jean-Pierre described the explicit photos in circulation as “alarming” and pledged to take action to address the issue. While social media platforms make their own independent decisions about content moderation, she said, the administration believes they have a crucial role to play in enforcing their own rules against the spread of misinformation and non-consensual intimate imagery.
SAG-AFTRA, the performers’ union to which Swift belongs, condemned the sexually explicit, AI-generated images of the singer, emphasizing the distress and danger they pose, and backed legislation such as the Preventing Deepfakes of Intimate Images Act to combat such imagery. In a statement, the union said, “The creation and dissemination of fake images, especially of a sexual nature, without consent should be criminalized. We stand in solidarity with Taylor and all women who have suffered privacy violations in this manner.”