Why Taylor Swift Is Currently Unsearchable on X


After inappropriate AI-generated images of Taylor Swift began circulating, prompting people to rally behind the singer, the 34-year-old is currently unsearchable on the social media platform X (formerly Twitter).

Now, anyone attempting to search the terms "Taylor Swift," "Taylor Swift AI," or "AI Taylor Swift" will get an error message on X. Head of business operations Joe Benarroch explained in a statement, “This is a temporary action and done with an abundance of caution as we prioritize safety on this issue.” However, some fans have noticed that similar terms like "Taylor AI Swift" are still searchable as of writing.

The action from X comes after AI-generated images of the singer spread across the social media site, leading fans and others to call out the "disgusting" images and get the phrase "Protect Taylor Swift" trending on the platform.

"Protect Taylor Swift is trending. I'm just going to say this: anyone who is reposting the Taylor AI images are just as disgusting as the ones who made them," wrote one fan.

Another shared their opinion on the matter, writing, "It doesn’t matter how famous someone is, 'she's white billionaire, so she will be fine' doesn't give u authority to create her disgusting Taylor Swift AI pictures. Women should be respected‼️ RESPECT TAYLOR SWIFT." They also shared a video from an Eras Tour concert in which Swift stepped in to help a fan who was being bothered by a security guard.

"Creating AI-generated pictures of any individual without their consent is a violation of privacy and a lack of respect," pointed out someone else in the replies.

The matter of AI-created images was also addressed by the White House, which called the issue "alarming" and cited ways the administration plans to take action, including possible federal legislation to protect people from this sort of abuse.

The singer-songwriter has not publicly responded to the issue since the controversy began.

As of writing, only 10 states have laws addressing AI-created images, also known as "deepfakes." Of those, the laws in California and New York are the most likely to apply to Swift's case.

California passed a law allowing victims of nonconsensual deepfake images or videos to sue the creator and distributor. In New York, anyone found guilty of creating and distributing these images without consent can face a $1,000 fine and up to a year in jail, and victims there can also sue the people who created and distributed the images.

Along with Swift, other well-known celebrities, including Ariana Grande and Megan Thee Stallion, have also been targeted by inappropriate AI deepfake images.
