- By Vikas Yadav
- Sun, 28 Jan 2024 01:25 PM (IST)
- Source:JND
Taylor Swift X Photos: Soon after explicit AI-generated photos of celebrity Taylor Swift went viral on the internet, Elon Musk's X (formerly Twitter) made headlines for taking about 17 hours to remove them from the platform. Following the incident, X appears to have blocked searches related to Taylor Swift on its app. According to The Verge, the move was likely intended to restrict the reach of the AI-generated images of Swift.
Currently, if an X user searches for 'Taylor Swift' or 'Taylor Swift AI', a message reading 'Something went wrong' is displayed. However, adding quotation marks (") or rearranging the keywords, for example searching for 'Taylor AI Swift', returns results. While some pictures are displayed in the 'Media' tab, no explicit images are visible, the report noted.
X recently posted a statement announcing the removal of explicit content targeting the artist and action against those who posted it on the platform. "Posting Non-Consensual Nudity (NCN) images is strictly prohibited on X and we have a zero-tolerance policy towards such content. Our teams are actively removing all identified images and taking appropriate actions against the accounts responsible for posting them," @Safety said.
Like X, Instagram and Threads have implemented similar measures and do not return results when a user enters 'Taylor Swift AI' in the search bar. "The term that you searched for is sometimes associated with activities of dangerous organisations and individuals," the message on Meta's platforms reads.
According to reports, the international star is considering legal action against the online platforms that hosted the pictures. As per 404 Media, the group that reportedly leaked the images likely preferred Microsoft Designer to generate them. On a related note, X is also in the process of hiring 100 content moderators for its office in Austin to combat child abuse content, according to Reuters.
"The team is currently being built," said Joe Benarroch, head of business operations at X, adding that the aim is to fill the positions by the end of this year. The company noted that the office will also help combat other forms of harmful content.