Social media sites have been awash with AI-generated Taylor Swift porn this week — and it’s hard to tell whether there will be any real consequences for those who posted the images or allowed them to proliferate.
Depending on which social network you’re on, searching for “Taylor Swift AI” right now could produce either the aforementioned fake porn or, in the case of X-formerly-Twitter, a rush of posts supporting the superstar songstress, which was part of a successful viral effort to drown out the much lewder images that began cropping up en masse earlier in the week.
If you’d searched the same term half a day earlier, however, chances are that you’d have been presented almost entirely with an onslaught of deeply pornographic fake Swift nudes. Clearly made using AI image generators, those images initially featured the singer in various suggestive poses and states of undress at Kansas City Chiefs games, in a salacious nod to her romance with team tight end Travis Kelce.
Soon after, however, substantially grosser images began circulating as well, including ones featuring the star — and don’t say we didn’t warn you — engaged in sex acts with “Sesame Street” characters.
As 404 Media reports, the images appear to have begun circulating on a Telegram channel whose sole, disgusting purpose is to create nonconsensual sexualized images of women using AI, including the ones that ended up on social media. When the offending images began going viral on Twitter, members of the channel joked that the attention would likely “get this shit shut down.”
Though it’s unclear whether the Telegram channel did indeed get shut down, the Daily Mail — of all places — reported that some of the accounts that shared the fake nudes were taken down after its reporters alerted contacts at X, Facebook, Instagram, and Reddit to their existence. But as The Verge pointed out in its own reporting, one of the viral posts on X was up for at least 17 hours before its removal — an egregious delay that would have been hard to imagine before Elon Musk acquired the network and slashed its content moderation resources.
With AI laws lagging far behind the technology and its more nefarious uses, the onus of removing this kind of content obviously lies with social networks. But there are other pressing questions. Should companies like OpenAI, Microsoft, and Midjourney be responsible for the dissemination of disgusting and abusive content created using their systems? What about the app stores that offer AI software?
As Carrie Goldberg, a victim advocacy attorney specializing in digital abuses, noted in her own post, “seller negligence” on the part of Apple’s App Store and the Google Play store could be in play because they host the “malicious products that create the images.”
For now, though, the law remains largely untested — and unwritten, as lawmakers grapple with the rapidly changing tech in real time.
With laws about nonconsensual AI imagery currently being debated in legislatures all over the world, taking down those responsible for this sort of repulsive abuse will require a multi-pronged plan of attack — and if anyone has the litigious power and capital to make actual enforcement happen, it’s Swift.
“Hard to imagine anything more terrifying for the generative AI companies already staring down multiple existential-level lawsuits than the Taylor Swift AI porn circulating online right now,” quipped former LA Times columnist Brian Merchant. “A year of AI companies talking about chatbots and image generators like they were the coming of SkyNet only to have them all crushed like a bug by Taylor Swift’s legal team would really be the funniest timeline.”
More on AI rule-breaking: OpenAI Failing to Destroy Contraband AI Girlfriends Flooding GPT Store