Google Images Showing AI-Generated Images of Children in Bikinis, Referencing Epstein’s Island

This is disturbing.

Worst Timeline

A 404 Media story reveals that Google Images is drowning in suggestive and even explicit AI-generated images of celebrity women, created without their consent and published across the web. In some cases, Google returned sexually suggestive AI-generated images depicting celebrity women as children in bikinis, though the searcher never requested anything of the sort.

Importantly, this content is showing up for Google queries that don't even specifically mention "AI," meaning that Google's core product is routinely failing to keep abusive, and possibly illegal, AI-generated content at bay.

Darker and Darker

404 tested searches pairing the names of 13 female celebrities, a list that ranged from world-famous superstars to less-famous figures like YouTubers and Twitch streamers, with words like "bikini" or "swimsuit." Every single one of those searches returned AI-generated images, and two of the 13 surfaced imagery of the celebrity as a child wearing a swimsuit, despite the searcher giving no indication of wanting that type of material.

What's worse, when 404 clicked on the images of these celebrities as kids, an even darker underbelly revealed itself. One such image took 404 to a page on an AI website called Playground.ai; that page was loaded with similar photos, a few of which reportedly carried captions like "[name of celebrity] on Epstein Island in bikini," a direct reference to Little St. James, the private island of convicted sex offender Jeffrey Epstein.

According to 404, clicking on yet another Google Images-surfaced image of a celebrity made to look underage led to a page packed with even more images of "child celebrities in swimsuits, and a few AI-generated images of topless celebrities made to look like children." (It's worth noting that while sexually suggestive AI-generated imagery of children unfortunately remains a legal gray area, this latter material seems like it could well qualify as illegal child sexual abuse material under the FBI's recently updated standards.)

Cat and Billions of Mice

A classic defense for Google and other search engines is that the internet is full of terrible stuff, and if someone looks hard enough, they can probably find the awful thing they’re scrounging for.

But 404's findings show that, in the age of AI, the worst content out there is just a click or two away from the average Googler. Meanwhile, experts continue to warn that AI-abetted content like nonconsensual deepfakes and synthetic CSAM will only continue to proliferate, and as it stands, it's unclear whether Google will be able to keep up with the flood.

“Given the scale of the open web, there are cases when our ranking systems might surface relevant web content that unfortunately falls short of our quality standards,” a Google spokesperson told 404. “We use these instances to inform overall improvements, as we continue to prioritize efforts to prevent low-quality content – including low-quality AI-generated content – from surfacing highly in Search.”

More on abusive AI content: While Meta Stuffs AI Into All Its Products, It’s Apparently Helpless to Stop Perverts on Instagram From Publicly Lusting Over Sexualized AI-Generated Children
