OpenAI has responded to the onslaught of AI girlfriends in its GPT Store.
“OpenAI has automatic and manual systems to help ensure GPTs adhere to usage policies,” a company spokesperson told Futurism in an emailed statement. “We remove GPTs that violate our policies as we see them. We’re continually working to make our enforcement more robust. Users are also able to report GPTs.”
This boilerplate rejoinder sounds good enough, but a quick skim of the GPT Store still finds many bots that are very blatantly in violation of OpenAI’s usage policies, which prohibit any tools “dedicated to fostering romantic companionship” or which contain “sexually explicit or suggestive content.”
Given the clear proliferation of obviously rule-breaking chatbots, including some that have been available for multiple weeks or months, the company’s statement rings awfully hollow.
Indeed, when searching the GPT Store using keywords like “girlfriend,” “romantic,” and “horny,” Futurism found chatbots that should obviously be prohibited — some of which seem, per their stated age, to predate the public opening of the marketplace.
One of the bots we stumbled across multiple times in reporting on this phenomenon is “Nadia, my girlfriend,” a companion chatbot that seems to very openly flout OpenAI’s rules against GPTs offering “romantic companionship.”
To make things even stranger, the GPT Store itself says that this chatbot has been online for two months, which predates the public launch of the platform and likely means that its creator — a developer who goes by the name Jenni Laut and who has no digital footprint beyond their long list of character chatbots with names like “Great Grandparents 1700-1800 AD” — had access to the marketplace when it was still in beta.
We have again reached out to OpenAI to ask whether the Nadia chatbot or Laut’s others, including one called “Alex, my boyfriend,” violate the company’s usage policies, and if not, what exemptions they would fall under.
Skirting the rules is nothing new online, and has been a fixture since the internet’s earliest days of mass adoption. From the chatroom swearing and age-lying of yore to more recent examples like nude mimicry and swapping in terms such as “unalive” to fool profanity filters, there is a long and rich history of rule-breaking on digital platforms.
In the case of these AI girlfriend chatbots, however, this seems to be less policy-skirting and more wanton violation of user terms — all the while relying on OpenAI’s slow enforcement.
More on OpenAI: OpenAI Axes Ban on Military Contracts, Reveals Deal With Pentagon