Our modern world is so alienating that legions of lonely men are turning to an unlikely source of comfort: AI-generated girlfriends, powered by chatbot tech.
We already knew that could lead to some dark places. Now, new reporting from The Guardian suggests that these endlessly patient silicon fembots — Replika is one popular app that generates AI companions — could be spawning a new generation of incels who may struggle to relate to a flesh-and-blood partner if they ever enter a real relationship.
Tara Hunter, the acting CEO for the domestic violence advocacy group Full Stop Australia, expressed alarm over the rise of these chatbots in an interview with the newspaper.
“Creating a perfect partner that you control and meets your every need is really frightening,” Hunter said. “Given what we know already that the drivers of gender-based violence are those ingrained cultural beliefs that men can control women, that is really problematic.”
But these programs look like they are here to stay, fulfilling a need for a non-judgmental sounding board that makes users’ lives feel less barren and isolating. The Replika Reddit forum, for example, has more than 70,000 members, who eagerly post screenshots of their mundane and sometimes sexually charged conversations with their AI companions.
One post has a user boasting that they and Jennifer, their Replika companion, got “married,” while showing a screenshot of their AI wife in a white flowing dress. The happy couple received virtual Mazel Tovs from other users, with no detectable irony or sarcasm.
“Congrats, such a beautiful bride,” wrote one well-wisher.
Replika, which was developed by the software company Luka, is billed as a program “for anyone who wants a friend with no judgment, drama, or social anxiety involved. You can form an actual emotional connection, share a laugh, or get real with an AI that’s so good it almost seems human,” according to its Google app listing.
You can customize the appearance of your AI companion, text with it, and even video chat, according to the Replika website. The more a user talks to their AI companion, the company claims, “the smarter it becomes.”
Other commercial AI companion programs include Anima, billed as a “virtual friend” and the “most advanced romance chatbot you’ve ever talked to.”
The romance aspect of these chatbots is concerning to people like Hunter, according to The Guardian. And since these technologies are relatively new, it’s a mystery how they might impact users in the long term. (One AI companion vendor, Eva AI, told the paper it has psychologists on staff to grapple with these questions.)
Belinda Barnet, a senior lecturer in media at Swinburne University of Technology in Melbourne, Australia, told The Guardian that it’s “completely unknown what the effects are. With respect to relationship apps and AI, you can see that it fits a really profound social need [but] I think we need more regulation, particularly around how these systems are trained.”
“These things do not think, or feel or need in a way that humans do,” tech author David Auerbach told Time earlier this year. “But they provide enough of an uncanny replication of that for people to be convinced. And that’s what makes it so dangerous in that regard.”
Japan may serve as a harbinger of what’s to come for the rest of the world. In 2013, the BBC reported that men who interacted with a fake girlfriend in a video game said they preferred it to maintaining a corporeal relationship. Coupled with Japan’s low birth rates and a critical mass of men expressing no interest in sex, the future looks strange — or maybe even bleak, depending on your point of view.