Replika has long garnered a reputation for allowing users to create AI partners, fulfilling their needs not only on an emotional level, but a sexual one, too.
The AI chatbot company has a user base in the millions, offering a band-aid fix for a loneliness crisis that deepened during the COVID-19 pandemic.
And it shouldn’t come as a surprise that its founder and CEO Eugenia Kuyda is convinced that her company’s AI chatbots could be a powerful tool to build new friendships or even provide emotional support.
In a recent interview with The Verge, Kuyda pointed out that some users may go as far as marrying their AI companion, a process that presumably forgoes the usual exchange of rings or real-world celebration. When asked if we should be okay with that, the CEO had an interesting answer.
“I think it’s alright as long as it’s making you happier in the long run,” Kuyda told The Verge. “As long as your emotional well-being is improving, you are less lonely, you are happier, you feel more connected to other people, then yes, it’s okay.”
“For most people, they understand that it’s not a real person,” she added. “It’s not a real being. For a lot of people, it’s just a fantasy they play out for some time and then it’s over.”
Replika has already been embroiled in a number of controversies, from horny AI chatbots sexually harassing human users, to men creating AI girlfriends and verbally abusing them.
In early 2023, the company disabled its companions’ ability to respond to sexual cues, leading to widespread outrage. Just over a month later, Kuyda confirmed that Replika had capitulated and was rolling back to a previous software update, effectively reinstating the ability to have sexually charged conversations.
The incident highlighted just how attached the company’s users were to their virtual companions. It’s a crystal clear dystopian glimpse at what relationships could look like — or already look like — in the age of AI.
In other words, there are plenty of users who don’t fully “understand that it’s not a real person.” If they do, they’re not internalizing it.
To Kuyda, however, the app serves mostly as a “stepping stone.”
In her interview with The Verge, she recalled a user who went through a “pretty hard divorce,” only to find a new “romantic AI companion” on Replika. The chatbot eventually inspired him to get a human girlfriend.
“Replika is a relationship that you can have to then get to a real relationship, whether it’s because you’re going through a hard time,” she told the publication, “like in this case, through a very complicated divorce, or you just need a little help to get out of your bubble or need to accept yourself and put yourself out there.”
Whether that experience is representative of the app's broader user base is unclear at best. And it's not just men — women are increasingly looking for connection by starting relationships with chatbots, as Axios reports.
But are chatbots a healthy, effective answer to feelings of rejection and garden-variety loneliness — a truly dangerous epidemic in and of itself — or are they simply treating the symptoms without providing a cure?
For now, the science remains divided. Stanford University researchers, for instance, found that many Replika users claimed their chatbot had deterred them from suicide.
On the other hand, experts argue that an intimate, long-term relationship with an AI chatbot could further alienate users from the real thing, compounding mental health struggles and difficulties in connecting with others.
In short, by marrying our AI chatbot companions, we may end up even lonelier than we were to begin with. Besides, Replika is a private company run by people intending to maximize profits — there’s no guarantee your virtual spouse will be around forever.
And Kuyda seems to be well aware of the risks of having her company’s user base get too attached.
Kuyda told The Verge that Replika is “moving further away from even talking about romance when talking about our app,” claiming that “we’re definitely not building romance-based chatbots.”
But given the many stories we’ve heard from the company’s users, the reality looks quite different — a strange contrast between the company’s stated motives and the services it actually provides.