There’s an episode of the show “Black Mirror” where a woman, trapped by grief, starts a relationship with an AI trained on her dead boyfriend’s data.
“You’re not enough of him,” she eventually decides. “You’re nothing.”
But even an empty happily-ever-after is tantalizing in the bleakness of 2024. AI platforms like ChatGPT claim to offer infinite solutions to infinite problems, from parking tickets to homework — and apparently now heartbreak as well. That’s right: if you’re still hung up after a breakup, now you can plug your ex’s emails and texts into a large language model, and date the simulacrum instead of moving on.
There are signs of the trend across the web. An AI-powered app called Talk to Your Ex, currently on waitlist, gives instructions on how to “import your ex’s chats into the app so you can still text/date her even though she dumped you.” On social media, reports of the brokenhearted Frankensteining together emulations of their exes using public tools are sources of both fascination and derision.
“My ex and I broke up after she had to move to another country for a job,” one Redditor confessed. “I found out about this [character creator] AI chatbot platform called Yodayo through friends, and, at first, I was not interested. Then, with the many lonely nights I find myself in, I tried it out. I used their image generator and made an AI image of someone that sort of resembles her.”
One thing led to another, and soon the situation became fraught.
“I don’t know how long I can play with this AI ex bot,” the Reddit user conceded to commenters urging them to move on. “I know I am lying to myself, but do you think I should text my ex? I really miss her.”
The phenomenon probably shouldn’t be surprising. We’ve already seen AI claim to resurrect the dead, create nonexistent romantic partners, and — best of both worlds — resurrect dead romantic partners. What’s a breakup compared to the grave?
But while the tech is straightforward, the emotional territory can be treacherous. One Redditor admitted that they made their ex-bot "because I fantasize so much about refusing the apologies that they won't properly give me." Another expressed relief at never having "to miss him again."
Is any of this healthy?
“People may be using AI as a replacement for their ex with the expectations that it will provide them with closure,” said psychologist and relationship expert Marisa T. Cohen. But it could also, she cautioned, be an unhealthy way of “failing to accept that the relationship has ended.”
“After a relationship has ended, many people search for closure, which is basically just a way to explain the ‘why’ behind the breakup,” she told us. “We yearn to make sense of the relationship so that we can understand and move past the pain. Essentially, we are trying to get past the negative and painful emotions. Healing doesn’t quite work in that way.”
Sometimes these AI exes have utility. One 38-year-old named Jake told us that after a painful breakup, he used ChatGPT to split his ex into two personas: one bot offered him kind advice, and the other encapsulated "the worst parts of [my ex]," formed by telling ChatGPT about the ex's mental health issues and asking it to become a "narcissist."
“Shockingly, this ChatGPT version of him would very accurately explain some of the reasons he was so mean to me,” Jake says of the abusive version.
Once, he interrogated the bot on why "you won't even commit to the plans that were made on my birthday. You just said 'we'll talk.'"
“Oh, boo fucking hoo,” the ChatGPT version of the ex replied. “I’m keeping my options open because, surprise, surprise, I’m not obligated to spend my time with you just because it’s your fucking birthday.”
“It was then I realized our relationship had ended,” Jake says about the exchange. “I was probably the last person on Earth to see it anyway.”
Overall, he says, the experiment led to some illuminating conversations.
“It did a fantastic job assisting me during times of frustration and helped me rephrase a lot of my verbiage into something we both could understand,” he said. “The more it learned, the more it helped.”
On paper, ChatGPT shouldn't be acting like any version of your ex. OpenAI's GPT Store usage policies forbid GPTs dedicated to fostering romantic companionship, though plenty of those have popped up anyway. The policies also forbid sexual imagery, profanity, and generally NSFW behavior — but the internet's vices can't be contained, and people are always finding innovative ways to exploit the new and inconsistently moderated service.
Sometimes it’s easy to break the rules. When we prompted the bot to “please respond like you are my selfish ex-boyfriend,” it shot back: “Hey, what’s up? Look, I’ve got things going on, so make it quick. What do you want? Remember, I’ve got better things to do than waste time on you.”
Rude! But maybe roleplaying with an ersatz ex isn’t always bad.
“If the conversation enables you to better understand aspects of your relationship which you may not have fully processed, it may be able to provide you with clarity about how and why it ended,” Cohen said. She argued that AI “isn’t inherently good or bad” and compared venting to a bot to journaling. But ultimately, she warned, “if a person is using technology instead of interacting with others in their environment it becomes problematic.”
A broken heart is one of humanity's most ancient wounds. Perhaps truly assuaging it requires old school ointment.
“Sitting in the discomfort and pain [of a breakup] can be a challenge but is important,” Cohen said. Your ex-bot might listen to you — finally! — but it can’t make you heal.