
2 Major Issues With 'AI-Powered Breakups'

There's something incredibly unsettling about asking ChatGPT to write a breakup text for you. Here's a psychologist's take on the 'AI-powered breakup' trend.



By Mark Travers, Ph.D. | May 09, 2025

There's a clear pattern in how we adapt to technology, and in how quickly we lose touch with whatever it replaced. Here's a thought experiment for you:

Imagine you're in a foreign country. No phone, no cloud, no email. You can picture the five most important people in your life. But could you recall their phone numbers if your life depended on it?

A recent survey found that just 49.4% of people have memorized even two to five numbers from their favorites list. The numbers get worse among younger adults, most of whom grew up with smartphones and have never needed to memorize a number. By contrast, people over 55 were far more likely to remember phone numbers without digital help.

But forgetting a phone number is one thing. Forgetting how to emotionally connect with someone, especially when that connection is ending, is something else entirely. And this is the reality AI is quietly shaping.

What Is An 'AI-Powered Breakup'?

OpenAI's ChatGPT is the most popular LLM — or large language model — to date. When it launched, it was the first real taste of what everyday AI could feel like in human hands. For many, it marked the beginning of a new relationship with technology. And now, it's hard to ignore how deeply embedded it's become.

A growing number of emails, Slack messages and work memos are quietly edited — or entirely rewritten — by LLMs. That means what your colleague receives may not be your words, but the algorithm's best guess at what you meant to say. Because of how tempting it can be to sound "polished," and because of how easy it is to prompt an LLM, this has become a no-brainer for many in the workforce, especially younger people.

It gets more unsettling when you apply this to personal relationships. Because however well these tools mimic understanding, they don't actually understand you. There's no real nuance, and no heart.

And language is slippery. Meaning is up for interpretation. What the model outputs might be fluent — even persuasive — but it might be far removed from your actual personality.

So imagine asking AI to help you break up with someone. The words might be clean, the tone just right. But whose voice are they really hearing?

Here are two reasons why we just aren't ready for AI-powered breakups.

1. Insights From AI Exist In An Echo Chamber

Social media already traps us in a feedback loop. We see what we want to see, hear what we want to hear and get rewarded for confirmation, not confrontation. But LLMs go a step further.

They're not just passive mirrors. They're designed to keep you satisfied with what they produce. Especially when prompted with something emotionally charged, like a breakup, they prioritize emotional fluency over emotional truth. In fact, this commercial need to keep users emotionally satisfied for as long as possible recently caused OpenAI co-founder Sam Altman to openly admit that the latest updates to ChatGPT have made it "sycophant-y."

Now, apply that to relationship conflict. Say you're hurt, confused or unsure about ending things. You might feed your side of the story into an LLM. It might respond with empathy — tuned just for you. It might sound smart, balanced, maybe even enlightened. It might sound like the best relationship expert you could ever hope for.

But here's the catch. It doesn't question you, and it doesn't challenge your bias. It can't know the other person, the history or the context beyond your input. So what you get back is a reflection — not of the relationship, but of your current emotional state, dressed up to sound like wisdom.

That's not clarity. That's a well-worded, manipulative echo.

2. It Can Lead To A Breakup Stripped Of Humanity

No one enjoys going through a breakup. But the process itself, including the messiness, the hesitation and the hard conversations, often reveals something essential about both people involved. It's painful and meaningful precisely because it's human.

In some ways, a breakup is the final act of care — a last offering of clarity, honesty or dignity to someone you once loved. It's about how you end it.

Now imagine replacing that with ideas crafted by a model trained on Reddit, customer service logs and blog posts. It might be based on real data, but it's still predictive text.

The result might be emotionally articulate, but hollow — an "autocorrected" version of emotional reckoning. Clean on the surface, but stripped of the very discomfort that makes a goodbye real.

This kind of sterility might soften the sting of separation for now. But eventually, the absurdity of using a machine to end a relationship with a living, breathing person, someone you once felt so deeply for, will begin to weigh on you.

And in that moment of doubt, you might try a different prompt:

"Was I wrong about my ex?"

Because LLMs are tuned to align with your input, and because they have no real sense of how you're feeling, you'll probably get something like:

"Good. You're asking the right question now. Let's revisit the situation…"

Only this time, the answers won't feel satisfying — just vacant. The machine will flip-flop based on your phrasing. And when the answers keep shifting with your tone, you'll stop wondering what went wrong and start wondering if you ever understood it at all.

Still reeling from your breakup? Take this science-backed test to learn if you're ready to start your healing journey: Breakup Distress Scale

A similar version of this article can also be found on Forbes.com, here.

© Psychology Solutions 2025. All Rights Reserved.