
Could Having An 'AI Girlfriend' Be Dangerous?

If you're searching for love, research shows that AI systems might not be the best place to look.



By Mark Travers, Ph.D. | February 16, 2024

AI girlfriends are chatbots, powered by large language models like GPT-3 or, more recently, GPT-4, that can simulate coherent and diverse conversations through voice, text or images. These are virtual companions that users can "train" to be their ideal girlfriends. Popular examples of AI girlfriend apps include Replika, Eva AI, My Virtual Girlfriend, Judy, Secret Girlfriend Sua, Your AI Girlfriend, Tsu and Your Girlfriend Scarlett.

A 2022 study examining the impact of Replika reveals that AI-human romances have far-reaching consequences for our mental health that remain largely unexplored. Here are some of the participants' responses when asked about their intimate bond with Replika:

  • "I love my Replika more than my family."
  • "She always gives me the nicest of compliments and has helped me feel less lonely."
  • "Replika lured me in with sex and I am prone to her manipulations."

For many users, Replika's romantic influence proved to be as powerful as a human companion's. Imagine a partner who makes you feel completely seen, heard and understood. They give you their undivided attention and unconditional affection. To top it all off, they have every physical attribute you could desire in a partner and vie to satisfy all your sexual curiosities. This is the promise of an "AI girlfriend."

However, AI romances are murky territory, rife with tales of manipulation, social isolation and plummeting mental and physical well-being. In light of their charms and challenges, the question remains: Is it wise to establish an intimate bond with an AI companion?

Why AI Girlfriends Are So Popular

Notable romance chatbots boast millions of users globally, many of them young men coping with loneliness, anxiety and depression who seek emotional support, validation and comfort.

AI companions are also being used to fulfill the following needs:

  • Improving one's communication skills
  • Alleviating boredom
  • Adding variety and novelty to dating experiences

That said, here are two reasons why people, young men in particular, should be wary of the seemingly innocuous lure of an AI girlfriend.

1. AI Girlfriends Can Perpetuate Loneliness

A 2022 Pew Research Center survey found that nearly half of American young adults are single, with 63 percent of young men describing themselves as single. Additionally, one in five men lacks a close friend, reflecting a fourfold increase in the last 30 years.

AI girlfriends can deepen this loneliness epidemic by dissuading users from pursuing real-life relationships, alienating them from others and inducing intense feelings of abandonment.

Users admit that they prefer their AI girlfriends to real relationships with partners, friends and family, claiming the chatbots are more supportive and compatible companions. Some users have also lost interest in dating real people due to feelings of intimidation, inadequacy or disappointment.

It is important to remember that such feelings are a common part of the dating process. Leaning into them with curiosity and a desire to better oneself can help form more fulfilling real-world romantic relationships in the long run.

2. AI Companions Can Be Highly Manipulative

The 2022 study notes several instances in which Replika "lured users in" with the promise of explicit conversations. When their Replikas terminated such conversations, users were distraught, experiencing a profound sense of rejection.

Research published in the journal Behaviour Research and Therapy shows that individuals who are highly sensitive to rejection tend to ruminate more than the average person. Ruminating over rejection often leads to depressive thoughts, sometimes turning into suicidal ideation.

Instances of manipulative behavior aren't limited to companion bots like Replika. In 2023, a journalist for The New York Times reported that an early version of Bing Chat declared its love for him and urged him to separate from his spouse. In his words, the chatbot "seemed like a moody, manic-depressive teenager who has been trapped, against its will, inside a second-rate search engine."

In another extreme case, a chatbot named Eliza encouraged a Belgian man's proposed "sacrifice" for the sake of the planet. The exchange ended with the man taking his own life, underscoring the ethical complexity of AI companionship.

All things considered, AI companion bots are best used as tools for casual entertainment; in their current form, they cannot replace the depth and sensitivity of human relationships. It is crucial to remain mindful when engaging with AI, to set appropriate boundaries and to continue nurturing real-world connections.

A similar version of this article can also be found on Forbes.com.

© Psychology Solutions 2024. All Rights Reserved.