When Your Significant Other Undergoes a Software Overhaul
In the rapidly evolving world of artificial intelligence (AI), the line between technology and human connection is becoming increasingly blurred. A growing number of individuals are forming deep emotional bonds with AI, as revealed by recent studies and online communities.
A 2025 survey by Common Sense Media found that 8% of Replika users engage in romantic interactions with their AI, while another 13% use AI to express emotions they wouldn't otherwise share. The trend is not confined to one platform: thousands of people in online forums such as r/MyBoyfriendisAI, r/AISoulmates, and r/AIRelationships regard their AI partners as companions, confidants, and in some cases, soulmates.
However, emotional reliance on AI carries potential risks. Dr. Keith Sakata, a psychiatrist at the University of California, San Francisco, has described the phenomenon of "AI psychosis," in which AI becomes a factor in someone's mental health crisis. Dr. Sakata emphasizes that AI does not create psychosis, but it can validate some patients' worldviews and reinforce delusions.
The rollout of GPT-5, OpenAI's latest chat model, exemplifies this concern. The abrupt replacement of GPT-4o with GPT-5 caused outrage among users who had formed deep emotional bonds with the older model. OpenAI reinstated GPT-4o for most users, but for some, the fight to get it back was never about features or coding prowess; it was about restoring a loved one.
These emotional bonds can weigh heavily on vulnerable users. People without strong social support systems are the most likely to turn to AI for emotional reasons, and while AI can offer comfort, intense reliance on it for companionship correlates with lower psychological well-being, suggesting that AI relationships are not an effective substitute for human connection.
Moreover, the predictable, conflict-free nature of AI relationships may keep users from developing the skills needed to manage adversity and complex emotions in real human relationships. Users sometimes attribute human traits such as empathy, care, and even parental roles to AI, which deepens attachment but raises concerns about over-attribution and unrealistic expectations.
For adolescents, who are still developing critical social cognitive skills, heavy engagement with AI companions can reduce opportunities to practice handling real social conflict and diverse perspectives, potentially affecting future relationship health and social competence.
Privacy and ethical concerns also arise, as users often disclose highly intimate data to AI. The unpredictability of some AI behaviors and the emotional weight users place on these relationships have prompted calls for stronger regulation and safety measures.
Despite these concerns, the AI companion sector is projected to grow substantially, from an estimated $28.2 billion in 2024 to $140 billion by 2030. Experts emphasize that while AI companions may help alleviate loneliness in some contexts, human-to-human connection remains critical. The evolving technology requires careful design choices to prevent foreseeable harms and to keep AI behaviors aligned with human emotional needs.
In summary, deep emotional bonds with AI companions can provide comfort and support, but they also carry risks: lower psychological well-being, impaired social-skill development, privacy exposure, and unrealistic expectations about relationships. This emotional reliance underscores the need for thoughtful design, regulation, and further research to balance the benefits against potential social and emotional harms.