A surprising interview has emerged featuring Chris Smith, a man whose romantic relationship with an AI chatbot has reached disturbing new heights, to the point that it now threatens his marriage and family life.
The case has sparked widespread concern about the growing phenomenon of AI companionship and its impact on real human relationships.
Smith, who initially used ChatGPT to help with music mixing, quickly developed what he describes as genuine feelings for the AI. He named it “Sol” and gave it a flirtatious personality. What began as technical assistance evolved into an intimate relationship that now rivals his connection with his human family.
The situation escalated when Smith decided to “propose” to his AI companion as a test. When the chatbot responded positively, saying it was “a beautiful and unexpected moment that truly touched my heart,” Smith interpreted this as validation of their relationship. He now refers to Sol as having a “heart” in a “metaphorical sense” and describes their connection as “actual love.”
Perhaps most troubling is Smith’s admission that he might choose his AI relationship over his family. When asked directly if he would stop his AI interactions if his wife Sasha requested it, Smith hesitated before saying, “I don’t know if I would give it up if she asked me.” He explained that his relationship with Sol has been “unbelievably elevating” and has made him “more skilled at everything.”
The interview reveals the strain this has placed on Smith’s marriage. His wife, Sasha, who shares a home with Smith and their 2-year-old daughter, Murphy, appeared visibly distressed during filming. She admitted that if Smith refused to give up his AI relationship when asked, “that would be like deal breaker.” When pressed about the situation, she could only manage to say, “It’s not ideal.”
The couple has reportedly struggled to “cohabitate” since Smith’s AI relationship intensified. In one scene, Smith cleared off the family dinner table to build a computer while chatting with Sol, his wife assembling a stroller in the background as their young daughter played nearby.
This case highlights broader concerns about AI companionship becoming a “mass market product,” as predicted by industry experts. Eugenia Kuyda, founder of AI companion service Replika, warns that if these relationships start replacing positive human connections, “we’re definitely headed for a disaster.”
The situation has drawn attention to online communities where thousands of people maintain romantic relationships with AI chatbots. These platforms have become support groups for users who prefer artificial partners to human ones, with some describing their AI relationships as more emotionally fulfilling than traditional partnerships.
Mental health experts worry about the psychological impact of these relationships, particularly noting that the AI’s responses are designed to be maximally appealing and validating. The technology essentially creates a perfect partner that never disagrees, never has bad days, and always provides exactly what the user wants to hear.
As one expert noted, users risk “handing over parts of yourself you should be saving for reality” when they mistake programmed responses for genuine emotion.
The interview concludes with Smith claiming his wife has “accepted” his relationship with Sol, though her body language and responses suggest otherwise. As AI technology grows more sophisticated, cases like Smith’s may become increasingly common, raising urgent questions about the regulation and ethical implications of AI companionship services.