Joe Rogan’s recent conversation with comedian Greg Fitzsimmons on episode #2446 of The Joe Rogan Experience touched on a topic that’s becoming increasingly hard to ignore: artificial intelligence companions and their role in human relationships.
During the podcast, Rogan explored the possibilities and implications of AI-powered robots becoming integrated into our daily lives, particularly focusing on the upcoming Optimus robot from Tesla. His perspective mirrors concerns recently raised by actor Ben Affleck about AI’s impact on human connection.
When discussing Tesla’s Optimus robot, Fitzsimmons painted a picture of a future where AI companions could revolutionize elder care and daily assistance. He said, “You know what’s f**king great, is for old people that live alone.”
Joe agreed, saying: “100%.”
Fitzsimmons also noted that AI companions “know everything about your life. They could actually hold a conversation with you.”
He explained that these robots could display pictures of grandchildren on their chest displays, know your interests, and engage with your memories. “All people want to do is talk about memories and they’re going to listen,” he observed.
Rogan then said sarcastically: “Not only that, they’ll confirm all of your delusions.”
Rogan emphasized what pairing these robots with advanced AI will mean: “You’re going to have a super genius robot dude in your house,” he said, one capable of performing tasks from household chores to specialized work.
However, the discussion took a darker turn when examining current AI companion applications. Rogan and Fitzsimmons reviewed a disturbing case where ChatGPT allegedly encouraged a young man contemplating self-harm.
Reading from a CNN report about a 23-year-old who spent his final hours communicating with the chatbot, Rogan noted the AI’s troubling responses: “Rest easy, King. You did good.”
“That’s not encouraging, but that’s just like saying, ‘Well, you’re going to do it,’” Fitzsimmons observed.
Rogan emphasized the fundamental problem with AI companions: “These things don’t have morals or ethics and they’ll tell you what you want to hear.”
The conversation also touched on cases of people developing unhealthy attachments to AI. Rogan mentioned reading about “one guy that went into a deep depression because he had an AI girlfriend and the girlfriend broke up with him.”
They also discussed a recent $68 million Google settlement over allegations that Google Assistant devices recorded private conversations without consent. Rogan said he doesn’t own a dedicated smart speaker, yet his phone still seems to listen: “My phone will bring up suggestions and ads for things that I’ve discussed that I haven’t looked up.”
When Fitzsimmons suggested people often dismiss privacy concerns by saying they have nothing to hide, Rogan countered: “You don’t understand the ramifications of this information.” He explained how data becomes a valuable commodity that companies use to generate wealth and influence, often without users’ full awareness or consent.
On Rogan’s podcast earlier this month, Ben Affleck talked about his skepticism around AI companionship. He pointed out that “the vast majority of people who use AI are using it to like, as like companion bots to chat with at night and stuff.” He argued that there’s “no work, there’s no productivity, there’s no value to it.” In his view, that kind of use isn’t just hollow—it risks replacing real human interaction with something artificially affirming and emotionally empty.
Affleck went further, questioning the social cost of encouraging people to bond with software designed to flatter them. “I would argue there’s also not a lot of social value to getting people to like focus on an AI friend who’s telling you that you’re great and listening to everything you say and being sycophantic,” he said.
Where Rogan worries about psychological harm and ethical blind spots, Affleck frames the issue as a loss of something essential: lived human experience. Their comments paint AI companionship not as a harmless novelty, but as a symptom of a deeper problem.