The CEO of Microsoft AI says AI chatbots are a powerful way for humans to offload emotions and 'detoxify ourselves'
- Microsoft AI CEO Mustafa Suleyman says AI chatbots are a way to "detoxify ourselves."
- The rise of ChatGPT therapy is a concern for some tech leaders, such as OpenAI's Sam Altman.
- But Suleyman said AI can make people "feel seen" in a way that most other humans cannot.
The CEO of Microsoft AI says chatbots are a good way to offload emotions and "detoxify ourselves."
Appearing on an episode of Mayim Bialik's "Breakdown" podcast, which was released on December 16, Mustafa Suleyman said companionship and support have become some of AI's most popular use cases.
People are using AI chatbots for everything from navigating breakups to solving disagreements with family members, he said.
"That's not therapy," Suleyman said. "But because these models were designed to be nonjudgmental, nondirectional, and with nonviolent communication as their primary method, which is to be even-handed, have reflective listening, to be empathetic, to be respectful, it turned out to be something that the world needs."
The upside of this, he said, is "this is a way to spread kindness and love and to detoxify ourselves so that we can show up in the best way that we possibly can in the real world, with the humans that we love."
Suleyman cofounded DeepMind in 2010. The company was acquired by Google in 2014.
On the podcast, he said people need a space to "ask a stupid question, repeatedly, in a private way, without feeling embarrassed."
Over time, he said, chatbots can make people "feel seen and understood" in a way that many other humans can't, outside of partners or close friends.
Not everyone in tech, however, is enthusiastic about chatbots being used as therapy stand-ins. OpenAI CEO Sam Altman is one of those voices. In August 2025, he expressed his discomfort over people relying on chatbots to make major life choices.
"I can imagine a future where a lot of people really trust ChatGPT's advice for their most important decisions," Altman wrote on X. "Although that could be great, it makes me uneasy."
In July 2025, while appearing on "This Past Weekend with Theo Von," Altman also flagged the potential legal risks of offloading emotions onto a chatbot. He said OpenAI could be required to produce its users' therapy-style chats in a lawsuit.
Mental health professionals have voiced concerns over the rise of ChatGPT therapy, too. Speaking to Business Insider senior health reporter Julia Pugachevsky in March 2025, two therapists said that relying on AI chatbots for emotional support could exacerbate loneliness and make people dependent on seeking reassurance.
Suleyman acknowledged some of those downsides on the podcast, saying there is "definitely a dependency risk," and that chatbots can sometimes be overly flattering or "sycophantic."
Suleyman isn't the only tech leader who sees AI's therapeutic potential. In a May 2025 interview with the Stratechery newsletter, Meta CEO Mark Zuckerberg said he believed everyone should have a therapist.
"For people who don't have a person who's a therapist," Zuckerberg said, "I think everyone will have an AI."