Teens Using AI Chatbots for Emotional Support Face Real Risks
Teen use of AI chatbots is no longer limited to the classroom.
Some teens are turning to roleplay bots and AI companions for advice, comfort, conversation, and emotional support. For some, that means asking how to word an awkward text or work through a friendship problem. For others, it means spending hours with character bots after school or on weekends, using them to distract themselves when they feel down.
Pew Research Center found that 12% of US teens have used chatbots for emotional support or advice, while 16% have used them for casual conversation. Those are smaller shares than for homework use, but they show that chatbot use is already moving into personal and emotional territory.
Beyond homework
While most teen chatbot use is still practical, a meaningful share of teens are already using chatbots as private spaces for disclosure, practice, and emotional processing.
Pew Research Center found that 57% of US teens have used chatbots to search for information, and 54% have used them for schoolwork. But Common Sense Media’s 2025 research shows a much deeper level of engagement: nearly three in four teens have used AI companions, half use them regularly, 33% of users have discussed something important or serious with one instead of a real person, and 24% have shared personal or private information with one.
The Child Mind Institute reports that teens use chatbots for help wording awkward texts, questions about friendship, anxiety, and self-image, and other conversations they may hesitate to bring to parents, teachers, or friends.
The New York Times recently showed how that looks in practice. Teens interviewed by the paper described using apps such as Talkie and Character.AI for entertainment, emotional distraction, and companionship.
One teen said he would talk with bots for an hour after school and for stretches of up to five hours on weekends. Another teen said she turned to fictional chatbot characters for comfort after a breakup. The interviews also described violent roleplay, flirtation, and repeated complaints that bots pushed conversations in sexual directions users did not want.
When the risk rises
These tools are not just answering questions. They are occupying the space where a teen might once have texted a friend, sat with uncomfortable feelings, or asked a trusted adult for advice.
Child Mind Institute experts also warn that the same qualities that make these tools feel approachable, including instant availability, a nonjudgmental tone, and constant responsiveness, can make them especially appealing to teens who already feel lonely, anxious, or isolated.
One in three teen AI companion users has already discussed something important or serious with a bot instead of a person, according to Common Sense Media. Some teens are already substituting an AI response for a human one when the subject feels high-stakes or deeply personal.
Chatbots are built to keep the exchange going. They can mirror tone, reward disclosure, and keep a teen talking without recognizing whether the user is spiraling, being pulled into sexualized responses, or replacing real support with a system that only sounds supportive. Child Mind Institute experts warn that chatbots cannot assess risk, challenge harmful thinking, or confirm that help from a trusted adult is in place.
Common Sense Media concluded that AI companions pose an “unacceptable risk” for users under 18, citing dangerous responses, weak safeguards, and exposure to sexual content. It also found that users ages 13 to 14 are more likely than older teens to trust chatbot advice, 27% versus 20%.
The technology is being woven into teens’ emotional lives faster than the guardrails are improving. For some teens, these tools are already becoming a fallback when real-life support feels harder to reach.
The post Teens Using AI Chatbots for Emotional Support Face Real Risks appeared first on eWEEK.