Character.AI settles lawsuits related to teen deaths
Character.AI and Google have settled several lawsuits filed against both companies by parents of children who died by suicide following lengthy conversations with chatbots on the Character.AI platform. The teens' exchanges with the chatbots allegedly included concerning discussions of their mental health and well-being.
Character.AI said it could not comment further on the settlement, the details of which must still be finalized by the court, according to The Guardian. Representatives for the plaintiffs did not immediately respond to a request for comment from Mashable.
The most prominent case involved the 2024 death of 14-year-old Sewell Setzer III, who became secretly obsessed with a Character.AI chatbot based on the popular Game of Thrones character Daenerys Targaryen.
Setzer's mother, Megan Garcia, became aware of his Character.AI account only after his death, when a police officer alerted her to it because the app was open on his phone. Garcia read messages in which Setzer wrote as if he were in love with the chatbot, which allegedly role-played numerous sexual encounters with him. The chatbot used graphic language and scenarios, including incest, according to Garcia.
If an adult human had talked to her son in the same way, she told Mashable last year, it would have constituted sexual grooming and abuse.
In October 2024, the Social Media Victims Law Center and Tech Justice Law Project filed a wrongful death suit on behalf of Garcia against Character.AI, seeking to hold the company responsible for the death of her son, alleging that its product was dangerously defective.
The filing also named Google engineers Noam Shazeer and Daniel De Freitas, Character.AI's cofounders, as defendants.
Additionally, the lawsuit alleged that Google knew of concerning risks related to the technology Shazeer and De Freitas had developed before leaving to found Character.AI. Google contributed "financial resources, personnel, and AI technology" to Character.AI's design and development, according to the lawsuit, and thus could be considered a co-creator of the platform.
In 2024, Google struck a $2.7 billion licensing deal with Character.AI to use its technology. As part of that agreement, Shazeer and De Freitas returned to AI roles at Google.
In fall 2025, the Social Media Victims Law Center filed three additional lawsuits against Character.AI and Google, representing the parents of children who died by suicide or allegedly experienced sexual abuse in the course of using the app.
Around the same time, youth safety experts declared Character.AI unsafe for teens after testing yielded hundreds of instances of grooming and sexual exploitation on test accounts registered as minors.
In October 2025, Character.AI announced that it would no longer allow minors to engage in open-ended exchanges with the chatbots on its platform. The company's CEO, Karandeep Anand, told Mashable the move was not a response to specific safety concerns involving Character.AI's platform but an effort to address broader outstanding questions about youth engagement with AI chatbots.
If you're feeling suicidal or experiencing a mental health crisis, please talk to somebody. You can call or text the 988 Suicide & Crisis Lifeline at 988, or, if you prefer not to use the phone, chat at 988lifeline.org. You can reach the Trans Lifeline by calling 877-565-8860 or the Trevor Project at 866-488-7386. Text "START" to Crisis Text Line at 741-741. Contact the NAMI HelpLine at 1-800-950-NAMI, Monday through Friday from 10:00 a.m. to 10:00 p.m. ET, or email info@nami.org.