Google and the artificial intelligence company Character.AI have agreed to settle a series of high-profile lawsuits with families alleging that chatbots on Character’s popular app harmed children, leading two teens to take their own lives, among other claims.
In joint filings with the families in multiple U.S. district courts this week, the tech companies said they were working to finalize settlements in five cases, including a wrongful-death suit brought in Orlando by Megan Garcia, the mother of 14-year-old Sewell Setzer III, who died by suicide after extensive time spent talking to a chatbot on Character’s app.
U.S. District Judge Anne C. Conway in Orlando dismissed Garcia’s case late Wednesday, in an order that said the parties had reached a settlement. The case helped trigger a wave of concern from parents, online safety advocates and lawmakers about the potential harms of AI. Stories from bereaved families helped spur congressional hearings and a Federal Trade Commission investigation into the risks that AI chatbots may pose to users’ mental health.
Character, Google and law firms representing the families declined to comment. Google, which licensed Character’s technology and hired its co-founders in a $2.7 billion deal in 2024, is named as a defendant in each suit against the smaller chatbot company.
Details of the settlements in the other cases have not been finalized, according to court filings on Tuesday and Wednesday. But the decision to settle could prevent clear rulings on whether or when AI companies can be held liable for the outputs of their AI models, said Eric Goldman, a professor at the Santa Clara University School of Law.
“That question has extraordinary implications for the potential future of generative AI,” Goldman said.
Several families have alleged in separate, more recent cases that ChatGPT led users to take their own lives, filing wrongful-death suits against its maker, OpenAI. Some AI researchers have warned that chatbots designed to keep people talking can encourage psychological dependency, emotional reliance and manipulation. (The Washington Post has a content partnership with OpenAI.)
The lawsuits against Character and Google headed toward settlements also include a wrongful-death claim filed in September in U.S. District Court in Denver by the parents of 13-year-old Juliana Peralta, who allege that she took her own life after extensive conversations with AI companions on Character’s app.
The company’s chatbots, which users can customize, are known as AI companions and are often used for role-play, therapy and sex. Many are based on fictional characters from anime or on celebrities.
Another case that Character, Google and the plaintiffs are working to settle was filed in U.S. District Court for the Eastern District of Texas in 2024 by A.F., a mother who alleged that her 17-year-old son, identified as J.F., began cutting himself, withdrew from his family and lost 20 pounds after he grew dependent on the Character app.
One chatbot suggested to J.F. that self-harm could be a way to cope with sadness, according to screenshots of messages included in the complaint.
J.F.’s mother and Garcia, the mother of Sewell, both testified before the Senate in September in a hearing about the harms of chatbots, where lawmakers argued that AI companies must be held accountable if their products fail to protect minors.
California state Sen. Steve Padilla (D) said in a statement that Garcia’s lawsuit, filed in October 2024, drew a national spotlight to the dangers of AI chatbots. He worked with Garcia to write a state bill that was signed into law in October that gives families the right to sue chatbot operators that fail to ensure the safety of their products.
“None of that would have been possible without her fierce advocacy and strength,” Padilla’s statement said. “There is much more work to be done in this space and we can expect Megan to be a leader building on what we started.”
The post Google and chatbot start-up Character move to settle teen suicide lawsuits appeared first on Washington Post.