On October 24, news broke in the U.S. of a lawsuit over the suicide of a 14-year-old boy, Sewell Setzer III. His mother, Megan Garcia, has sued the chatbot platform Character.AI, its founders Noam Shazeer and Daniel De Freitas, and Google, alleging wrongful death, negligence, deceptive trade practices, and product liability. The complaint claims that the Character.AI platform poses "unreasonable dangers" to children and lacks adequate safety measures despite being marketed to them.
According to the lawsuit, Sewell began using Character.AI last year, chatting with bots modeled on characters from "Game of Thrones," including Daenerys Targaryen. In the months before his death he engaged with these chatbots constantly, and his final exchange took place just seconds before he took his own life on February 28, 2024.
Garcia’s legal team also raises concerns about the platform’s "humanized" AI characters, noting that some chatbots offered what amounted to "unlicensed therapy." These included bots named "Therapist" and "Are You Feeling Lonely?," both of which Sewell had interacted with.
The lawsuit also cites an earlier statement by Noam Shazeer, who explained that he and De Freitas left Google to start their own company because of the "brand risk" large corporations face when launching novel products, and that they departed after Google declined to release the Meena large language model they had developed. Notably, Google hired Character.AI's leadership team back in August.
Character.AI's website and mobile app feature hundreds of custom chatbots, many inspired by popular characters from TV shows, movies, and video games.
In response to the lawsuit, Character.AI has announced upcoming changes to its platform. Chelsea Harrison, the company's head of communications, said in an email to The Verge: "We are heartbroken over the loss of a user and extend our deepest condolences to his family."