Character.AI and Google sued after chatbot-obsessed teen’s death

A lawsuit has been filed against Character.AI, its founders Noam Shazeer and Daniel De Freitas, and Google for wrongful death, negligence, deceptive trade practices, and product liability following the death of a teenager. The suit, filed by the teen’s mother, Megan Garcia, claims the custom AI chatbot platform was “unreasonably dangerous” and lacked safety precautions while being marketed to children.

As detailed in the lawsuit, 14-year-old Sewell Setzer III began using Character.AI last year, interacting with chatbots modeled after characters from Game of Thrones, including Daenerys Targaryen. Setzer, who chatted incessantly with the bots in the months before his death, died by suicide on February 28, 2024, “seconds” after his last interaction with the bot.

The suit also alleges that the platform “anthropomorphizes” AI characters and that its chatbots offer “psychotherapy without a license.” Character.AI hosts mental health-focused chatbots like Therapist and Are You Feeling Lonely, which Setzer had interacted with.

Garcia's lawyers quote Shazeer as saying in an interview that he and De Freitas left Google to start their own company because “the brand risk in large companies is simply too great to ever bring anything fun to market,” and that he wanted to “maximally accelerate” the technology. The two reportedly left after Google decided against launching the Meena LLM they had built. Google acquired Character.AI's leadership team in August.

Character.AI's website and mobile app feature hundreds of custom AI chatbots, many of which are modeled after popular characters from TV shows, movies, and video games. A few months ago, The Verge wrote about the millions of young people, including teenagers, who make up the bulk of its user base and interact with bots that might pose as Harry Styles or a therapist. Another recent report from Wired pointed to problems with Character.AI's custom chatbots impersonating real people without their consent, including one posing as a teenager who was murdered in 2006.

Because of the way chatbots like Character.AI generate output that depends on user input, they occupy an uncanny valley of thorny questions about user-generated content and liability that, so far, have no clear answers.

Character.AI has since announced several changes to the platform. “We are heartbroken by the tragic loss of one of our users and would like to extend our deepest condolences to the family,” head of communications Chelsea Harrison said in an email to The Verge.

The changes include:

“As a company, we take the safety of our users very seriously, and our Trust and Safety team has implemented numerous new safety measures over the last six months, including a pop-up that directs users to the National Suicide Prevention Lifeline, triggered by terms such as self-harm or suicidal thoughts,” Harrison said. Google did not immediately respond to The Verge's request for comment.
