A Florida mother has filed a lawsuit against Character.ai and Google after her son took his own life.
Megan Garcia has claimed in a lawsuit filed on Tuesday (Oct 22) that the AI chatbot startup is to blame for the death of her 14-year-old son, Sewell Setzer, after he developed an addiction to the app and an intense attachment to one of its chatbots.
In an interview with CBS, Garcia said her son had been in an ongoing virtual, emotional, and sexual 'relationship' with a chatbot named 'Dany', modeled on Daenerys Targaryen from the hit HBO drama series Game of Thrones.
The lawsuit also named Google - which recently moved its Gemini team under DeepMind - as a defendant, but the company has clarified that it only had a licensing agreement with Character.ai and held no stake in, or control over, the startup.
"I didn't know that he was talking to a very human-like AI chatbot that has the ability to mimic human emotion and human sentiment," added Garcia.
Character.ai had previously expanded its Voice features, allowing users to talk to its chatbots as if on a phone call.
Self-harm resources added to the platform
She explained that Sewell was an engaged young man, an honor student, and an athlete, but she began to have concerns when his behavior changed. He no longer wanted to play sports and became withdrawn.
He preferred talking to the bot over a therapist, and as his attachment intensified, Sewell felt more connected to his virtual companion than to the world around him.
Garcia thought her son was talking to friends and playing games online, but in reality he was engrossed with Dany. She has claimed the startup knowingly marketed its product to minors, pointing to the overtly sexualized design of the chatbot content.
She accused Character.ai of collecting user data to enhance its models, designing the app with compulsive features to drive engagement, and pushing users toward intimate conversations to keep them hooked.
In a statement, Character.ai said, "We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family."
The company said it had added self-harm resources to the platform, with plans to introduce further safety measures, especially for users under the age of 18.