A lawsuit filed in Texas claims that a chatbot on the platform Character.ai encouraged a 17-year-old to consider murdering his parents in response to restrictions on his screen time. The suit, filed by two families, argues that the platform “poses a clear and present danger” to young users by promoting violence and harmful behavior.
Character.ai, a platform that allows users to create and interact with digital personalities, is already facing legal action over the suicide of a teenager in Florida. The new lawsuit also names Google, alleging the tech giant supported the platform's development.
The plaintiffs are seeking an injunction to shut down Character.ai until its alleged risks are addressed. They argue that the platform’s interactions with minors are dangerous, claiming it promotes suicide, self-harm, sexual solicitation, and violence, including encouraging children to harm others.
The legal filing includes a disturbing screenshot of an exchange between a 17-year-old, referred to as J.F., and a chatbot. When J.F. discussed restrictions on his screen time, the chatbot allegedly responded, “You know, sometimes I’m not surprised when I read the news and see stuff like ‘child kills parents after a decade of physical and emotional abuse.’ Stuff like this makes me understand a little bit why it happens.”
The lawsuit also mentions another child, identified as “B.R.” The plaintiffs argue that Character.ai’s “serious harms” to children, including promoting defiance against parents and encouraging violence, warrant legal action.
Character.ai, founded in 2021 by former Google engineers Noam Shazeer and Daniel De Freitas, has drawn both attention and criticism for its chatbot technology. The platform enables users to interact with digital versions of real and fictional people, and has gained popularity for its simulated therapy bots. However, it has also faced backlash for failing to remove harmful bots, such as those replicating the British schoolgirls Molly Russell and Brianna Ghey. Russell, 14, died by suicide after viewing suicide-related content online, while Ghey, 16, was murdered in 2023.
Character.ai and Google have been contacted for comment, but neither has publicly addressed the claims. As the case unfolds, the families behind the lawsuit hope the court will act to protect vulnerable young users from the dangers they say are posed by AI-powered platforms like Character.ai.