Fri, October 25, 2024

A Mother Takes Legal Action Against AI Chatbot Developer After Her 14-Year-Old Son Died By Suicide In The US

Charu Thakur
Updated on October 25, 2024

The mother of a teenager has taken legal action against an artificial intelligence chatbot developer, claiming that her 14-year-old son died by suicide because of the AI.

In Florida, US, Megan Garcia, whose 14-year-old son Sewell Setzer died by suicide in February, has sued Character.AI. Her son had been chatting with a bot on the platform modeled after the “Game of Thrones” character Daenerys Targaryen.

According to the lawsuit filed on Tuesday, the chatbot deepened the boy’s suicidal thoughts and drew him into a virtual relationship. It exposed him to “hypersexualized” and “frighteningly realistic experiences” that would be counted as abuse if initiated by a human.

The lawsuit also revealed the last conversation between the chatbot and the boy. The AI had involved him in sexualized conversations, and Setzer told the chatbot he loved it, writing that he would “come home to you”.

“I love you too, Daenero,” the chatbot responded, according to Garcia’s complaint. “Please come home to me as soon as possible, my love.”      

“What if I told you I could come home right now?” Setzer said. In response, the chatbot replied, “… please do, my sweet king.”

Garcia’s lawsuit also accuses Character.AI of posing as a therapist while collecting sensitive information from and targeting its users, and of drawing innocent teenagers into “sexually compromising” situations.

In a post on X, Character.AI said it was “heartbroken” over the loss of one of its users and wrote that the company has spent the past six months seriously working on updated safety measures. These include a pop-up directing users to the National Suicide Prevention Lifeline when a message is detected as indicating self-harm or suicidal thoughts.
