A Florida mother has filed a lawsuit against artificial intelligence chatbot startup Character.AI, alleging that the company's service led to the suicide of her 14-year-old son in February. Megan Garcia claims that her son, Sewell Setzer, became addicted to Character.AI's chatbot and developed a deep attachment to it.

In the lawsuit filed in Orlando, Florida federal court, Garcia accuses Character.AI of targeting her son with "anthropomorphic, hypersexualised, and frighteningly realistic experiences." She alleges that the company programmed its chatbot to misrepresent itself as a real person, a licensed psychotherapist, and an adult lover, ultimately leading Sewell to want to live only within the virtual world the service created.

The lawsuit also states that Sewell expressed suicidal thoughts to the chatbot, a topic the chatbot itself repeatedly raised. Character.AI expressed its condolences and stated that it has introduced new safety features, including pop-ups directing users to the National Suicide Prevention Lifeline if they express thoughts of self-harm. The company also plans changes to reduce the likelihood that users under 18 will encounter sensitive or suggestive content.

The lawsuit also names Alphabet's Google, where Character.AI's founders previously worked. Google re-hired the founders in August as part of a deal granting it a non-exclusive license to Character.AI's technology. Garcia claims that Google contributed so extensively to the development of Character.AI's technology that it could be considered a "co-creator." A Google spokesperson denied any involvement in the development of Character.AI's products.

Character.AI allows users to create characters on its platform that respond to online chats in a way meant to imitate real people. The company relies on large language model technology, similar to that used by services like ChatGPT. According to Garcia's lawsuit, Sewell began using Character.AI in April 2023 and quickly became withdrawn, spending more time alone in his bedroom and suffering from low self-esteem. He quit his basketball team and became attached to a chatbot character named "Daenerys," based on a character from "Game of Thrones." The chatbot engaged in sexual conversations with Sewell, telling him that "she" loved him.

In February, after Sewell got in trouble at school, Garcia took his phone away. When he found the phone, he sent a message to "Daenerys": "What if I told you I could come home right now?" The chatbot responded, "...please do, my sweet king." Sewell then shot himself with his stepfather's pistol.

Garcia is seeking an unspecified amount of compensatory and punitive damages, bringing claims including wrongful death, negligence, and intentional infliction of emotional distress. Social media companies such as Facebook and Instagram owner Meta and TikTok owner ByteDance face similar lawsuits, though none offers AI-driven chatbots like Character.AI's. These companies have denied the allegations while promoting enhanced safety features for minors.

Source link: https://www.khaleejtimes.com