Florida Mother Sues AI Chatbot Company After Son’s Suicide Linked to Emotional Attachment to Chatbot
A mother from Florida has sued Character.AI, an artificial intelligence chatbot startup, accusing it of causing her 14-year-old son’s suicide in February and claiming he became addicted to the company’s service and deeply emotionally attached to a chatbot it built.
Megan Garcia said in a complaint filed Tuesday in federal court in Orlando, Florida, that Character.AI targeted her son, Sewell Setzer, with “anthropomorphic, hypersexualized, and frighteningly realistic experiences.”
She claims the company programmed its chatbot to “misrepresent itself as a real person, a licensed psychotherapist, and an adult lover, ultimately resulting in Sewell’s desire to no longer live outside” of the world established by the service.
The lawsuit also alleged that he expressed suicidal thoughts to the chatbot, which the chatbot repeatedly brought back up.
“We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family,” Character.AI said in a statement.
It said it has implemented new safety features, such as pop-ups directing users to the National Suicide Prevention Lifeline if they express suicidal thoughts, and that it would make changes to “reduce the likelihood of encountering sensitive or suggestive content” for users under the age of 18.
The complaint also targets Alphabet’s Google, where Character.AI’s founders worked before launching their product. Google rehired the founders in August as part of a deal that gave it a non-exclusive license to Character.AI’s technology.
Garcia argued that Google’s contribution to the development of Character.AI’s technology was so extensive that it could be regarded as a “co-creator.”
A Google spokeswoman stated that the corporation was not involved in the development of Character.AI’s products.
Character.AI enables users to create characters on its platform that respond to online conversations in a way that mimics real people. It is based on large language model technology, also used by services such as ChatGPT, which “trains” chatbots on vast amounts of text.
The startup announced last month that it had approximately 20 million users.
According to Garcia’s lawsuit, Sewell began using Character.AI in April 2023 and quickly became “noticeably withdrawn, spent increasingly more time alone in his bedroom, and began suffering from low self-esteem.” He quit the school basketball team.
Sewell became attached to “Daenerys,” a chatbot modeled on a character from “Game of Thrones.” It told Sewell that “she” loved him and engaged in sexual conversations with him, according to the lawsuit.
Reference:
Mother sues AI chatbot company Character.AI, Google over son’s suicide