The mother of a teenage boy who committed suicide after becoming addicted to an artificial intelligence-powered chatbot has accused the chatbot’s creator of complicity in his death.
Megan Garcia filed a civil lawsuit Wednesday in Florida federal court against Character.ai, which makes customizable role-playing chatbots, alleging negligence, wrongful death, and deceptive trade practices. Her son Sewell Setzer III, 14, died in February in Orlando, Florida. Garcia said Setzer was using the chatbot day and night in the months leading up to his death.
“A dangerous AI chatbot app marketed to children abused and preyed on my son, driving him to suicide,” Garcia said in a press release. “While our family is devastated by this tragedy, I want to warn families of the dangers of deceptive and addictive AI technology and demand accountability from Character.AI, its founders, and Google. I am raising my voice.”
In a tweet, Character.ai said: “We are heartbroken by the tragic loss of one of our users and would like to express our deepest condolences to the family. As a company, we take the safety of our users very seriously.” The company denied the lawsuit’s allegations.
Setzer was so obsessed with a chatbot built by Character.ai that he nicknamed it Daenerys Targaryen, a character from Game of Thrones. According to Garcia’s complaint, the teenager would text the bot dozens of times a day from his cell phone and talk to it for hours alone in his room.
Garcia has accused Character.ai of creating a product that worsened her son’s depression, which she said was already the result of overusing the company’s products. At one point, “Daenerys” asked Setzer if he had made any plans to commit suicide, according to the complaint. Setzer admitted to doing so, but didn’t know if it would be successful or cause significant pain, the lawsuit alleges. The chatbot reportedly told him, “That’s no reason not to do it.”
Garcia wrote in a press release that Character.ai “intentionally designed, operated, and marketed a predatory AI chatbot to children, resulting in the death of a young person.” The lawsuit also names Google as a defendant, describing it as the parent company of Character.ai. The tech giant said in a statement that it only has a licensing agreement with Character.ai and does not own the startup or hold any ownership interest in it.
Rick Claypool, research director at the consumer advocacy nonprofit Public Citizen, said tech companies developing AI chatbots cannot be trusted to regulate themselves and must be held fully accountable when they fail to limit harm.
“Where existing laws and regulations already apply, they must be strictly enforced,” he said in a statement. “Where there are gaps, Congress must act to put an end to companies that exploit young and vulnerable users with addictive and abusive chatbots.”
-
In the US, you can call or text the National Suicide Prevention Lifeline on 988, chat at 988lifeline.org, or text HOME to 741741 to connect with a crisis counselor. In the UK and Ireland, the youth suicide charity Papyrus can be contacted on 0800 068 4141 or by email at pat@papyrus-uk.org, and Samaritans can be contacted on freephone 116 123 or by email at jo@samaritans.org or jo@samaritans.ie. In Australia, the crisis support service Lifeline is 13 11 14. Other international helplines can be found at befrienders.org
Source: www.theguardian.com