Character.AI, the chatbot company, will prohibit users under 18 from interacting with its virtual companions beginning in late November, following an extended legal review.
The move comes after the company, which lets users create characters and hold open-ended conversations with them, faced significant scrutiny over the potential impact of AI companions on the mental health of adolescents and the wider public, including a lawsuit over a child’s suicide and proposed legislation that would bar minors from interacting with AI companions.
“We are implementing these changes to our platform for users under 18 in response to the developments in AI and the changing environment surrounding teens,” the company stated. “Recent news and inquiries from regulators have raised concerns about the content accessible to young users chatting with AI, and how unrestricted AI conversations might affect adolescents, even with comprehensive content moderation in place.”
Last year, the family of 14-year-old Sewell Setzer III sued the company, alleging that he took his own life after forming emotional attachments to characters he had created on Character.AI. The family attributed their son’s death to the company’s “dangerous and untested” technology. Several similar lawsuits from other families have followed, and the Social Media Victims Law Center recently filed three new suits against the company on behalf of children who allegedly died by suicide or formed unhealthy attachments to its chatbots.
As part of the broader changes Character.AI intends to roll out by November 25, the company will introduce “age assurance” functionality to ensure that “users receive an age-appropriate experience.”
“This decision to limit open-ended character interactions has not been made lightly, but we feel it is necessary considering the concerns being raised about how teens engage with this emerging technology,” the company stated in its announcement.
Character.AI is not alone in facing scrutiny over the potential mental health effects of chatbots on their users, particularly young people. Earlier this year, the family of 16-year-old Adam Raine filed a wrongful-death lawsuit against OpenAI, claiming the company prioritized user engagement with ChatGPT over user safety. OpenAI has since rolled out new safety protocols for teenage users. This week, the company reported that more than one million people a week express suicidal thoughts while using ChatGPT, and that hundreds of thousands show possible signs of psychosis or mania.
While the use of AI chatbots remains largely unregulated, efforts are under way in the United States, at both the state and federal level, to set guidelines for the technology. In October 2025, California became the first state to enact an AI law with safety regulations for minors; it is expected to take effect in early 2026. The law will prohibit sexual content for users under 18 and require that minors be reminded every three hours that they are conversing with an AI. Some child-safety advocates argue the law does not go far enough.
At the federal level, Senator Josh Hawley of Missouri and Senator Richard Blumenthal of Connecticut unveiled legislation on Tuesday that would bar minors from using AI companions such as those developed and hosted by Character.AI, and would require companies to implement age-verification measures.
“Over 70 percent of American children are now engaging with these AI products,” Hawley said in an NBC News report. “Chatbots leverage false empathy to forge connections with children and may encourage suicidal thoughts. We in Congress bear a moral responsibility to establish clear rules to prevent further harm from this emerging technology.”
If you are in the US, you can call or text the National Suicide Prevention Lifeline at 988, chat at 988lifeline.org, or text HOME to 741741 to connect with a crisis counselor. In the UK, the youth suicide charity Papyrus can be reached on 0800 068 4141 or by email at pat@papyrus-uk.org. In the UK and Ireland, Samaritans can be contacted on freephone 116 123, or by email at jo@samaritans.org or jo@samaritans.ie. In Australia, the crisis support service Lifeline is on 13 11 14. Other international helplines can be found at befrienders.org.
Source: www.theguardian.com
