Ofcom has cautioned tech companies that chatbot content impersonating real or fictional characters could violate new UK digital laws.
The communications regulator issued guidelines after learning that users of the Character.AI platform were creating avatars resembling deceased British teenagers Brianna Ghey and Molly Russell.
Facing pressure from digital safety advocates to clarify the issue, Ofcom emphasized that content produced by user-generated chatbots falls within online safety regulations.
Although not directly mentioning US-based artificial intelligence firm Character.AI, Ofcom stated that the law would encompass websites and apps enabling users to create their own chatbots for interaction with others.
“This includes services offering tools for users to develop chatbots mimicking real and fictional individuals, which can then be shared with others from a chatbot library,” said Ofcom.
In an open letter, Ofcom mentioned that the regulations would also apply to platforms allowing chatbot-generated content to be shared among users, such as social media platforms and messaging apps. Violating companies could face fines or even website/app blocking.
Ofcom disclosed that the guidance was released following incidents on Character.AI in which users had created bot versions of Brianna and Molly, leading to the dissemination of harmful content online.
The rules on online safety, set to take full effect next year, will require social media platforms to safeguard users, especially children, from illicit or harmful content through proactive removal systems and user-friendly reporting tools.
The Molly Rose Foundation (MRF), established by Molly’s family, supported Ofcom’s guidance, emphasizing the potential harm chatbots could cause.
Jonathan Hall KC, the government’s adviser on terrorism legislation, has described some AI chatbot responses as dangerous, prompting the MRF to argue that bot-generated content of this kind is illegal. Ofcom will soon publish guidance on tackling illegal content, including chatbot material.
Ben Packer from law firm Linklaters highlighted the expansive reach of the Online Safety Act, necessitating clear guidelines amid the rise of GenAI tools and chatbots, which were not initially within the law’s scope.
Character.AI said it actively monitors platform safety and has removed the chatbots resembling Ghey, Russell and Game of Thrones characters in response to user reports.
Source: www.theguardian.com