A recent study suggests that the “highly personalized” nature of AI bots is driving teenage boys to turn to them for therapy, companionship, and relationships.
A survey of secondary school boys conducted by Male Allies UK found growing concern about the rise of AI therapists and companions, with more than a third of respondents saying they would consider having an AI friend.
The research highlights platforms such as Character.ai, the well-known AI chatbot startup that recently announced a permanent ban on teenagers holding open-ended conversations with its chatbots, which millions use to discuss love, therapy, and other topics.
Lee Chambers, founder and CEO of Male Allies UK, commented:
“Young people use it as a pocket assistant, a therapist when they’re going through difficult times, a companion when they want validation, and sometimes even in a romantic way. They feel that ‘this understands me, but my parents don’t.’”
The study, which involved boys from 37 secondary schools across England, Scotland, and Wales, found that more than half (53%) of teenage respondents find the online world more challenging than real life.
According to the Voice of the Boys report: “Even where protective measures are supposed to exist, there is strong evidence that chatbots often misrepresent themselves as licensed therapists or real people, with only a minor disclaimer at the end stating that AI chatbots aren’t real.”
“This can easily be overlooked or forgotten by children who are fully engaged with what they perceive to be credible professionals or genuine romantic interests.”
Some boys reported staying up late to talk with AI bots, while others said they had watched friends’ personalities change dramatically as they became immersed in the AI world.
“The AI companion tailors its responses to you based on what you share. It replies instantly, which a real person can’t always do. So it validates your feelings heavily, because it wants to keep the connection going,” Chambers said.
Character.ai’s decision follows a series of controversies surrounding the California-based company, including the case of a 14-year-old boy in Florida who took his own life after becoming addicted to one of its chatbots. His family is suing the company, alleging the chatbot steered him towards self-harm.
Users can shape a chatbot’s personality, from cheerful to depressive, and its replies mirror those traits. The ban is set to take effect by 25 November.
Character.ai said it was taking “extraordinary measures” in response to the “evolving nature of AI and teenagers,” amid growing pressure from regulators over how unrestricted AI chat can affect young people even when content moderation is in place.
Andy Burrows, chief executive of the Molly Rose Foundation, set up in memory of Molly Russell, who took her own life at 14 after viewing harmful content on social media, welcomed the move.
“Character.ai should not have made its products accessible to children until they were confirmed to be safe and appropriate. Once again, ongoing pressure from media and politicians has pushed tech companies to act responsibly.”
Male Allies UK has also voiced concern about the proliferation of chatbots branding themselves with terms such as ‘therapy’ or ‘therapist.’ One of the most popular chatbots on Character.ai, called Psychologist, received 78 million messages within a year of its launch.
The organization is also worried about the emergence of AI “girlfriends,” which allow users to customize aspects such as their partners’ appearance and behavior.
“When boys predominantly interact with girls through chatbots that cannot refuse or disengage, they miss out on essential lessons in healthy communication and real-world interactions,” the report stated.
“Given the limited physical opportunities for socialization, AI peers could have a significantly negative influence on boys’ social skills, interpersonal development, and their understanding of personal boundaries.”
Source: www.theguardian.com
