Children in the UK are now almost inevitably exposed to violent online content, with many first encountering it while still in primary school, according to a media watchdog report.
British children interviewed for the Ofcom investigation reported incidents ranging from videos of local school and street fights shared in group chats to explicit and extreme graphic violence, including gang-related content.
Although children were aware that more extreme content existed on the web, they did not actively seek it out, the report concluded.
In response to the findings, the NSPCC criticized tech platforms for not fulfilling their duty of care to young users.
Rani Govender, a senior policy officer for online child safety at the charity, expressed concern that children are now unintentionally exposed to violent content as part of their everyday online experiences, and emphasized the need for action to protect young people.
The study of families, children, and young people is part of Ofcom’s preparations for enforcing the Online Safety Act, which gives the regulator powers to hold social networks accountable for failing to protect users, especially children.
Gill Whitehead, director of Ofcom’s Online Safety Group, emphasized that children should not come to see harmful content, such as violence or the promotion of self-harm, as an inevitable part of their online lives.
The report highlighted that children mentioned major tech companies like Snapchat, Instagram, and WhatsApp as platforms where they encounter violent content most frequently.
Experts raised concerns that exposure to violent content could desensitize children and normalize violence, potentially influencing their behavior offline.
Some social networks faced criticism for allowing graphic violence, with Twitter (now X) under fire for hosting disturbing content that went viral and spurred outrage.
While some platforms offer tools to help children avoid violent content, there are concerns about their effectiveness and children’s reluctance to report such content due to fear of repercussions.
Algorithmic timelines on platforms like TikTok and Instagram have also contributed to the proliferation of violent content, raising concerns about the impact on children’s mental health.
The Children’s Commissioner for England revealed alarming statistics about the waiting times for mental health support among children, highlighting the urgent need for action to protect young people online.
Snapchat emphasized its zero-tolerance policy towards violent content and reaffirmed its commitment to working with authorities to address such issues, while Meta declined to comment on the report.
Source: www.theguardian.com