The Molly Rose Foundation, the charity set up in memory of Molly Russell, the British teenager who took her own life in 2017 after viewing suicide and self-harm content on Instagram, has warned that Mark Zuckerberg’s changes to Meta’s moderation policies risk exposing young people to the kind of harmful material Molly saw before her death.
The foundation is urging Ofcom, the UK communications regulator, to take urgent action. Meta announced the changes to how it moderates content in January, ahead of the incoming Trump administration.
In the US, Meta is replacing third-party fact-checking with a “community notes” system that relies on users to flag inaccurate content. Its hateful conduct policies have also been loosened: the revised guidelines permit users to refer to non-binary people as “it” and to make allegations of mental illness on the basis of a person’s gender or sexual orientation.
Meta has said its automated scanning systems will continue to look for content relating to suicide, self-harm and eating disorders, while otherwise focusing proactive moderation on illegal and “high-severity” violations and relying more heavily on user reports.
Despite those assurances, the Molly Rose Foundation remains concerned that content which glorifies or normalises suicide and self-harm, material that can be especially dangerous to people experiencing severe depression, will slip through.
Meta says it will continue to work with regulators to prevent teenagers from encountering harmful content on its platforms.
The foundation points to Meta’s own data showing that only 1% of the suicide and self-harm content it took action on between July and September last year had been reported by users; the rest was detected by the company’s automated systems, which are now being scaled back.
Andy Burrows, the chief executive of the Molly Rose Foundation, says Ofcom must strengthen its rules for tech platforms to ensure children’s safety, and warns that if the regulator fails to act decisively, the prime minister should intervene.
In May last year, Ofcom published draft codes of practice under the Online Safety Act requiring tech companies to protect children online, including by preventing recommendation algorithms from serving harmful content to children and by introducing robust age checks.
A Meta spokesperson said the company continues to identify and remove harmful content through automated systems and its community standards, emphasised its commitment to user safety, and noted that teen accounts in the UK are restricted from seeing certain types of content.
An Ofcom spokesperson said the UK’s online safety laws require platforms to protect children from risks including suicide and self-harm content and to remove such material swiftly. Social media companies, including Meta, must comply with these rules, the spokesperson added, and Ofcom is prepared to use the full extent of its enforcement powers against those that do not.
Source: www.theguardian.com