For months, Instagram has faced challenges in persuading parents, advocates, and officials that it is a safe environment for children, despite increasing evidence indicating otherwise. Now, the platform is rolling out another safety feature intended to protect teens. Yet, given its track record, parents remain skeptical.
Beginning this week, all users under 18 will automatically be placed into a 13+ setting, with their feeds restricted to content roughly in line with the U.S. PG-13 movie rating.
However, Instagram’s record of unfulfilled commitments makes this latest content restriction feel like window dressing: the appearance of action without genuine effectiveness.
The company, which reports roughly $100 billion in annual profits, has continued to profit even as advocacy groups have long warned that it exposes minors to inappropriate content and individuals. Meta’s own estimates suggest that about 100,000 children on Facebook and Instagram face online sexual harassment every day. That is especially troubling given that, as of July 2020, internal communications described the company’s measures to prevent child grooming on the platform as, at best, “between zero and negligible.” A lawsuit in New Mexico alleges that Meta’s social networks, including Instagram, have essentially become a haven for child predators. (Meta disputes the core allegations and has called the lawsuit “unfair.”)
Last year, the company finally introduced mandatory teen accounts on Instagram. However, a recent study led by a whistleblower found that 64% of the new safety features designed for teens were ineffective.
Research indicates that 47% of young teen users on Instagram encounter unsafe content, and that 37% of users aged 13 to 15 receive at least one piece of unsafe content or unwanted message each week. That includes “approximately 1 in 7 users viewing self-harm content, unwanted sexual content, discriminatory content, or substance-related content every week.”
“These failures showcase a corporate culture at Meta that prioritizes engagement and profit over safety,” Andy Burrows, CEO of the UK’s Molly Rose Foundation, which campaigns for stronger online safety legislation and took part in the research, told the BBC. A Meta spokesperson countered that the study “misrepresents our commitment to empowering parents and protecting youth, and mischaracterizes the functionality of our safety tools and their use by millions of parents and youth.”
The measures introduced last year also followed a pivotal moment for Meta’s public image. In January 2024, the leaders of the world’s major social media companies were summoned before the U.S. Senate to answer questions about their safety policies, and Meta CEO Mark Zuckerberg apologized to parents whose children had allegedly been harmed by social media.
Despite Instagram’s long struggle to address these concerns, the platform seems to keep putting children at risk and apologizing afterward. On Monday, Reuters reported that Meta’s own internal research found that teens who frequently felt negative about their bodies encountered three times as much “eating disorder-related content” on Instagram as their peers. Alarmingly, technology companies and social media platforms have become so entrenched in everyday life that it is nearly impossible to participate in society without them, particularly for children.
So, what is the solution? First, we must treat online spaces as extensions of the real world, not merely digital counterparts to it. Social media platforms reproduce real-world violence and can cause other tangible harms, putting children at heightened risk.
Lawmakers must require these companies to build safety into their design processes rather than treating it as an afterthought. It is equally vital for parents to educate their children about online safety, just as they would about physical safety in public.
The technology built by these profit-driven companies is pervasive. If we cannot rely on them to safeguard their most vulnerable users, it falls to us to protect ourselves.
Source: www.theguardian.com
