Instagram is set to implement a PG-13-style rating system to give parents greater control over what their teens see and do on the platform.
Instagram, which is owned by Meta, will apply guidelines akin to the U.S. “Parental Guidance” movie ratings, established 41 years ago, to all content viewed by teen accounts. Users under 18 will automatically be placed under the 13+ setting and will be able to opt out only with parental consent.
Currently, teen accounts restrict or prohibit sexually suggestive material, graphic images, and promotions for adult content like alcohol and tobacco. The forthcoming PG-13 framework will impose even stricter regulations.
Meta indicated that it will limit visibility on posts promoting “harmful” activities, including strong language, risky stunts, or content featuring marijuana accessories. Additionally, search terms like “alcohol” and “gore” will be blocked, even if misspelled.
Meta commented: “While there are distinctions between movies and social media, our modifications aim to provide a teen experience within a 13+ context that parallels viewing a PG-13 film,” emphasizing the desire to communicate the policy in a framework familiar to parents.
The closest equivalent to PG-13 in British film ratings is 12A. Notably, Instagram’s new rating doesn’t impose a complete ban on nudity, similar to how PG-13/12A films like Titanic include brief nudity that isn’t explicitly sexual. Moderate violence, akin to what is found in Fast & Furious films, will also remain accessible.
This initiative follows a study led by a former Meta whistleblower, which found that 64% of new safety features on Instagram were ineffective. The assessment was conducted by Arturo Bejar, a former Meta senior engineer, alongside academics from New York University, Northeastern University, and the Molly Rose Foundation in the UK. Bejar stated: “Children are not safe on Instagram.” Meta dismissed the findings, asserting that parents have robust tools at their disposal.
Ofcom, the UK communications regulator, urged social media platforms to adopt a “safety-first” strategy and warned that non-compliance could lead to enforcement actions.
Meta announced that the Instagram update will begin in the U.S., UK, Australia, and Canada, with plans to expand to Europe and globally early next year.
Activists raised concerns regarding whether these changes will effectively enhance safety.
Rowan Ferguson, policy manager at the Molly Rose Foundation, remarked: “Despite Meta’s numerous public statements, we have not seen substantial safety improvements for teens, and our recent report indicates that there is still work to be done to shield them from harmful content.”
“These additional updates will need to be evaluated for their effectiveness, which necessitates transparency from Meta for independent testing of safety features.”
Source: www.theguardian.com