Meta is enhancing safety measures for teenagers on Instagram by introducing a block on livestreaming, as the social media company extends its under-18 safety measures to Facebook and Messenger.
Users under the age of 16 will now be barred from using Instagram's Live feature unless they have parental authorization. Parental permission will also be required to turn off the feature that blurs images containing suspected nudity in direct messages.
These changes come alongside the expansion of Instagram’s teen account system to Facebook and Messenger. Teen accounts, introduced last year, are automatically set for users under 18, with features like daily time limits set by parents, restrictions on usage at specific times, and monitoring of message exchanges.
Facebook and Messenger teen accounts will initially launch in the US, UK, Australia, and Canada. As on Instagram, users under 16 will need parental permission to change their settings, while 16- and 17-year-olds can make changes independently.
Meta said Instagram teen accounts have reached at least 54 million users globally, with over 90% of 13- to 15-year-olds keeping the default restrictions in place.
The announcements coincide with the UK's enforcement of its online safety laws. Since March, websites and apps covered by the Online Safety Act must take steps to prevent or remove illegal content such as child sexual abuse material, fraud, and terrorist content.
The Act also includes provisions to shield minors from harmful content related to suicide or self-harm, requiring protection for those under 18. Recent reports suggest the law may be softened as part of a UK-US trade deal, sparking backlash from critics.
At the launch of the Instagram restrictions, Nick Clegg, then Meta's president of global affairs, said the goal was to "shift the balance in favour of parents" in using parental controls. The expansion follows Clegg's recent remarks that parents were not making use of the child safety features available to them.
Source: www.theguardian.com