From July, social media and other online platforms must protect children from harmful content or face severe fines. The Online Safety Act requires tech companies to implement these measures by 25 July or, in extreme cases, risk being shut down.
Ofcom, the communications watchdog, has issued more than 40 measures covering websites and apps used by children, from social media to gaming. Services deemed “high-risk” must introduce effective age checks and adjust their algorithms so that users under 18 are shielded from harmful content. Platforms must also remove dangerous material promptly and give children an easy way to report inappropriate content.
Ofcom chief executive Melanie Dawes described the changes as a “reset” for children online, warning that companies that fail to comply will face enforcement action. The new code aims to create a safer online environment, with stricter controls on harmful content and robust age verification.
A social media curfew for children is also under discussion, following concerns about the impact of online platforms on young users. The broader aim is to shield children from exposure to harmful content, including violence, hate speech and online bullying.
Online safety campaigner Ian Russell, who lost his daughter Molly after she was exposed to harmful content online, believes the new code puts tech companies’ interests ahead of safeguarding children. His charity, the Molly Rose Foundation, argues that far more must be done to protect young people from harmful online content and challenges.
Source: www.theguardian.com