Understanding the New UK Online Safety Regulations: How Are Age Checks for Adult Content Implemented?

Online safety for children in the UK is reaching a pivotal moment. From this Friday, social media and other internet platforms must take action to safeguard children or face substantial fines for non-compliance.

This marks a critical test of the Online Safety Act, a landmark law that covers platforms such as Facebook, Instagram, TikTok, YouTube and Google. Here is an overview of the new rules.


What will happen on July 25th?

Companies subject to the law are required to implement safety measures that shield children from harmful content. In particular, all pornography sites must put stringent age verification in place. According to Ofcom, the UK communications regulator, 8% of children aged 8 to 14 visited an online pornography site or app within a single month.

Furthermore, social media platforms and major search engines must prevent children from accessing pornography and content that promotes or encourages suicide, self-harm and eating disorders. This may involve removing certain feeds entirely for younger users. Hundreds of businesses will be affected by these regulations.

Platforms must also minimize the spread of other potentially harmful content, such as material promoting dangerous challenges or substance abuse, and bullying.


Recommended measures include: recommendation algorithms must exclude harmful material from what they suggest to users; all sites and apps must have procedures to take down dangerous content quickly; and children must have a straightforward way to report concerns. Businesses may take a different approach if they believe it meets their child-safety duties just as effectively.
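As a rough illustration of the first of these duties, here is a minimal Python sketch of a recommender filter that drops harm-flagged items before they reach a child's feed. The item structure and flag names are assumptions made for the example; no platform's actual system is shown.

```python
from typing import Iterable

# Illustrative harm categories, drawn from the ones the law names.
HARM_FLAGS = {"pornography", "suicide", "self_harm", "eating_disorder"}

def recommend_for_minor(candidates: Iterable[dict]) -> list[dict]:
    """Return only candidates carrying no harm flag.

    A real recommender would rank items too; the point here is just
    that filtering happens before anything is suggested to the user.
    """
    return [
        item for item in candidates
        if not HARM_FLAGS & set(item.get("flags", []))
    ]

feed = [
    {"id": 1, "flags": []},
    {"id": 2, "flags": ["self_harm"]},
    {"id": 3, "flags": ["sports"]},
]
print(recommend_for_minor(feed))  # items 1 and 3 survive
```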

Services deemed “high risk”, such as major social media platforms, must use “highly effective” age verification to identify users under 18. If a social media platform hosts harmful content without age checks, it remains responsible for ensuring a “positive” experience for younger users.

X says that if it cannot determine whether a user is 18 or over, it defaults them to sensitive-content settings, restricting access to adult material. It is also rolling out age estimation technology and ID verification to confirm that users are not underage. Meta, the parent company of Instagram and Facebook, says it takes a comprehensive approach to age verification, including a teen account feature applied by default to users under 18.

Mark Jones, a partner at the law firm Payne Hicks Beach, said that where the Online Safety Act’s requirements are unclear, the firm works to clarify them for companies.

The Molly Rose Foundation, set up by the family of the British teenager Molly Russell, who took her own life in 2017 after viewing harmful online content, is advocating for further changes, including a ban on dangerous online challenges and a requirement that platforms proactively reduce depression- and body image-related content.


How will age verification be implemented?

Age verification methods for pornography providers endorsed by Ofcom include: estimating a person’s age from a live photo or video (facial age estimation); confirming age via a credit card, bank or mobile network operator; matching a photo ID; and using a “digital identity wallet” that contains proof of age.
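To show how a platform might consume any of these methods uniformly, here is a minimal sketch. The AgeCheckResult shape, its confidence field and the threshold are assumptions for the example, not Ofcom requirements or any provider’s real API.

```python
from dataclasses import dataclass

@dataclass
class AgeCheckResult:
    method: str        # e.g. "facial_estimation", "credit_card", "photo_id"
    over_18: bool      # the only fact the platform needs to keep
    confidence: float  # provider-reported confidence, 0.0 to 1.0

def gate_adult_content(result: AgeCheckResult, min_confidence: float = 0.95) -> bool:
    """Grant access only when a check passed with high confidence.

    Whichever method produced the result, the platform stores the
    yes/no outcome rather than the photo, ID scan or card details.
    """
    return result.over_18 and result.confidence >= min_confidence

print(gate_adult_content(AgeCheckResult("facial_estimation", True, 0.98)))  # True
print(gate_adult_content(AgeCheckResult("credit_card", True, 0.60)))        # False
```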

Ria Moody, a lawyer at Linklaters, commented: “Age verification measures must be highly accurate. Ofcom has indicated that methods which cannot reliably establish that a user is over 18 do not count as highly effective, so platforms cannot rely on them.”


What does this mean in practice?

Pornhub, the most-visited online porn site in the UK, has said it will implement a “regulatory approved age verification method” by Friday, though it has not disclosed which. Another adult site, OnlyFans, already uses facial age estimation software that estimates a user’s age from a live image without storing it, relying on a model trained on millions of other images. The software comes from a company called Yoti, which has also made it available on Instagram.

Last week, Reddit began verifying users’ ages before allowing access to forums and threads containing adult content. The platform uses technology from a company called Persona, which verifies age from an uploaded selfie or a photo of a government-issued ID. Reddit does not retain the photos; it stores only the verification status, so users do not have to repeat the check.
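The “store the status, not the photo” pattern Reddit describes can be sketched in a few lines. The estimator stub and the in-memory store below are hypothetical stand-ins; Persona’s real API is not shown.

```python
import time

verified_users: dict[str, float] = {}  # user_id -> verification timestamp

def estimate_age_from_image(image: bytes) -> int:
    """Stub for a third-party estimator call; returns an age estimate."""
    return 25  # placeholder value so the sketch runs

def record_verification(user_id: str, selfie: bytes) -> None:
    """Run the age check, then keep only the pass/fail outcome.

    The selfie exists only for the duration of this call; nothing
    biometric is written to the store, just a timestamped status.
    """
    if estimate_age_from_image(selfie) >= 18:
        verified_users[user_id] = time.time()

def can_view_adult_content(user_id: str) -> bool:
    return user_id in verified_users

record_verification("user-123", b"...image bytes...")
print(can_view_adult_content("user-123"))  # True
```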


How accurate is facial age verification?

The software allows a website or app to set a “challenge” age (e.g. 20 or 25) above the legal threshold, minimizing the chance that underage users slip through. When Yoti set a challenge age of 20, fewer than 1% of 13- to 17-year-olds were mistakenly verified as adults.
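The logic of a challenge age is simple enough to sketch. The thresholds and the fallback-to-ID step below are illustrative assumptions, not Yoti’s published decision rule.

```python
def check_estimated_age(estimated_age: float, challenge_age: int = 20,
                        legal_age: int = 18) -> str:
    """Apply a 'challenge age' buffer above the legal threshold.

    Setting the bar above 18 makes the estimator's error margin work
    in children's favor: a 17-year-old estimated two years too old
    still falls short of the challenge age.
    """
    if estimated_age >= challenge_age:
        return "allow"           # comfortably above the buffer
    if estimated_age >= legal_age:
        return "fallback_check"  # near the line: ask for photo ID instead
    return "deny"

print(check_estimated_age(23.4))  # allow
print(check_estimated_age(18.7))  # fallback_check
print(check_estimated_age(15.2))  # deny
```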


What other methods are available?

Another direct approach is to require users to present formal identification, such as a passport or driver’s license. Importantly, the ID details need not be stored; they can be used solely to verify access and then discarded.


Will all pornographic sites conduct age checks?

They are expected to, but many smaller sites may try to circumvent the regulations, fearing that checks will drive users away. Industry representatives suggest that those inclined to ignore the rules may wait to see how Ofcom responds to violations before deciding what to do.


How will child protection measures be enforced?

Ofcom has a broad range of penalties it can impose under the law. Companies can be fined up to £18m or 10% of their global revenue, whichever is greater; in Meta’s case, 10% of annual revenue would come to roughly $16bn. Sites and apps can also receive formal warnings, and for severe violations Ofcom may seek a court order restricting the availability of the site or app in the UK.

Moreover, senior managers at technology firms could face up to two years in prison if they are found criminally liable for repeated breaches of their obligations to protect children and for ignoring enforcement notices from Ofcom.

Source: www.theguardian.com

Smartphone Ban Implemented by Coalition of 17 London Secondary Schools

A coalition of schools in London has decided to prohibit smartphone use, reflecting growing concern about children’s reliance on mobile devices.

The heads of 17 of the 20 state secondary schools in Southwark, south London, have united to discourage students from using smartphones outside school premises, in an effort to address the negative impacts of excessive smartphone use.

The borough’s three remaining state secondary schools are working towards adopting the same policy.

The schools aim to educate families and students about the harms associated with smartphone and social media use among young people, including mental health problems, screen-time addiction, disrupted sleep and concentration, exposure to inappropriate content, and an increased risk of theft and robbery.

Mike Baxter, principal of City of London Academy, said: “We have witnessed firsthand the detrimental effects of smartphones and social media on the health and education of children. The harmful behaviors often begin outside of school hours but subsequently surface within the school environment.”

The schools have collectively agreed to confiscate phones used during class. Basic phones without internet access will be returned quickly, while smartphones will be held for a week or until a parent collects them in person.

The new measures will affect more than 13,000 young people in one of London’s top-performing boroughs. The policy applies to students in years 7 to 9 at all of the secondary schools, with some schools extending it to every year group.

Furthermore, a group of secondary school principals are collaborating with primary school leaders in Southwark to establish a borough-wide initiative.

Jessica West, principal at Ark Walworth Academy, emphasized that the inaction of phone companies compelled schools to take action to ensure the well-being of children. They aim to guide families and children in making healthy choices regarding smartphone usage.


Recent reports point to a significant rise in screen time among young children and teenagers: screen time among children increased by 52% between 2020 and 2022, according to the UK House of Commons Education Committee.

Approximately 25% of children and adolescents use smartphones in a manner consistent with behavioral addiction, according to the same report.

The collaborative effort has been praised by Daisy Greenwell, co-founder of Smartphone Free Childhood (SFC), who said: “This united action by headteachers in south London is groundbreaking and truly impactful. It is unprecedented for secondary schools to collectively address this issue. It could change the lives of a generation of children in south London who are at risk of developing mental health problems due to early smartphone use.”

Concern about smartphones and children is escalating rapidly, with SFC expanding into other countries including the US, UAE, South Africa, Australia, New Zealand, Switzerland and Portugal.

In the UK, a growing number of parents are pledging to delay giving their children smartphones until the age of 14. Bristol is a prime example: 80 schools there have established SFC groups, and more than 1,000 parents have signed the pledge.

Greenwell said she was excited by the organic growth of the movement among schools, principals and parents, adding that a long-overdue conversation is finally gaining traction.

Source: www.theguardian.com