Roblox steps up parental controls after child grooming warnings

Roblox, a popular children’s gaming platform, is expanding its parental controls to address concerns about child grooming, exploitation, and inappropriate content. The platform will now restrict young users from accessing content that may be violent, crude, or frightening.

From Monday, parents will have access to a dashboard on their own phones where they can see who their children are interacting with, how much time they spend on the platform each day, and the age registered on their account. The feature is intended to give parents an accurate picture of their children’s activity.

Users under nine will be limited to games rated “mild” and will need parental approval to access “moderate” content. The ratings distinguish between unrealistic violence and more realistic portrayals.

Pre-teens will also be barred from chatting with other users outside of games as part of a global tightening of the rules. In the UK, Roblox has become one of the most popular online destinations for children aged 8-12, behind only major platforms such as Google, Instagram, Facebook, and TikTok.

In response to recent reports of concerning content on the site, Roblox is taking measures to increase safety. The company’s automated software already monitors text and multimedia content to enforce community standards.

The platform’s millions of user-generated game worlds are enjoyed by a vast daily audience, but concerns remain about inappropriate content and interactions. Roblox has faced criticism for hosting games with themes of violence, depression, and even racism.

Calls for better protection of children using Roblox have intensified, prompting the company to implement stricter controls and monitoring. The company remains committed to maintaining safety and civility on the platform.

New restrictions on age-appropriate content and parental controls will be implemented in the coming weeks to enhance child protection measures on Roblox.

Source: www.theguardian.com

Apple finally closes loophole allowing children to bypass parental controls

Apple has acknowledged a persistent bug in its parental controls that allowed children to bypass restrictions and access adult content online.

This bug, which enabled kids to evade controls by entering specific nonsensical phrases in Safari’s address bar, was initially reported to the company in 2021.

The report went largely unaddressed until a recent Wall Street Journal story shed light on the issue, prompting Apple to commit to a fix in an upcoming iOS update.

This loophole effectively disabled the Screen Time parental control feature for Safari, allowing children unrestricted access to the internet.

While the bug doesn’t seem to have been widely exploited, critics argue that it reflects Apple’s disregard for parents.

iOS developer Mark Jardine expressed frustration, stating, “As a parent who relies on Screen Time to keep my kids safe, I find the service buggy, with loopholes like this persisting for years.”

When Screen Time was introduced in 2018, it was promoted as a tool for parents to monitor their kids’ device usage and manage their own screen time habits.

Over time, parents have become heavily reliant on Screen Time to control features, apps, and usage times for their children.

Following the release of Screen Time, Apple restricted third-party services that offered similar functionality, citing security concerns. The move drew criticism as anticompetitive.

Five years later, critics argue that Apple’s effective monopoly on parental controls has led it to neglect improving them. Apple blogger Dan Mollen highlighted the concerns of parents disillusioned with Screen Time.

Apple responded by saying, “We take reports of issues with Screen Time seriously and have continually made improvements to give customers the best experience. Our work isn’t done yet, and we will continue to provide updates in future software releases.”

Source: www.theguardian.com

Ofcom investigates TikTok over parental controls information

Ofcom has opened an investigation into TikTok to determine whether it provided inaccurate information about its parental controls.

The UK’s media regulator had asked the Chinese-owned app, along with other online video platforms such as Snapchat, to report on the measures they take to protect children.

While the platforms were generally commended for their efforts, Ofcom said it had “reasons to believe” that TikTok provided “inaccurate” information about its family pairing system.

This feature allows adults to link their account to their child’s account and control settings such as screen time limits.

Ofcom will now investigate whether the company “failed in its obligations” by not taking appropriate action.

TikTok attributed the issue to a technical problem, and said it had made Ofcom aware and would provide the necessary data.

A spokesperson said the platform enforces a minimum age requirement of 13 and noted that the report recognises the significant effort and resources the platform puts into finding and removing underage users.

Ofcom’s report is the first since the regulator issued guidance to video-sharing platforms two years ago on how to protect young users from encountering harmful content.

YouTube and Facebook were not covered in the report because they fall under Irish jurisdiction, a result of EU rules that the UK continues to follow until they are gradually replaced by the Online Safety Bill.

The report, published on Thursday, found that TikTok, Snapchat, and Twitch all met the requirements set out two years ago.

All three platforms categorize and label content to ensure it is age-appropriate.

However, while Snapchat and TikTok offer parental controls, Twitch’s terms of service instead require parents to supervise their children in real time.

Ofcom stated that although steps were being taken to protect young users, “victims can still be victimized while using these platforms”.

Source: news.sky.com