Ofcom has initiated an inquiry into TikTok to determine if it provided false information about parental controls.
The UK’s media regulator had asked the Chinese-owned app, along with other online video platforms such as Snapchat, to report on the measures they are taking to protect children.
While the platforms were generally commended for their efforts, Ofcom said it had “reasons to believe” that TikTok had provided “inaccurate” information about its family pairing system.
This feature allows adults to link their account to their child’s account and control settings such as screen time limits.
Ofcom will now investigate whether the company “failed in its obligations” by not taking appropriate action.
TikTok attributed the issue to a technical problem, adding that Ofcom was aware of it and that the necessary data would be provided.
A spokesperson said the platform enforces a minimum age of 13 and pointed out that the report acknowledges the significant effort and resources the platform puts into finding and removing underage users.
Ofcom’s report is the first since the regulator issued guidance two years ago telling video-sharing apps how to protect young users from encountering harmful content.
YouTube and Facebook were not covered in the report as they fall under Irish jurisdiction, a result of EU rules that the UK continues to follow despite leaving the EU; those rules are gradually being replaced by the Online Safety Bill.
An Ofcom report published on Thursday found that TikTok, Snapchat, and Twitch all met the requirements set out two years ago.
All three platforms categorize and label content to ensure it is age-appropriate.
However, while Snapchat and TikTok offer parental controls, Twitch’s terms of service require parents to supervise their children in real time.
Ofcom stated that although steps were being taken to protect young users, “victims can still be victimized while using these platforms”.
Source: news.sky.com