Porn Company Fined £1 Million by Ofcom for Inadequate Age Verification

A pornography company operating 18 adult websites has been fined £1 million by the regulator Ofcom for inadequate age verification measures, the largest penalty issued so far under the UK’s Online Safety Act.

The Belize-based AVS Group has also faced an additional fine of £50,000 for not adhering to information request protocols.

This is the third time the communications regulator has imposed fines on a company under the UK’s Online Safety Act, which introduced stringent age verification rules in July.

AVS has implemented what it describes as an age verification system, but Ofcom’s assessment deemed it ineffective.

The company also faces a daily penalty of £1,000 if it does not introduce effective age checks within 72 hours, plus £300 a day until it complies with the information request, for up to 60 days.

Since the implementation of the new regulations, Ofcom has initiated investigations into 92 online services, giving priority to sites attracting millions of monthly visitors, considering the potential harm these sites pose.

Oliver Griffiths, Ofcom’s group director for online safety, told BBC Radio 4’s Today programme that the fines are part of a “broader shift” towards platform accountability, which includes the “large scale” rollout of age verification on adult sites and efforts to combat child sexual abuse material.

Mr. Griffiths noted that more than 90 websites, including 83 adult sites, remain under investigation for possible breaches of the law, and said further penalties are expected.

Ofcom also indicated that a significant social media platform, which it did not name, could face formal enforcement action if it fails to improve its compliance. The platform has reportedly submitted inadequate risk assessments, which are needed to gauge the likelihood of illegal content, such as scams and unlawful pornography, appearing in front of its users.

“We reached back out to inform them a re-evaluation was necessary,” Griffiths stated. “Should they fail to treat this matter with due seriousness again, we will promptly escalate to enforcement.”

Furthermore, Ofcom disclosed that it is reviewing major anonymous platforms’ ability to remove illegal terrorist and hate content, including antisemitic and anti-Muslim material, with possible enforcement actions to follow.

The Online Safety Act introduces a raft of new rules designed to shield children and adults from harmful content. Violations can cost companies up to £18 million or 10% of global annual revenue, whichever is greater, and in the most serious cases services can be blocked in the UK.

More than half of the UK’s 100 most visited adult services have introduced age verification since the rules changed in July, as have social media sites such as X, TikTok, and Reddit, according to the regulator. Mr. Griffiths acknowledged a rapid increase in the use of virtual private networks (VPNs), which enable users to bypass regional restrictions on certain sites. He said VPN use peaked at between 600,000 and just over 1 million users when age verification came in, but has since fallen “significantly” to below 1 million.

“There has been some uptick in VPN usage, but it has not been wholesale. Interesting research …indicates that children do not seem to constitute a large proportion,” he noted.

Technology Secretary Liz Kendall stated: “Since the Online Safety Act was enacted, platforms are indeed beginning to accept responsibility for safeguarding children and eradicating illegal and hateful content.”

“Ofcom has the Government’s full backing and is using every power at its disposal to ensure services put user safety first. Keeping children safe online remains a top priority for this Government and for me personally.”

Source: www.theguardian.com

Ofcom Calls on Social Media Platforms to Combat Fraud and Curb Online ‘Pile-Ons’

New guidance urges social media platforms to curb internet “pile-ons” to better protect women and girls online.

Ofcom, Britain’s communications regulator, published guidance on Tuesday aimed at tackling misogynistic abuse, coercive control, and the non-consensual sharing of intimate images, with a focus on reducing the online harassment of women.

The measures call for tech companies to limit the number of replies to posts on platforms such as X, a strategy Ofcom believes will reduce incidents in which individual users are inundated with abusive responses.


Additional measures proposed by Ofcom include utilizing databases of images to prevent the non-consensual sharing of intimate photos—often referred to as ‘revenge porn’.

The regulator advocates “hash matching” technology to help platforms remove such images. The system converts user-reported images or videos into “hashes”, compact digital identifiers, and cross-references them against a database of known illegal content, enabling harmful images to be identified and removed.

These recommendations were put forth under the Online Safety Act (OSA), a significant law designed to shield children and adults from harmful online content.

While the advice is not obligatory, Ofcom is urging social media companies to follow it, announcing plans to release a report in 2027 assessing individual platforms’ responses to the guidelines.

The regulator indicated that the OSA could be reinforced if the recommendations are not acted upon or prove ineffective.

“If their actions fall short, we will consider formally advising the government on necessary enhancements to online safety laws,” Ofcom stated.

Dame Melanie Dawes, Ofcom’s chief executive, said she had heard “shocking” reports of online abuse directed at women and girls.


Melanie Dawes, Ofcom’s chief executive. Photo: Zuma Press Inc/Alamy

“We are sending a definitive message to tech companies to adhere to practical industry guidance that aims to protect women from the genuine online threats they face today,” Dawes stated. “With ongoing support from our campaigners, advocacy groups, and expert partners, we will hold companies accountable and establish new benchmarks for online safety for women and girls in the UK.”

Ofcom’s other recommendations include prompts asking users to reconsider before posting abusive content, “time-outs” for frequent offenders, and measures preventing misogynistic users from earning ad revenue from their posts. The guidance also recommends letting users swiftly block or mute several accounts at once.

These recommendations conclude a process that began in February, when Ofcom ran a consultation that included the hash-matching proposal. More than a dozen of the guidelines, however, such as the “rate limits” on posts, are brand new.

Internet Matters, a nonprofit organization dedicated to children’s online safety, argued that the government should make the guidance mandatory, cautioning that many tech companies might otherwise ignore it. Ofcom is considering whether to make its hash-matching recommendation compulsory.

Rachel Huggins, co-chief executive of Internet Matters, remarked: “We know many companies will disregard this guidance simply because it is not legally binding, leading to continued unacceptable levels of online harm faced by women and girls today.”

Source: www.theguardian.com

Liz Kendall: Ofcom Risks Losing Public Trust Over Online Harm Issues

Technology Secretary Liz Kendall has warned that Britain’s internet regulator, Ofcom, may lose public confidence if it doesn’t take adequate measures to address online harm.

During a conversation with Ofcom’s Chief Executive Melanie Dawes last week, Ms. Kendall expressed her disappointment with the slow enforcement of the Online Safety Act, designed to shield the public from dangers posed by various online platforms, including social media and adult websites.

While Ofcom stated that the delays were beyond its control and that “change is underway,” Ms. Kendall remarked to the Guardian: “If they don’t use their powers, they risk losing public trust.”

The father of Molly Russell, the teenager who took her own life at 14 after encountering harmful online material, has also expressed disillusionment with Ofcom’s leadership.

Ms. Kendall offered no endorsement when asked whether she had faith in the regulator’s leadership.

Her comments come amidst worries that key components of the online safety framework may not be implemented until mid-2027—nearly four years after the Online Safety Act was passed—and that the rapid pace of technological advancement could outstrip government regulations.

Kendall also voiced significant concerns about “AI chatbots” and their influence on children and young adults.

This concern is underscored by cases in the U.S. in which teenagers died by suicide after forming deep emotional bonds with ChatGPT and Character.AI chatbots, treating them as confidants.

“If chatbots are not addressed in the legislation or aren’t adequately regulated—something we are actively working on—they absolutely need to be,” Kendall asserted. “Parents need assurance that their children are safe.”

With Ofcom Chairman Michael Grade set to resign in April, a search for his successor is underway. Ms. Dawes has been CEO for around six years, having served in various roles in public service. Ofcom declined to provide further comment.




Michael Grade will soon step down as chairman of Ofcom. Photo: Leon Neal/Getty Images

On Thursday, the regulator imposed a £50,000 fine on a “nudify” app, which uses AI to “undress” uploaded photos, for failing to prevent minors from accessing pornography.

Ms. Kendall stated that Ofcom is “progressing in the right direction.” This marks the second fine issued by the regulator since the law was enacted over two years ago.

She spoke at the launch of a new AI ‘Growth Zone’ in Cardiff, which aims to draw £10 billion in investment and create 5,000 jobs across various locations, including the Ford Bridgend engine factory and Newport.

The government noted that Microsoft is one of the companies “collaborating with the government,” although Microsoft has not made any new investment commitments.

Ministers also plan to allocate £100 million to support British startups, particularly in designing the chips that power AI, where they believe the UK holds a competitive edge. However, competing with the U.S. chipmaker Nvidia, which recently reported nearly $22 billion in quarterly revenue, may prove challenging.


On Wednesday, Labour MPs accused Microsoft of “defrauding” British taxpayers, as U.S. tech firms raked in at least £1.9 billion from government contracts in the 2024-25 financial year.

When asked for her thoughts, Ms. Kendall praised Microsoft’s AI technology, which is being used to create lesson plans in schools in her constituency, but emphasized the need for better negotiating expertise to secure optimal deals. She also expressed a desire to see more domestic companies involved, especially in the AI sector.

A Microsoft spokesperson clarified that the NHS procures its services through a national pricing framework negotiated by the UK government, which “ensures both transparency and value for money,” stating that the partnership is delivering “tangible benefits.”

“The UK government chooses to distribute its technology budget among various suppliers, and Microsoft is proud to be one of them,” they added.

Source: www.theguardian.com

Father of Teenager Who Died After Viewing Harmful Online Content No Longer Trusts Ofcom

The father of Molly Russell, the British teenager who took her own life after encountering harmful online material, has expressed his lack of confidence in efforts to make the internet safer for children. He is advocating for a leadership change at Britain’s communications regulator.

Ian Russell, whose daughter Molly was only 14 when she died in 2017, criticized Ofcom for its “repeated” failure to grasp the urgency of safeguarding under-18s online and for not enforcing new digital regulations effectively.

“I’ve lost faith in Ofcom’s current leadership,” he shared with the Guardian. “They have consistently shown a lack of urgency regarding this mission and have not been willing to use their authority adequately.”

Mr. Russell’s remarks coincided with a letter from technology secretary Liz Kendall to Ofcom, expressing her “deep concern” over the gradual progress of the Online Safety Act (OSA), a groundbreaking law that lays out safety regulations for social media, search engines, and video platforms.

After his daughter’s death, Mr. Russell became a prominent advocate for internet safety and raised flags with Ofcom chief executive Melanie Dawes last year regarding online suicide forums accessible to UK users.

Ofcom opened an investigation into one such forum after acquiring new regulatory powers under the OSA, and the site voluntarily restricted access for UK users.

However, Mr. Russell said the investigation appeared to have “stalled” until this month, when regulators intensified their scrutiny after it emerged that UK users could still reach the forum via previously undetected “mirror sites.”




Molly Russell died in 2017. Photo: PA

“If Ofcom can’t manage something this clear-cut, it raises questions about their competence in tackling other issues,” Mr. Russell stated.

In response, Ofcom assured Mr. Russell that they were continuously monitoring geo-blocked sites and indicated that a new mirror site had only recently come to their attention.

Mr. Russell said he shared Ms. Kendall’s frustrations over the slow implementation of further components of the OSA, particularly stricter rules for the most influential online platforms. Ofcom attributed the delays to a legal challenge from the Wikimedia Foundation, the organization behind Wikipedia.

The regulator emphasized its “utmost respect” for bereaved families and cited achievements under its stewardship, such as initiating age verification on pornography websites and combating child sexual abuse content.

“We are working diligently to push technology firms to ensure safer online experiences for children and adults in the UK. While progress is ongoing, meaningful changes are occurring,” a spokesperson commented.

The Molly Rose Foundation, established by Molly’s family, has written to the UK government urging ministers to extend legal transparency obligations on public servants to tech companies.

In their letter, they asked the victims minister, Alex Davies-Jones, to expand the Public Office (Accountability) Bill, which introduces a “duty of candour” for public officials.

The bill was prompted by criticism of the police’s handling of evidence in the Hillsborough inquiries; it requires public bodies to proactively assist investigations, including those of coroners’ courts, rather than protect their own interests.

The foundation believes that imposing similar transparency requirements on companies regulated by the OSA would aid in preserving evidence in cases of deaths possibly linked to social media.

The inquest into Molly’s death was delayed by a dispute over the disclosure of evidence.

“This change fundamentally shifts the dynamic between tech companies and their victims, imposing a requirement for transparency and promptness in legal responses,” the letter asserted.

Recent legislative changes have granted coroners enhanced authority under the OSA to request social media usage evidence from tech companies and prohibit them from destroying sensitive data. However, the letter’s signatories contend that stricter measures are necessary.

More than 40 individuals, including members of Survivors for Online Safety and Meta whistleblower Arturo Bejar, have signed the letter.

A government spokesperson indicated that the legal adjustments empower coroners to request further data from tech firms.

“The Online Safety Act will aid coroners in their inquests and assist families in seeking the truth by mandating companies to fully disclose data when there’s a suspected link between a child’s death and social media use,” a spokesperson stated.

“As pledged in our manifesto, we’ve strengthened this by equipping coroners with the authority to mandate data preservation for inquest support. We are committed to taking action and collaborating with families and advocates to ensure protection for families and children.”


In the UK, you can contact the youth suicide charity Papyrus on 0800 068 4141 or email pat@papyrus-uk.org. For support, contact the Samaritans on freephone 116 123 or email jo@samaritans.org or jo@samaritans.ie. In the United States, call or text the 988 Suicide and Crisis Lifeline on 988, or chat at 988lifeline.org. In Australia, the crisis support service Lifeline is 13 11 14. Other international helplines are available at: befrienders.org

Source: www.theguardian.com

YouTube Emerges as Kids’ Top TV Choice, According to Ofcom Findings

Children are turning to YouTube as soon as they power on their TVs, marking a significant shift in video consumption from laptops to living rooms.

According to an extensive study of UK viewing habits by the communications regulator Ofcom, YouTube is the most popular TV platform among Generation Alpha.

Last year, one in five young viewers aged four to 15 went to the platform first. BBC One also ranked among the top five first choices, as did BBC iPlayer.

It is not only the younger demographic watching YouTube on their TVs. In a notable cultural shift, viewers aged 55 and older have nearly doubled their YouTube viewing time, from 6 minutes a day to 11 minutes last year, with a growing share (42%) tuning in through a television.

On average, viewers dedicated 39 minutes a day to YouTube in 2024, positioning the platform as the UK’s second most-watched service after the BBC and ahead of ITV.

For those aged 16 to 24, traditional live television earned only 17 minutes of daily watch time last year, as just 45% tuned in during an average week, down from 48% in 2023.

YouTube has emerged as a serious rival to conventional broadcasters, with half of its trending videos echoing the content typically associated with mainstream broadcasters, including long-form interviews and game shows.

The rise of YouTube poses significant challenges for public service broadcasters like the BBC, ITV, and Channel 4.

These organizations are eager to engage newer, younger demographics; however, YouTube also captures a substantial share of their advertising revenue, which raises concerns that featuring their content on YouTube may diminish viewer time on their own platforms.


A 12-year-old girl takes part in a Joe Wicks morning workout session on YouTube. Photo: Richard Saker/Guardian

“Public service broadcasters recognize this shift and will adapt to meet audiences in online environments, increasing engagement there,” a spokesperson commented. “However, more ambitious efforts are needed to ensure that public service media remains relevant and sustainable in the long term.”

Ofcom has already advised traditional broadcasters, which it has described as an “endangered species,” to collaborate with YouTube for greater content visibility, emphasizing the need for “fair commercial terms.”

Michael Grade, Ofcom’s chairman and a former Channel 4 chief executive, proposed that the government consider new legislation requiring YouTube to feature UK public service broadcasters’ content more prominently.


Channel 4 has been among the pioneers in sharing content on YouTube, arguing that rather than detracting from their own platform, it expands their audience reach. ITV also announced a new agreement with YouTube at the end of last year.

Discussions are underway within the BBC about how much content should be uploaded to YouTube, although significant news coverage will still be featured there. Deborah Turness, the head of BBC News, has told staff that a new head of the AI department will drive a YouTube strategy aimed at younger viewers.

Weekly video viewing times remain stable, with the average person watching 4 hours and 30 minutes at home last year—down just a minute from the previous year.

Traditional broadcast content still dominates viewing, accounting for 56% of the total, although this figure has decreased from 71% in 2018.

Additionally, YouTube has become the leading online audio platform, with 47% of UK adults using it weekly, compared to 36% for Spotify. The rise of podcasts on YouTube signifies its growing influence across the media landscape.

Source: www.theguardian.com

Ofcom Chief: Age Verification Crucial for Kids’ Online Safety

The UK’s main media regulator has vowed to deliver a “significant milestone” in the pursuit of online safety for children, with age verification measures set to impose stricter requirements on major tech firms.

Ofcom’s chief executive, Melanie Dawes, spoke on Sunday about the new framework, which takes effect later this month and marks a pivotal change in how the world’s largest online platforms are regulated.

However, she faces mounting pressure from advocates, many of whom are parents who assert that social media contributed to the deaths of their children, claiming that the forthcoming rules could still permit minors to access harmful content.

Dawes stated to the BBC on Sunday: “This is a considerable moment because the law takes effect at the end of the month.”

“At that point, we expect broader safeguards for children to become operational. We aim for platforms that host material inappropriate for under-18s, such as pornography and content related to suicide and self-harm, to either be removed or to implement robust age checks for those materials.”

She continued: “This is a significant moment for the industry and a critical juncture.”


Melanie Dawes (left) remarked that age checks are “a significant milestone for the industry.” Photo: Jeff Overs/BBC/PA

The regulations set to take effect on July 25th are the latest steps under the online safety law enacted in 2023 by the Conservative government.

The legislation was partially influenced by advocates like Ian Russell, whose 14-year-old daughter, Molly, tragically took her own life in 2017 after being exposed to numerous online resources concerning depression, self-harm, and suicide.

Conservative ministers were criticized in 2022 for removing sections of the bill that would have regulated “legal but harmful” content.

Russell, who previously described the act as “timid,” expressed concerns on Sunday about its enforcement by Ofcom. He noted that the regulator is letting tech companies decide their own methods of age verification, while assessing whether those measures are effective.

Russell commented: “Ofcom’s public relations often portray a narrative in which everything will soon improve. It’s clear that Ofcom must do more than PR; it must act decisively.”

“They are caught between families who have suffered losses like mine and the influence of powerful tech platforms.”


Ian Russell, who campaigns for children’s internet safety, expressed concerns about enforcement of the law. Photo: Joshua Bratt/PA

Russell pressed Dawes to leverage her influence to urge the government for more stringent actions against tech companies.

Some critics say the legislation leaves substantial regulatory gaps, including a lack of action against misinformation.

A committee of lawmakers recently asserted that social media platforms facilitated the spread of misinformation following a murder in Southport last year, contributing to the unrest that ensued. Labour MP Chi Onwurah, chair of the Science and Technology Committee, remarked that the online safety law “is unraveling.”

Dawes has not sought authority to address misinformation, but stated, “If the government chooses to broaden the scope to include misinformation or child addiction, Ofcom would be prepared to implement it.”

Nonetheless, she criticized the BBC over its Glastonbury coverage, questioning why it continued to broadcast footage of Bob Vylan’s performance as the lead singer led anti-Israel chants.

“The BBC needs to act more swiftly. We need to investigate these incidents thoroughly. Otherwise, there’s a genuine risk of losing public trust in the BBC,” she stated.

Source: www.theguardian.com

Critics accuse Ofcom of putting tech companies’ interests ahead of children’s online safety

The communications watchdog has been accused of siding with big tech over the safety of under-18s after England’s children’s commissioner criticized new measures to address online harm. Rachel de Souza warned Ofcom last year that its proposals to protect children under online safety laws were inadequate. She expressed disappointment that the watchdog’s newly published code of practice ignored her concerns, prioritizing the business interests of technology companies over child safety.

De Souza, whose role is to promote and protect children’s rights, highlighted that more than a million young people had told her office that the online world was a significant worry for them. She emphasized the need for stronger protection measures and criticized the lack of improvements in the current code of practice.

The measures proposed by Ofcom include effective age checks on social media platforms, algorithms that filter out harmful content, the swift removal of dangerous material, and an easy way for children to report inappropriate content. Sites and apps covered by the code must make these changes by July 25th or face fines for non-compliance.

Critics, including the Molly Rose Foundation and the online safety campaigner Beeban Kidron, argue that the measures are too cautious and lack specific harm-reduction targets. Ofcom, however, defended its stance, stating that the rules aim to create a safer online environment for children in the UK.

The Duke and Duchess of Sussex have also advocated for stricter online protections for children, calling for measures to reduce harmful content on social media platforms. Technology Secretary Peter Kyle is considering implementing a social media curfew for children to address the negative impacts of excessive screen time.

Overall, the new code of practice aims to protect children from harmful online content, with stringent measures in place for platforms to ensure a safer online experience. Failure to comply with these regulations could result in significant fines or even legal action against high-tech companies and their executives.

Source: www.theguardian.com

Ofcom introduces new regulations for tech companies to ensure online safety for children

From July, social media and other online platforms must block harmful content from reaching children or face severe fines. The online safety law requires tech companies to implement these measures by July 25th or, in extreme cases, risk having their services shut down.

The communications watchdog has issued more than 40 measures covering the websites and apps children use, from social media to games. Services deemed “high-risk” must implement effective age checks and algorithms that protect users under 18 from harmful content. Platforms must also promptly remove dangerous content and give children an easy way to report inappropriate material.

Ofcom CEO Melanie Dawes described these changes as a “reset” for children online, warning that businesses failing to comply risk consequences. The new Ofcom code aims to create a safer online environment, with stricter controls on harmful content and age verification measures.

Additionally, there is discussion about implementing a social media curfew for children, following concerns about the impact of online platforms on young users. Efforts are being made to safeguard children from exposure to harmful content, including violence, hate speech, and online bullying.


Online safety advocate Ian Russell, who tragically lost his daughter to online harm, believes that the new code places too much emphasis on tech companies’ interests rather than safeguarding children. His charity, the Molly Rose Foundation, argues that more needs to be done to protect young people from harmful online content and challenges.

Source: www.theguardian.com

Ofcom demands that social media platforms adhere to online safety laws

Social media platforms are required to take action to comply with UK online safety laws, but they have not yet implemented all the necessary measures to protect children and adults from harmful content, according to the regulator.

Ofcom has issued codes of practice and guidance for tech companies to follow in order to comply with the law, with hefty fines and possible site closures for non-compliance.

Regulators have pointed out that many of the recommended actions have not been taken by the largest and most high-risk platforms.

Jon Higham, Ofcom’s director of online safety policy, stated: “We believe that no company has fully implemented all necessary measures. There is still a lot of work to be done.”

All websites and apps covered by the law, including Facebook, Google, Reddit, and OnlyFans, have three months to assess the risk of illegal content appearing on their platforms. Safety measures must then be implemented to address these risks starting on March 17, with Ofcom monitoring progress.

The law applies to sites and apps that allow user-generated content, as well as large search engines, covering more than 100,000 online services in all. It lists 130 “priority offences,” including child sexual abuse, terrorism, and fraud, which tech companies must tackle through their moderation systems.

Technology Secretary Peter Kyle has described the new regulations and guidelines as the most significant change to online safety policy in history. Tech companies will now be required to proactively remove illegal content, with the risk of heavy fines and, for persistent non-compliance, having their sites blocked in the UK.

Ofcom’s code and guidance include designating a senior executive responsible for compliance, maintaining a well-staffed moderation team to swiftly remove illegal content, and improving algorithms to prevent the spread of harmful material.

Platforms are also expected to provide easy-to-find tools for reporting content, with a confirmation of receipt and timeline for addressing complaints. They should offer users the ability to block accounts, disable comments, and implement automated systems to detect child sexual abuse material.

Child safety campaigners have expressed concerns that Ofcom’s measures do not go far enough, particularly on suicide-related content, and that illegal content may remain on platforms such as WhatsApp where removing it is deemed not technically feasible.

In addition to addressing fraud on social media, platforms will need to establish channels for reporting instances of fraud to law enforcement agencies. They will also work on crisis-response procedures for events like the riots that followed the Southport murders last summer.

Source: www.theguardian.com

Ofcom urged to act after US firm alleges Roblox is a ‘pedophile hellscape’

Child safety activists have urged the UK’s communications watchdog to enforce incoming online safety laws following accusations that a video game company has allowed its platform to become a “hellscape for adult pedophiles.” They are calling for the rules to deliver a “step change.”

Last week, Roblox, a popular gaming platform with 80 million daily users, came under fire for lax safety controls. A US investment firm claimed that its games expose children to grooming, pornography, violent content, and abusive language. The company has denied these claims and stated that safety and civility are fundamental to its operations.

The report highlighted concerning issues on Roblox, including users attempting to groom children through avatars, trading in child sexual abuse material, easily accessible sex games, violent content, and abusive behavior. Despite these concerns, the company insists that millions of users have safe and positive experiences on the platform, and that any safety incident is taken seriously.

Roblox, known for its user-generated content, allows players to create and play their own games with friends. However, child safety campaigners emphasize the need for stricter enforcement of online safety laws to protect young users from harmful content and interactions on platforms like Roblox.

Platforms like Roblox will need to implement measures to protect children from inappropriate content, prevent grooming, and introduce age verification processes to comply with the upcoming legislation. Ofcom, the regulator responsible for enforcing these laws, is expected to have broad enforcement powers to ensure user safety.

In response, a Roblox spokesperson stated that the company is committed to full compliance with the Online Safety Act, engaging in consultations and assessments to align with Ofcom’s guidelines. They look forward to seeing the final code of practice and ensuring a safe online environment for all users.

Source: www.theguardian.com

Ofcom concludes that exposure to violent online content is unavoidable for children in the UK

Children in the UK are now inevitably exposed to violent online content, with many first encountering it while still in primary school, according to a report from the media watchdog.

British children interviewed for the Ofcom study described incidents ranging from videos of local school and street fights shared in group chats to explicit and extreme graphic violence watched online, including gang-related content.

Although children were aware of more extreme content existing on the web, they did not actively seek it out, the report concluded.

In response to the findings, the NSPCC criticized tech platforms for not fulfilling their duty of care to young users.

Rani Govender, a senior policy officer for child safety online at the NSPCC, expressed concern that children are now unintentionally exposed to violent content as part of their everyday online experience, emphasizing the need for action to protect young people.

The study, focusing on families, children, and youth, is part of Ofcom’s preparations for enforcing the Online Safety Act, giving regulators powers to hold social networks accountable for failing to protect users, especially children.

Gill Whitehead, the director of Ofcom’s Online Safety Group, emphasized that children should not have to accept harmful content, such as violence or the promotion of self-harm, as an inevitable part of their online lives.

The report highlighted that children mentioned major tech companies like Snapchat, Instagram, and WhatsApp as platforms where they encounter violent content most frequently.

Experts raised concerns that exposure to violent content could desensitize children and normalize violence, potentially influencing their behavior offline.

Some social networks have faced criticism for permitting graphic violence, with Twitter (now X) under fire for hosting disturbing content that went viral and sparked outrage.

While some platforms offer tools to help children avoid violent content, there are concerns about their effectiveness and children’s reluctance to report such content due to fear of repercussions.

Algorithmic timelines on platforms like TikTok and Instagram have also contributed to the proliferation of violent content, raising concerns about the impact on children’s mental health.

The Children’s Commissioner for England revealed alarming statistics about the waiting times for mental health support among children, highlighting the urgent need for action to protect young people online.

Snapchat emphasized its zero-tolerance policy towards violent content and assured its commitment to working with authorities to address such issues, while Meta declined to comment on the report.

Source: www.theguardian.com

Ofcom investigates TikTok over parental control information


Ofcom has initiated an inquiry into TikTok to determine if it provided false information about parental controls.

The UK’s media regulator had asked the Chinese-owned app, along with other online video platforms such as Snapchat, to report on measures taken to protect children.

While the platforms were generally commended for their efforts, Ofcom stated that it had “reasons to believe” that TikTok provided “inaccurate” information about its family pairing system.

This feature allows adults to link their account to their child’s account and control settings such as screen time limits.

Ofcom will now investigate whether the company “failed in its obligations” by not taking appropriate action.

TikTok attributed the issue to a technical problem, and said it was aware of it and would provide Ofcom with the necessary data.

A spokesperson said the platform enforces a minimum age of 13, and noted that the report acknowledges the significant effort and resources the platform devotes to finding and removing underage users.

Ofcom’s report is its first since it issued guidance to video-sharing apps two years ago on how to protect young users from encountering harmful content.

YouTube and Facebook were not covered in the report as they fall under Irish jurisdiction, a result of EU rules that the UK has continued to follow since leaving the EU, and which are gradually being superseded by the Online Safety Bill.

An Ofcom report published on Thursday found that TikTok, Snapchat, and Twitch all met the requirements set out two years ago.

All three platforms categorize and label content to ensure it is age-appropriate.

However, while Snapchat and TikTok offer parental controls, Twitch requires parents to supervise their children in real time as per its terms of service.

Ofcom stated that although steps were being taken to protect young users, people “can still be victimized while using these platforms”.

Source: news.sky.com