Britain’s terrorism watchdog has criticised Mark Zuckerberg’s Meta for lowering the minimum age for WhatsApp users from 16 to 13, a move he described as “unprecedented” and one expected to expose more teenagers to extremist content.
Jonathan Hall KC expressed concern about younger users gaining access to unregulated content, including terrorism and sexual exploitation material, that Meta may be unable to monitor.
According to Mr Hall, WhatsApp’s use of end-to-end encryption has made it difficult for Meta to remove harmful content, contributing to younger users’ exposure to unregulated material.
He highlighted children’s vulnerability to terrorist content, particularly following a spike in terrorism-related arrests among minors, warning that such exposure could lead vulnerable children to adopt extremist ideologies.
WhatsApp lowered the age limit in the UK and EU in February, bringing the regions into line with its global standard and adding further safeguards.
Despite the platform’s stated intentions, child safety advocates criticised the move, arguing that tech companies must do more to prioritise child protection.
The tension between end-to-end encryption and the spread of illegal content on messaging platforms has fuelled wider debate over online safety regulation, with authorities such as Ofcom exploring ways to address these challenges.
The government has clarified that any intervention by Ofcom requiring content scanning must meet privacy and accuracy standards and be technically feasible.
In a related development, Meta announced plans to introduce end-to-end encryption to Messenger and is expected to extend this feature to Instagram.
Source: www.theguardian.com