Teenagers are facing new restrictions on TikTok's beauty filters, aimed at addressing concerns that the filters fuel anxiety and erode self-esteem.
In the near future, users under 18 will not be able to use filters that artificially alter their features, such as by enlarging eyes, plumping lips or changing skin color.
Filters such as “Bold Glamour”, which significantly alter a user’s appearance, will be affected, while simple comic filters like bunny ears or dog noses will remain available. TikTok announced the changes during a safety forum at its European headquarters in Dublin.
The effectiveness of these restrictions, however, depends on users declaring their age accurately on the platform.
Beauty filters on TikTok, whether provided by the platform or created by users, have raised concern that they pressure teenagers, especially girls, to conform to unrealistic beauty standards, with negative effects on their emotional wellbeing. Some young users have reported feeling insecure about their real appearance after using them.
TikTok will also enhance its systems to prevent users under 13 from accessing the platform, potentially resulting in the removal of thousands of underage British users. An automated age detection system using machine learning will be piloted by the end of the year.
These actions come in response to stricter regulations on minors’ social media use under the Online Safety Act in the UK. TikTok already deletes millions of underage accounts globally each quarter.
Chloe Setter, TikTok’s head of public policy for child safety, said the company aims to detect and remove underage users faster, acknowledging that this might be inconvenient for some young people.
Ofcom’s report from last December highlighted TikTok’s removal of underage users but raised concerns about how effectively age verification is enforced. A strict 13+ age limit for social media users is due to be enforced from next summer.
The new rules on beauty filters and age verification come as social media platforms anticipate stricter online safety regulation, and they form part of broader efforts to protect young users online.
Other platforms like Roblox and Instagram are also implementing measures to enhance child safety, reflecting a growing concern about the impact of social media on young users.
Andy Burrows, CEO of the Molly Rose Foundation, emphasized the importance of transparent age verification measures and the need to address harmful content promoted on social media platforms.
The NSPCC welcomed measures to protect underage users but stressed the need for comprehensive solutions to ensure age-appropriate experiences for all users.
Source: www.theguardian.com