The European Commission has stated that Instagram and Facebook failed to comply with EU regulations by not offering users a straightforward method to report illegal content, such as child sexual abuse and terrorism.
According to the EU enforcement agency’s initial findings released on Friday, Meta, the California-based company valued at $1.8 trillion (approximately £1.4 trillion) that operates both platforms, has implemented unnecessary hurdles for users attempting to submit reports.
The report indicated that both platforms employ misleading designs, referred to as “dark patterns,” in their reporting features, which can lead to confusion and discourage users from taking action.
The commission concluded that this behavior constitutes a violation of the company’s obligations under the EU-wide Digital Services Act (DSA), suggesting that “Meta’s systems for reporting and addressing illegal content may not be effective.” Meta has denied any wrongdoing.
The commission remarked, “In the case of Meta, neither Facebook nor Instagram seems to provide user-friendly and easily accessible ‘notification and action’ systems for users to report illegal content like child sexual abuse or terrorist content.”
A senior EU official emphasized that the matter goes beyond illegal content, touching on issues of free speech and “overmoderation.” Facebook has previously faced accusations of “shadowbanning” users regarding sensitive topics such as Palestine.
The existing reporting system is deemed not only ineffective but also “too complex for users to navigate,” ultimately discouraging them from reaching out, the official noted.
Advocates continue to raise concerns about inherent safety issues in some of Meta’s offerings. Recent research released by Meta whistleblower Arturo Bejar revealed that newly introduced safety features on Instagram are largely ineffective and pose a risk to children under 13.
Meta has refuted the report’s implications, asserting that parents have powerful tools at their disposal. The company made teen accounts mandatory on Instagram in September 2024 and recently announced plans to adopt a version of the PG-13 film rating system to give parents more control over their teens’ social media engagement.
The commission also pointed out that Meta complicates matters for users whose content has been blocked or accounts suspended. The report indicated that the appeal mechanism does not allow users to present explanations or evidence in support of their case, which undermines its efficacy.
The commission stated that streamlining the feedback system could also assist platforms in combating misinformation, citing examples such as an Irish deepfake video that falsely claimed the leading presidential candidate, Catherine Connolly, would withdraw from Friday’s election.
This ongoing investigation has been conducted in partnership with Coimisiún na Meán, Ireland’s Digital Services Coordinator, which regulates platforms with EU headquarters in Dublin.
The commission also made preliminary findings indicating that TikTok and Meta are not fulfilling their obligation to provide researchers with adequate access to public data necessary for examining the extent of minors’ exposure to illegal or harmful content. Researchers often encounter incomplete or unreliable data.
The commission emphasized that “granting researchers access to platform data is a crucial transparency obligation under the DSA, as it allows for public oversight regarding the potential effects these platforms have on our physical and mental well-being.”
These initial findings will allow the platforms time to address the commission’s requests. Non-compliance may result in fines of up to 6% of their global annual revenue, along with periodic penalties imposed to ensure adherence.
“Our democracy relies on trust, which means platforms must empower their users, respect their rights, and allow for system oversight,” stated Henna Virkkunen, the commission’s executive vice-president for tech sovereignty, security and democracy.
“The DSA has made this a requirement rather than a choice. With today’s action, we are sharing preliminary findings on data access by researchers regarding four platforms. We affirm that platforms are accountable for their services to users and society, as mandated by EU law.”
A spokesperson for Meta stated: “We disagree with any suggestions that we have violated the DSA and are actively engaging with the European Commission on these matters. Since the DSA was implemented, we have made changes to reporting options, appeal processes, and data access tools in the EU, and we are confident that these measures meet EU legal requirements.”
TikTok mentioned that fully sharing data about its platform with researchers is challenging due to restrictions imposed by GDPR data protection regulations.
“TikTok values transparency and appreciates the contributions of researchers to our platform and the industry at large,” a spokesperson elaborated. “We have invested significantly in data sharing, and presently, nearly 1,000 research teams have accessed their data through our research tools.
“While we assess the European Commission’s findings, we observe a direct conflict between DSA requirements and GDPR data protection standards.” The company has urged regulators to “clarify how these obligations should be reconciled.”
Source: www.theguardian.com