Meta Found in Violation of EU Law Due to ‘Ineffective’ Illegal Content Complaint System

The European Commission has stated that Instagram and Facebook failed to comply with EU regulations by not offering users a straightforward method to report illegal content, such as child sexual abuse and terrorism.

According to the EU enforcement agency’s initial findings released on Friday, Meta, the California-based company valued at $1.8 trillion (approximately £1.4 trillion) that operates both platforms, has implemented unnecessary hurdles for users attempting to submit reports.

The report indicated that both platforms employ misleading designs, referred to as “dark patterns,” in their reporting features, which can lead to confusion and discourage users from taking action.

The commission concluded that this behavior constitutes a violation of the company’s obligations under the EU-wide Digital Services Act (DSA), suggesting that “Meta’s systems for reporting and addressing illegal content may not be effective.” Meta has denied any wrongdoing.

The commission remarked, “In the case of Meta, neither Facebook nor Instagram seems to provide user-friendly and easily accessible ‘notification and action’ systems for users to report illegal content like child sexual abuse or terrorist content.”

A senior EU official emphasized that the matter goes beyond illegal content, touching on issues of free speech and “overmoderation.” Facebook has previously faced accusations of “shadowbanning” users regarding sensitive topics such as Palestine.

The existing reporting system is deemed not only ineffective but also “too complex for users to navigate,” ultimately discouraging them from reaching out, the official noted.

Advocates continue to raise concerns about inherent safety issues in some of Meta’s offerings. Recent research released by Meta whistleblower Arturo Bejar revealed that newly introduced safety features on Instagram are largely ineffective and pose a risk to children under 13.

Meta has disputed the report’s findings, asserting that parents have powerful tools at their disposal. The company made teen accounts mandatory on Instagram in September 2024 and recently announced plans to adopt a version of the PG-13 film rating system to give parents more control over their teens’ social media use.

The commission also pointed out that Meta complicates matters for users whose content has been blocked or accounts suspended. The report indicated that the appeal mechanism does not allow users to present explanations or evidence in support of their case, which undermines its efficacy.

The commission stated that streamlining the feedback system could also assist platforms in combating misinformation, citing examples such as an Irish deepfake video in which the leading presidential candidate, Catherine Connolly, appeared to announce her withdrawal from Friday’s election.

This ongoing investigation has been conducted in partnership with Coimisiún na Meán, Ireland’s Digital Services Coordinator, which oversees platform regulations from its EU headquarters in Dublin.

The commission also made preliminary findings indicating that TikTok and Meta are not fulfilling their obligation to provide researchers with adequate access to public data necessary for examining the extent of minors’ exposure to illegal or harmful content. Researchers often encounter incomplete or unreliable data.

The commission emphasized that “granting researchers access to platform data is a crucial transparency obligation under the DSA, as it allows for public oversight regarding the potential effects these platforms have on our physical and mental well-being.”

These initial findings will allow the platforms time to address the commission’s requests. Non-compliance may result in fines of up to 6% of their global annual revenue, along with periodic penalties imposed to ensure adherence.

“Our democracy relies on trust, which means platforms must empower their users, respect their rights, and allow for system oversight,” stated Henna Virkkunen, the European Commission’s executive vice-president for technology sovereignty, security and democracy.

“The DSA has made this a requirement rather than a choice. With today’s action, we are sharing preliminary findings on data access by researchers regarding four platforms. We affirm that platforms are accountable for their services to users and society, as mandated by EU law.”


A spokesperson for Meta stated: “We disagree with any suggestions that we have violated the DSA and are actively engaging with the European Commission on these matters. Since the DSA was implemented, we have made changes to reporting options, appeal processes, and data access tools in the EU, and we are confident that these measures meet EU legal requirements.”

TikTok mentioned that fully sharing data about its platform with researchers is challenging due to restrictions imposed by GDPR data protection regulations.

“TikTok values transparency and appreciates the contributions of researchers to our platform and the industry at large,” a spokesperson elaborated. “We have invested significantly in data sharing, and presently, nearly 1,000 research teams have accessed their data through our research tools.

“While we assess the European Commission’s findings, we observe a direct conflict between DSA requirements and GDPR data protection standards.” The company has urged regulators to “clarify how these obligations should be reconciled.”

Source: www.theguardian.com

Video game companies found in breach of UK industry rules on loot box practices

The UK government’s mandate for technology companies to self-regulate gambling-style loot boxes in video games has come under scrutiny as some developers, who were involved in creating industry guidelines, failed to comply with their own rules.

In the last six months, three companies, including major developer Electronic Arts (EA), have had complaints against them upheld by the advertising regulator for failing to disclose the presence of loot boxes in their games, as stipulated in the guidelines they helped establish.

Experts who filed the complaint noted numerous other breaches but only reported a few to highlight the issue to the Advertising Standards Authority (ASA).

Loot boxes are game features that allow players to spend real or virtual currency to unlock digital envelopes with random rewards like character outfits or weapons.

Despite concerns about the gambling-like risks associated with loot boxes, the Department for Digital, Culture, Media and Sport announced in July 2022 that loot boxes would not be classified as gambling products.

Nadine Dorries, the then culture secretary, expressed concerns about regulating loot boxes due to potential unintended consequences.

Instead of direct regulation, the government established a “technical working group” which included video game and tech companies and introduced 11 principles related to loot boxes in August 2023.

One of the guidelines requires clear disclosure of paid loot boxes in game promotions.

Leon Y Xiao, an expert on loot box regulation, found that the majority of the game ads he analyzed violated the group’s disclosure rules, despite the companies behind them being members of the working group.

Several games, including those from EA, Hutch, and Jagex, were subject to complaints upheld by the ASA for inadequate disclosure of loot boxes.

While EA and Jagex cited human error and lack of space for disclosures, Hutch said it had misunderstood the advertising guidelines.

Xiao stressed that these incidents were not isolated and suggested the industry’s self-regulation efforts were not sufficient.

Don Foster, chair of the parliamentary group Peers for Gambling Reform, said self-regulation had failed and urged government intervention to protect children from loot box-related harm.

The Department for Culture, Media and Sport emphasized the need for video game companies to enhance efforts in safeguarding players from loot box risks.

The UK games industry body Ukie supported the implementation of new guidelines by July 2024 to ensure player protection and promote responsible gaming.

EA affirmed their commitment to loot box disclosures and providing players with information for safe gaming practices.

Jagex and Hutch were contacted for comment by The Guardian.

Source: www.theguardian.com