Meta permitted pornographic advertisements that breach content moderation standards

Meta owns social media platforms such as Facebook and Instagram

JRdes / Shutterstock

In 2024, Meta allowed more than 3,300 pornographic ads, many featuring AI-generated content, to run on social media platforms such as Facebook and Instagram.

That is according to a report by AI Forensics, a European non-profit organization that researches technology platform algorithms. The researchers also found inconsistencies in Meta’s content moderation by reuploading many of the same explicit images as standard Instagram and Facebook posts. Unlike the ads, these posts were found to violate Meta’s community standards and were quickly removed.

“I am disappointed and not surprised by this report, as my research has already revealed double standards in content moderation, particularly around sexual content,” says Carolina Are at the Centre for Digital Citizenship at Northumbria University, UK.

The AI Forensics report focuses on a small sample of ads targeting the European Union. The explicit Meta-approved ads primarily targeted middle-aged and older men, promoting “shady sexual enhancement products” and “dating sites,” and their combined reach was found to exceed 8.2 million impressions.

This permissiveness reflects a widespread double standard in content moderation, Are says. She says tech platforms often block content from “women, femme-presenting people, and LGBTQIA+ users.” That double standard extends to men’s and women’s sexual health: “Examples include lingerie and period-related ads being [removed] by Meta while ads for Viagra are approved,” she says.

In addition to discovering AI-generated images within ads, the AI Forensics team also discovered audio deepfakes. For example, some ads for sex-enhancing drugs featured the digitally manipulated voice of actor Vincent Cassel superimposed over pornographic visuals.

“Meta prohibits the display of nudity or sexual activity in ads or organic posts on our platform, and we remove violating content that is shared with us,” a Meta spokesperson said. “Bad actors are constantly evolving their tactics to evade enforcement, which is why we continue to invest in the best tools and technology to identify and remove violating content.”

The report comes at the same time that Meta CEO Mark Zuckerberg announced he would be eliminating the fact-checking team in favor of crowd-sourced community notes.

“If you really want to sound dystopian, and I think there is reason to at this point given Zuckerberg’s latest decision to eliminate fact-checkers, you could even say that Meta is quickly stripping its users of agency while taking money from questionable ads,” Are says.


Source: www.newscientist.com

Charity warns that UK children are facing a relentless onslaught of gambling advertisements and images online

New research has discovered that despite restrictions on advertising campaigns targeting young people, children are being inundated with gambling promotions and content that resembles gambling while browsing the internet.

The study, commissioned by the charity GambleAware and funded by donations from gambling companies, warns that the line between gambling advertising and online casino-style games has become blurred, drawing children into online gambling without awareness of the associated risks. It also warns that gambling advertisements featuring cartoon graphics can strongly appeal to children.

GambleAware is recommending new regulations to limit the exposure of young people to advertising. Research conducted by the charity revealed that children struggle to differentiate between actual gambling products and gambling-like content, such as mobile games with in-app purchases.

Zoe Osmond, CEO of GambleAware, emphasized the need for immediate action to protect children from being exposed to gambling ads and content, stating, “This research demonstrates that gambling content has become a part of many children’s lives.”

GambleAware chief executive Zoe Osmond said urgent action on internet promotions was needed to protect children. Photograph: Doug Peters/PA

The report also points out that excessive engagement in online games with gambling elements, like loot boxes bought with virtual or real money, can fall under a broader definition of gambling. It calls for stricter regulation on platforms offering such games to children.

Businesses are cautioned against using cartoon characters in gambling promotions, as they may appeal to children. However, there is no outright ban on using such characters. Online casino 32Red, for instance, recently advertised its Fat Frog online slot game on social media with a cartoon frog theme.

Dr. Raffaello Rossi, a marketing lecturer focused on the impact of gambling advertising on youth, criticized regulators for not acting swiftly enough to address the proliferation of online promotions enticing children. He called for new advertising codes to regulate social media promotions effectively.


The Betting and Gaming Council said its members strictly verify ages for all products and have implemented new age-restriction rules for social media advertising.

Recent data from the Gambling Commission indicates that young people are now less exposed to gambling ads than in previous years, though no direct link between advertising and the development of problem gambling has been established.

The Advertising Standards Authority (ASA) stated that it regulates gambling advertising to safeguard children and monitors online gambling ads through various tools and methods.

The Department for Culture, Media and Sport affirmed its focus on monitoring new forms of gambling and gambling-like products, including social casino games, to ensure appropriate regulations are in place.

Kindred Group, the owner of the 32Red brand, was approached for comment.

Source: www.theguardian.com