Another policy tug-of-war may be emerging in the European Union over Big Tech’s content recommender systems, with the European Commission facing a call from a number of MEPs to rein in profiling-based content feeds (also known as “personalization” engines, which process user data to determine what content to display). The tracking and profiling of users by mainstream platforms to power “personalized” content feeds has long raised concerns about potential harm to individuals and democratic societies, with critics arguing the technology fuels social media addiction and poses mental health risks to vulnerable people. There are also concerns that it undermines social cohesion through a tendency to amplify divisive and polarizing content that can channel individual outrage and anger towards political extremism.
The letter, signed by 17 MEPs from political groups including the S&D, the Left, the Greens, the EPP and Renew Europe, calls for recommender systems on technology platforms to be switched off by default. The idea emerged during negotiations over the bloc’s Digital Services Act (DSA) but was not included in the final regulation because it lacked a democratic majority. Instead, EU lawmakers agreed to transparency measures for recommender systems, along with a requirement that very large platforms (so-called VLOPs) must provide at least one content feed that is not based on profiling. But in their letter, the lawmakers are calling for a blanket default-off for the technology. “Interaction-based recommender systems, especially hyper-personalized systems, pose a serious threat to the public and society as a whole, as they prioritize emotional and extreme content and target individuals who are particularly likely to be provoked,” they wrote. “This insidious cycle exposes users to sensational and dangerous content, prolonging their engagement with the platform in order to maximize ad revenue.”
The letter cites an Amnesty International experiment on TikTok in which the platform’s algorithm exposed users to videos glorifying suicide within just an hour. It also points to Meta’s internal research, which found that 64% of extremist group joins resulted from its recommendation tools, “exacerbating the spread of extremist ideologies.” The MEPs’ call follows draft online safety guidelines for video sharing platforms, announced earlier this month by Ireland’s media regulator, the Coimisiún na Meán, which will be responsible for overseeing the DSA locally when the regulation becomes enforceable for covered services next February. Coimisiún na Meán is currently consulting on guidance proposing that video sharing platforms “take steps to ensure that profiling-based recommendation algorithms are turned off by default.” Publication of the guidance followed an episode of violent civil unrest in Dublin, which the country’s police authorities suggested had been stirred up by far-right “hooligans” fuelled by false information spread on social media and messaging apps. And earlier this week the Irish Council for Civil Liberties (ICCL), which has campaigned on digital rights issues for many years, also called on the European Commission to support the Coimisiún na Meán proposal, publishing its own report arguing that social media algorithms are tearing society apart and calling for personalized feeds to be turned off by default.
In their letter, the MEPs also back the Irish media regulator’s proposal, suggesting it would “effectively” address problems related to recommender systems, which they say tend to promote “emotional and extremist content” that could undermine civic cohesion. The letter also references a recently adopted European Parliament report on the addictive design of online services and consumer protection, which they say highlights the negative impact of recommender systems that involve the profiling of individuals, especially minors, “with the intention of keeping users on the platform as long as possible, thus manipulating them through the artificial amplification of hate, suicide, self-harm, and disinformation.” “We call on the European Commission to follow Ireland’s lead and take decisive action by not only approving this measure under TRIS [Technical Regulations Information System], but also by recommending this measure as a mitigation measure to be taken by very large online platforms [VLOPs] under Article 35(1)(c) of the Digital Services Act, to give citizens meaningful control over their data and online environment,” the MEPs wrote, adding: “The protection of our citizens, especially young people, is of paramount importance, and we believe that the European Commission has an important role to play in ensuring a safe digital environment for everyone. We look forward to your prompt and decisive action on this issue.”
Under TRIS, EU member states must notify the European Commission of draft technical regulations before they are adopted into national law, so that the EU can carry out a legal review to ensure the proposals are consistent with the bloc’s rules, in this case the DSA. The system means that national laws seeking to “gold-plate” EU regulations are unlikely to pass scrutiny. As such, the Irish media commission’s proposal to have video platforms’ recommender systems turned off by default appears to go further than the text of the relevant legislation and may not survive the TRIS process. That said, nothing prevents platforms from taking that step voluntarily, although no company has gone that far yet, and it is clearly not the kind of step that ad-funded, engagement-driven platforms would choose as their commercial default.
When asked, the European Commission declined to comment publicly on the MEPs’ letter (or the ICCL report). Instead, a spokesperson pointed to the “clear” obligations on VLOPs’ recommender systems set out in Article 38 of the DSA, which requires platforms to provide at least one option for each of these systems that is not based on profiling. We were, however, able to discuss the debate over profiling-based feeds with an EU official speaking on background in order to talk more freely. They agreed that platforms could choose to turn off profiling-based recommender systems by default as part of their DSA systemic risk mitigation compliance, but confirmed that no platform has yet taken that step of its own accord. So far we have only seen cases where non-profiling feeds are offered to users as an option, such as on TikTok and Instagram, in order to meet the aforementioned (Article 38) DSA requirement to give users a choice to avoid this kind of content personalization. However, this requires an active opt-out by the user. Setting a feed to non-profiling by default would clearly be a stronger type of content regulation, as it would require no user action to take effect. The official confirmed that the European Commission, in its capacity as enforcer of the DSA on VLOPs, is looking at recommender systems, including in the formal proceedings opened against X earlier this week. Recommender systems have also been a focus of some of the formal requests for information the Commission has sent to VLOPs, including one to Instagram focused on child safety risks, they said. And they agreed that the EU could use its enforcement powers under the regulation to force large platforms to turn off personalized feeds by default. However, they indicated that the Commission would only take such action if it determined it would be effective at mitigating a specific risk. The official noted that multiple types of profiling-based content feeds are in play, even on a single platform, and emphasized that each must be considered in context.
More generally, they appealed for “nuance” in the debate over the risks of recommender systems, suggesting the Commission’s approach will be to assess concerns case by case and to push for data-driven policy interventions on VLOPs rather than blanket measures. After all, this is a set of platforms diverse enough to span video-sharing and social media giants as well as retail and information services and, most recently, porn sites. The risk of an enforcement decision being unpicked by legal challenge in the absence of solid evidence to support it is clearly a concern for the Commission. The official also suggested it wants to gather more information before deciding whether to recommend such measures.
Source: techcrunch.com