Mark Zuckerberg’s recent changes to content moderation were announced hastily and without regard for their human rights impact, according to the Oversight Board of Meta, the social media company.
The Board’s criticism came in a ruling that faulted the owner of Facebook and Instagram for leaving up three posts containing anti-Muslim and anti-immigrant content during riots in the UK last summer.
The Board also raised concerns about the changes announced in January, which included removing fact checkers in the US, reducing “censorship” on Meta’s platforms and encouraging more political content.
In its first official statement on those changes, the Board published a binding decision ordering the removal of Meta content, and urged the company to act cautiously and assess the human rights impact of the new policies.
The Board said Meta’s policy changes were made without following its regular procedures and without transparency about any human rights due diligence, and it stressed the importance of upholding the UN guiding principles on business and human rights.
The Board’s criticism extended to Meta’s handling of content related to the UK riots, and it urged the company to act swiftly to remove harmful posts that incite violence.
It also highlighted changes to Meta’s guidance that allow users to target protected groups with harmful language, and recommended that the company assess the effectiveness of community-based content monitoring following the removal of US fact checkers.
Responding to the Board’s decision, a Meta spokesperson said the company was committed to complying with the Board’s rulings and to addressing past shortcomings.
Meta pledged to respond to the Board’s broader recommendations within the specified timeframe, and acknowledged the need to consider the impact of policy updates on vulnerable groups.
Source: www.theguardian.com