Meta is now facing a second lawsuit in Africa related to the psychological trauma endured by content moderators tasked with filtering out disturbing material on social media, including depictions of murder, extreme violence, and child sexual abuse.
A lawyer is preparing to take legal action against a contractor of Meta, the parent company of Facebook and Instagram, following discussions with moderators at a facility in Ghana that reportedly employs around 150 individuals.
Moderators at Majorel in Accra report suffering from depression, anxiety, insomnia, and substance abuse, which they link directly to their work reviewing extreme content.
The troubling conditions faced by Ghanaian workers have come to light through a collaborative investigation by the Guardian and the Bureau of Investigative Journalism.
The case follows the diagnosis of more than 140 Facebook content moderators in Kenya with severe post-traumatic stress disorder caused by their exposure to traumatic social media content.
The Kenyan workers were employed by Samasource, an outsourcing company that recruits personnel from across Africa to moderate content for Meta. The Majorel facility at the center of the allegations in Ghana is owned by the French multinational Teleperformance.
One individual, who cannot be identified for legal reasons, disclosed that he attempted suicide because of his work. His contract has since been terminated, and he has returned to his home country.
Facebook and similar large social media platforms often employ numerous content moderators in some of the world’s most impoverished regions, tasked with removing posts that violate community standards and aiding in training AI systems for the same purpose.
Content moderators are required to review distressing and often brutal images and videos to determine whether they should be removed from Meta’s platforms. According to Ghanaian workers, these have included videos of extreme violence, such as people being skinned alive and women being beheaded.
Moderators have claimed that the mental health support provided by the company is inadequate and lacks professional oversight, and they worry that personal disclosures about the impact of their work are being circulated among managers.
Teleperformance contested this claim, asserting that it employs a licensed mental health professional, duly registered with a local regulatory body, who holds a master’s degree in psychology, counseling, or a related mental health field.
The legal action is being prepared by the UK-based nonprofit Foxglove. It would be the second lawsuit brought by African content moderators, following the case from Kenya’s Samasource workers in December.
Foxglove has said it will “immediately investigate these alarming reports of worker mistreatment,” with the goal of using “all available methods, including potential legal action,” to improve working conditions.
Foxglove is collaborating with Agency Seven Seven, a Ghanaian firm, to prepare two potential cases: one brought by a group of moderators alleging psychological harm, and an unfair dismissal claim from the East African moderator whose contract was terminated following his suicide attempt.
Martha Dark, co-executive director at Foxglove, remarked:
“In Ghana, Meta seems to completely disregard the humanity of the crucial safety workers on whom its platforms rely: content moderators.”
Dark said that base wages for content moderators in Accra fall below a living wage and that workers are pressured into overtime. Moderators also face pay deductions for missing performance targets, she added.
Contracts seen by the Guardian show starting wages of around 1,300 Ghanaian cedis per month. This base pay is supplemented by a performance-related bonus scheme, with top earners taking home approximately 4,900 cedis (£243) per month, still well below what living costs in Accra require for a decent standard of living.
A spokesperson for Teleperformance said content moderators receive “a competitive salary and benefits”: a monthly income approximately 10 times the national minimum wage for local moderators, and roughly 16 times the minimum wage for moderators recruited from other countries, plus project allowances and other benefits, all paid automatically and not contingent on performance.
Foxglove researcher Michaela Chen observed that some moderators live in cramped quarters: “Five individuals were packed into a single room.” She also described a secretive culture of surveillance, with managers monitoring workers even during their breaks.
The secrecy extends to the moderators’ work for Meta. Chen said: “Workers spend all day on the Meta platform, adhering to Meta’s standards and using its systems, yet they are constantly reminded, ‘You’re not working for Meta,’ and are prohibited from disclosing anything to anyone.”
Teleperformance said the moderators are housed in one of Accra’s most upmarket and well-known residential and commercial districts.
The spokesperson described the accommodation as “secure and offering high levels of safety,” with air conditioning and recreational facilities including a gym and a pool.
Carla Olympio, a partner at Agency Seven Seven, believes personal injury claims could succeed in the Ghanaian courts and would set a precedent establishing that employee protections extend to psychological as well as physical harm.
“[There exists] a gap in our laws, as they do not adequately address advancements in technology and virtual work,” she said.
Rosa Curling, co-director at Foxglove, has called upon the court to “mandate immediate reforms in the work environment for content moderators,” ensuring proper protective measures and mental health care.
A Teleperformance spokesperson stated: “We are committed to addressing content moderation in Ghana. We fully disclose the type of content moderators may encounter throughout the hiring process, employee contracts, training sessions, and resilience assessments, while actively maintaining a supportive atmosphere for our content moderators.”
Meta commented that the companies it partners with are “contractually obligated to ensure that employees engaged in content reviews on Facebook and Instagram receive adequate support that meets or exceeds industry standards.”
The tech giant further stated it “places great importance on the support provided to content reviewers,” detailing its expectations for counseling, training, and other resources when it contracts with outsourcing companies.
All of the content moderators said they had signed non-disclosure agreements, reportedly required because of the sensitivity of the user information they handle and for their own safety; they are, however, permitted to discuss their experiences with medical professionals and counselors.
Source: www.theguardian.com