Over 140 Facebook content moderators have been diagnosed with severe post-traumatic stress disorder as a result of being exposed to distressing social media content, including violent acts, suicides, child abuse, and terrorism.
Dr. Ian Kananya, Head of Mental Health Services at Kenyatta National Hospital in Nairobi, made the diagnoses, finding PTSD, generalized anxiety disorder (GAD), and major depressive disorder (MDD) among moderators who worked long hours at a facility in Kenya contracted by social media companies.
A lawsuit filed against Meta, Facebook’s parent company, and the outsourcing company Samasource Kenya, which employed moderators from across Africa, brought to light the distressing experiences faced by these employees.
Images and videos depicting disturbing content caused some moderators to faint, vomit, scream, or flee their workstations.
The lawsuit sheds light on the toll that moderating such content takes on workers in regions, often impoverished, where social media use is rising. Many of the moderators turned to substance abuse, suffered relationship breakdowns, and felt disconnected from their families because of the nature of their work.
Facebook and other tech giants use content moderators to enforce community standards and train AI systems to do the same, outsourcing this work to countries like Kenya.
A medical report submitted to the court depicted a bleak working environment where moderators were constantly exposed to distressing images in a cold, brightly lit setting.
The majority of the affected moderators were diagnosed with PTSD, GAD, or MDD, and a significant proportion continued to experience severe symptoms even after leaving their roles.
Meta and Samasource declined to comment on the allegations due to the ongoing litigation.
Foxglove, a nonprofit supporting the lawsuit, highlighted the lifelong impact that this work has had on the mental health of the moderators.
The lawsuit aims to hold the companies accountable for the traumatic experiences endured by the moderators in the course of their duties.
Content moderation tasks, though often overlooked, can have significant long-term effects on the mental health of those involved, as seen in this case.
Meta stresses the importance of supporting its content moderators through counseling, training, on-site support, and access to healthcare, while implementing measures to reduce exposure to graphic material.
Source: www.theguardian.com