Mark Zuckerberg may be done with factchecking, but he cannot escape the truth. The world’s third wealthiest man has announced that Meta will replace independent fact-checkers with community notes. This week I went to the AI Action Summit in Paris and told technology executives and policymakers why this is wrong.
Instead of scaling back the programmes that make social media and artificial intelligence more reliable, companies need to invest in and respect the people who moderate social media content and label the data that AI relies on. I know because I was once one of them.
As a mother of two young children, I was recruited from my home country of South Africa to join Kenya’s growing tech sector as a content moderator for Sama, a Facebook subcontractor. For two years, I spent up to 10 hours a day staring at the darkest corners of the internet, so you didn’t have to.
It wasn’t only the content I had to look at that gave me insomnia, anxiety and migraines. Sama had something called AHT, or average handling time: the amount of time we were given to analyse and assess a piece of content. We were timed, and the company measured success in seconds. And we were under constant pressure to get it right.
However traumatic the content, you couldn’t stop. You couldn’t stop for the sake of your mental health. You couldn’t stop to go to the bathroom. You just couldn’t stop. We were told that our client, in our case Facebook, demanded that we keep going.
This was not the life I had imagined when I moved to Nairobi. Isolated from my family, my only real community was my colleagues at Sama and other outsourcing companies. When we got together, our conversations always came back to the same thing: what the work did to us, and how it broke us.
The more I spoke, the more I realised that something bigger than our personal stories was going on. Every content moderator, data annotator and AI worker we met had the same story: impossible targets, deep trauma, and disregard for our wellbeing.
It wasn’t just a Sama problem. It wasn’t just a Facebook problem. It is how the entire tech industry operates: outsourcing the most brutal digital labour and profiting from the pain.
These issues are now the subject of a class action lawsuit in Kenya, brought by 185 former content moderators against Meta, the owner of Facebook, as reported by the Guardian. When approached for comment, Sama said that as of March 2023 it no longer engaged in content moderation and no longer hired content moderators. It added that the Kenyan courts have asked the parties not to speak to the media about the ongoing case.
I left Sama two years ago, but the problem has only got worse since then. We know this because we help data supply chain workers at other outsourcing companies organise through the Nairobi-based African Tech Workers Rising. Workers are still traumatised, and their jobs have become even more intense: content moderators now have to watch videos on multiple screens at once, at two or three times normal speed. Wages and conditions remain poor. Some data workers are paid $0.89 (70p) an hour, while content moderators earn $2.
Things cannot go on like this, but Zuckerberg’s approach of weakening protections is the wrong course. This work should be professionalised. We need standards for workers such as content moderators that recognise the difficulty of our work and respect our rights. That means training and real health and safety protocols, as in other occupations. It means a living wage and reasonable working quotas. It means frameworks that respect our humanity and dignity. And it means having a union.
Meta declined to comment on specific claims while the lawsuit is ongoing, but said it requires outsourcing companies to provide counselling and healthcare that exceed local industry standards, and that moderators can opt out of autoplay functions that would otherwise play videos and photos in a non-stop stream.
We cannot wait for tech companies to resolve this. African Tech Workers Rising is organising for fair wages, mental health protections and professional standards in Kenya and beyond. We do this because AI is not magic. Behind every algorithm are thousands of hidden workers labelling, training and moderating data under precarious conditions. The workforce powering AI remains invisible because many would rather focus on the innovation than on the people who sustain it.
If you believe in a safer, more ethical internet, stand with us: support our organising efforts, and push policymakers to regulate big tech and require AI and social media companies to respect all workers. Change doesn’t come from above; it comes from us. That’s the truth.
-
Sonia Kgomo is an organiser with African Tech Workers Rising, a project supported by UNI Global Union and Kenya’s Communication Workers Union.
-
Do you have an opinion on the issues raised in this article? If you would like to submit a response of up to 300 words by email to be considered for publication in our letters section, please click here.