Without Us, There Are No Social Media Platforms
Julia Kloiber
This post is a report from the Content Moderators Summit in Nairobi, Kenya.
Content warning: The following text includes references to topics such as self-harm, sexual abuse, suicide and violence.
In 2019, Daniel Motaung, then a content moderator at Sama, was in the process of formally filing trade union papers when he was fired. Four years later, on May 1st 2023, 150 moderators from different platforms gather in a conference room at the Mövenpick Hotel in Nairobi to collectively fight for better working conditions. It is the largest gathering of content moderators ever to take place. The date – Labour Day – is well chosen. After arriving and signing in at the registration desk, the participants move on to a wall with three names on it: “Platform Workers Union”, “Content Moderators Union”, and “Content Protectors Union”. Sticky notes mark each participant's favourite name.
Once all participants are seated in the big conference venue, Daniel Motaung takes the stage and reflects on his first unionisation attempt. Back then, he had understood the power of the content moderators, but he miscalculated: the moderators were not united. Big Tech, he asserts, has long understood the power of content moderators; it is the moderators themselves who have not yet grasped the power they possess. If you fight and engage individually, you have no chance. This is a collective problem, and it needs to be addressed collectively. The companies know all too well that the moderators, as individuals, lack the financial leverage to back themselves up. What is required to improve working conditions, Daniel insists, is a bottom-up approach: the unionisation of content moderators and the professionalisation of the job. Exploitation has to end.
Content Moderation Is a Job Like No Other
Too few people know that content moderators are the emergency first responders of social media: they shield the rest of us from graphic and violent content. They watch videos of murders, rapes, suicides, and child sexual abuse – so we don’t have to. They work in dark spaces. Their employers don’t want them to be seen, yet there they are, behind the curtains, with little protection and recognition. One participant says he feels the community has forgotten them – that people forget content moderators are human beings who must be protected like any other human being.
The content moderators in the room talk about the toll the job has taken on their mental health. Some have to take sleeping pills to get at least a little sleep at night, blocking out the graphic images their brains involuntarily replay. Others mention having dark thoughts since they started working in the field. Still others say they are taking antidepressants. Everyone has that one ticket they dealt with that is forever ingrained in their memory. Daniel Motaung has publicly spoken about being diagnosed with PTSD after six months of reviewing the most graphic and violent content imaginable. He says that many of the moderators will never be diagnosed. Some will leave their job, go back home, commit suicide, and we will never hear about them. Their families will never know what drove them to it.
This Job Leaves a Lifelong Impact on People
The well-being counselling that the companies offer as psychological support has been deemed insufficient and useless by the content moderators. One moderator mentions that he sometimes leaves the counselling sessions worse off than when he went in. What keeps some of the content moderators going is the support among themselves – not the counselling the companies offer. The content moderators demand professional psychological support from independent providers.
Something you often hear when speaking to content moderators is that many of them did not know what kind of job they were getting into when they accepted the offer. One moderator mentions that the contract he signed was for a customer relations representative. Now they are moderating content for Big Tech platforms and tools – for TikTok, Facebook and ChatGPT – but none of the moderators in the room is directly employed by the big and wealthy companies behind these platforms and services. They all work for outsourcing companies. Big Tech hides the content moderation work in a maze of outsourcing so as not to be held accountable.
Over half of the moderators in the room are not from Kenya. They come from all over Africa: Somalia, Eritrea, Uganda, South Africa, Nigeria, Burundi, Ethiopia and many other countries. Many left home hoping to escape poverty or war. Their situation is easy to exploit because their residency permits are directly tied to their jobs as content moderators. Many of the moderators' friends and families know nothing about their jobs or about the impact the work has on their lives. If a moderator loses their job, they and their family lose their residency permits – a fact employers are well aware of and take full advantage of. As if this weren't hard enough, strict non-disclosure agreements forbid the moderators from telling friends and family about their work. Some even have to fear for their lives when they return to their authoritarian home countries, because the government believes that social media platforms work for the opposition.
“Big Tech Is Built on the Backs of Broken African Youth”
Two hours into the gathering, one of the content moderators says: “Big Tech is built on the backs of broken African youth.” Many moderators start these jobs young; for some, it is their first job after finishing university. The same person adds that her younger, 24-year-old self would not have taken the job if she had known what she was getting into. The toll on one’s life is too high, and the protections and pay are too low.
The cost of living in Nairobi is high, and the pay for content moderation is nowhere near sufficient. Some moderators mention that they are unable to save any money. They feel like they were deceived into these jobs, brought to foreign countries and enslaved. Another participant mentions that in their home country, if you work in a factory with toxic chemicals, you get life insurance and protection from inhaling fumes. There is no such insurance for the dangerous work the content moderators do. Content moderation is perceived as low-skilled work, even though moderators must interpret complex policies and need deep cultural expertise and language skills. They receive only basic training – too little to count as a proper qualification – which makes them easy to replace. If they speak up or become too sick to continue, they are swiftly replaced with new moderators. This is why Daniel Motaung puts so much weight on professionalising content moderation: making it a job that requires extensive training to ensure a high-quality standard. A company that invests in your training and skill-building is far less likely to kick you out the next day.
The Union Will Stand For You - You Don’t Have to Stand Alone
During this Labour Day morning, many content moderators speak up and share their personal stories. A whole spectrum of emotions is present: anger, courage, sadness and hope. Solidarity connects everything and runs like a thread through the day – solidarity among the moderators, no matter their background or where they come from. They stand united in their fight for better working conditions. All they ask for are fundamental human rights and labour rights. They demand fair pay, professionalisation of their job, adequate social and psychological support, transparency and a good work environment. The day closes with a vote – a vote on the formation of a union. When the moment comes, every hand goes up. It feels like a historic moment, on May 1st, in the conference room of the Mövenpick Hotel in Nairobi, Kenya – at the world’s largest gathering of content moderators. With this vote, the moderators are one step closer to a union that protects their rights and those of future generations of content moderators.
The union will stand for you. You don't have to stand alone.
In the coming weeks, the moderators will work with local partners to get the union off the ground.
Here's some further reading and newspaper coverage of the event:
- Time Magazine – "150 African Workers for ChatGPT, TikTok and Facebook Vote to Unionize at Landmark Nairobi Meeting"
- Nation Africa – "Facebook, Youtube and TikTok Content Moderators In Kenya Form Labour Union"
- Wired Magazine – "Meta’s Gruesome Content Broke Him. Now He Wants It to Pay"
- Nairobi News – "Moderators in case against Meta hold demos at Sama offices"
- Change Petition – "‘Facebook’ and ‘Samasource’- Obey Court Orders And Pay Us!"
For updates on the work of content moderators in Kenya, make sure to follow @ContmoderatorAf on Twitter.
This event is part of the content moderators series initiated by Foxglove, ver.di, Aspiration Tech and SUPERRR Lab.