In Kenya labor dispute, workers who clean up toxic content on Facebook, TikTok and ChatGPT for $3 an hour go to court

Content moderators in the African country have formed a union to defend themselves against Big Tech. Social media giants outsource work to these freelancers, who are tasked with removing violent rhetoric and extremism from digital platforms

Unpaid workers protest outside the Sama offices in Nairobi. @ContmoderatorAf
Carlos Bajo Erro

“Without content moderators, there would be no Facebook. I assure you that if the content moderators weren’t there, you wouldn’t spend a single minute on those platforms… you cannot imagine the amount of toxic, dirty, unbearable content that’s published.”

From Nairobi – the capital of Kenya – this is how Nathan Nkunzimana describes the hidden side of social media. An enormous number of publications get caught in the content moderation filter, making the face of these platforms at least minimally tolerable.

For the past two years, Nkunzimana has been one of the anonymous pawns in that final line of defense, moderating content for much of sub-Saharan Africa. Throughout the workday – and virtually without interruption – a group of people like him have to view and filter out all kinds of aberrations, including extremely violent video footage that gets posted online. The conditions in which they’ve been carrying out this work led them to form a union: African Content Moderators. This is one of the first unions of its kind in the world. It was set up with support from the Kenyan Communications Workers Union (COWU). Ridiculously low salaries, unfair dismissals, exploitation and psychological problems derived from the work were all factors that justified the labor struggle.

This past May 1, more than 150 content moderators and data taggers working for Facebook, ChatGPT and TikTok gathered at a Nairobi hotel to share their experiences and concerns. They also met to take a definitive step: agreeing on a constitution for the union. Benson Okwaro – a veteran trade unionist and general secretary of COWU – acknowledges that “many global companies don’t like unions in their offices,” but notes that local laws do recognize the right of workers to freely organize. However, he highlights the difficulties involved in this, as these large firms don’t have formal headquarters in the countries where they employ workers, as they try to avoid national legislation. “This is why we need to be united and look for joint solutions now,” Okwaro emphasizes.

In February of 2019, Meta announced the opening of the “first Facebook content review center in sub-Saharan Africa,” in Nairobi, as part of its “continued investment” in that part of the African continent and its “commitment to security” on the platform. In the same publication (which was posted on Facebook), the firm assured that it would do all of this “in collaboration with Samasource.” However, it was the American company Sama that formally hired the moderators. Okwaro explains that Sama was a subcontractor. Nkunzimana and the rest of the moderators allege that their main employer was Meta, the parent company of Facebook, Instagram and WhatsApp. They hold this conglomerate responsible for their working conditions.

Daniel Motaung – a South African employee of Sama – already tried to organize his colleagues during the first summer of Meta’s sub-Saharan expansion, with a union calling itself The Alliance (which never came to fruition). This attempt at mobilization was stifled.

Disgruntled workers were called to order, Motaung was immediately suspended and, a few weeks later, he was fired.

This former Facebook moderator unleashed a storm when, in February of 2022, he told his story to Time magazine and revealed the working conditions in the Nairobi offices, where social media content for East Africa was being reviewed: “The work we do is a kind of mental torture.” After this revelation, different legal processes were opened: first, Motaung sued Meta and Sama for labor exploitation and union repression. This subsequently triggered a cascade of lawsuits and media scandals, exposing the low salaries of data taggers hired by the same company to fix the toxicity of ChatGPT.

Following these problems, Sama resigned from its contract with Meta (replaced by Majorel) in January of this year and announced that it was shutting down operations. This triggered new, more discreet mobilizations. Employees denounced their dismissals before the courts, claiming that they had been let go while attempting to establish a union.

In April of 2021, Nathan Nkunzimana started working for Sama as a content moderator. During the average workday, he would review between 1,500 and 2,000 social media publications. Based on a quick analysis, he had to decide whether to delete a post, leave it be, or send it to another point for review. This Burundian citizen – who arrived in Kenya 12 years ago to complete his studies – explains what was included among the content: “Sexual harassment, child abuse, sexual activities... that happens on social media and it happens live. There are terrorist groups that kill people in broad daylight and try to distribute [the footage] publicly on those platforms.” He himself acknowledges that this exposure has caused him social, psychological and personal problems. “There were days when I came home and had the feeling that I didn’t feel anything,” he sighs, having to carry that weight while supporting his wife and three children.

In addition to the harshness of the content, moderators had to face working conditions that aggravated the situation. “You couldn’t talk to anyone about it, because there was a confidentiality clause. You couldn’t even share with your partner what you were going through… the nature of the job [is that it] destroys your personal life. It was frustrating,” he laments. To this was added the pressure of productivity: “If, in one week, you didn’t reach the required metrics, the next [week] you would receive an email warning you that you weren’t meeting the objectives. The programs controlled how much time you spent on each piece of content. You couldn’t take your eyes off the screen all day. It took two or three seconds from the moment you clicked on a publication until the machine placed another one for you to review. It didn’t give you a moment of calm… even taking a minute to go to the bathroom meant a problem with your supervisor.”

The icing on the cake was that, the moment Sama announced the cessation of its activities, the moderators immediately stopped receiving their salaries, despite the fact that they had long denounced the irregularity of payment. “90% of content moderators are foreigners. What we have experienced during the process is very hard… spending three months without receiving a salary, in a country that isn’t yours. You cannot pay the rent, you cannot buy food,” Nkunzimana explains. Cori Crider – co-director of Foxglove, a British organization that is supporting the workers in this process – adds that this situation “forces [the content moderators] to continue accepting insecure jobs to remain in [Kenya], despite the serious risk to their mental health.” Moderators have resorted to crowdfunding, so that they can support their families as the legal fight unfolds.

This Burundian moderator says that “the text content was forwarded to other offices, but the system sent the images and videos to our offices in Africa.” As a result of other complaints in other content moderation centers, certain firms have obtained psychological support for their employees, but these conditions have not been the norm.

“The situation in this content moderation center is especially bad, because the remuneration is usually extremely low: around $2 or $3 per hour. Just 260 moderators worked in the Nairobi hub, responsible for reviewing content for the eastern and southern African region – about 500 million people. The result is an appalling workload,” explains Cori Crider. For his part, Nkunzimana says: “We have the legitimacy to convince these big technology companies that they have a responsibility to regularize our working conditions. A content moderator in Africa is earning $500 or $600 [per month]. It gives you enough to pay the rent and the bare minimum you need to live, because life here [in Nairobi] is very expensive. The same goes for psychological support. When we have demanded it, we’ve received messages [from management that try to] frighten us: ‘If you continue like this, you will end up going home.’” Those who run Sama have declined to answer questions from EL PAÍS.

Siasa Place is a Kenyan youth organization that has also covered the mobilization of content moderators. Nerima Wako-Ojiwa, the director, emphasizes that “some Big Tech companies are taking advantage everywhere, but especially in the Global South. There are a number of gaps when it comes to policies, such as data protection or the working conditions of employees.”

“An unequal fight” with the tech giants

For his part, Nkunzimana makes it clear: “Our request is that our human, constitutional and labor rights be respected – we only ask for that.” He also demands that the platforms take responsibility for the people who moderate content. “We are more than moderators – we are the soldiers who sacrifice to make communities safe. But the companies that manage these communities do not take care of the people who protect them.”

Meanwhile, over the course of several trials, the courts have handed down encouraging rulings for employees. Meta attempted to avoid a legal complaint by arguing that it doesn’t have a corporate residence in Kenya, but the court rejected this argument. Likewise, another judge forced Meta to suspend its contract with Majorel – Sama’s replacement company – until the fate of the former employees is decided. Veteran unionist Benson Okwaro says that “Kenyan laws are very worker-friendly.” And for Foxglove’s Cori Crider, “companies like Facebook, Google and TikTok are some of the most powerful in the world, with almost unlimited resources. It takes incredible bravery to face them with collective power alone.”

Nerima Wako-Ojiwa approaches this situation as being key to a future model for organizing: “There are many questions about unions and labor rights, especially regarding virtual work and national labor laws. These are questions that countries will have to begin to answer. As tech companies continue to grow, the way people work and interact with them will require legal responses. Many large companies evade responsibilities or taxes by resorting to third-party companies.”

This activist insists on the particularities of the African continent: “Definitely, [the fight waged by] content moderators is an unequal fight… but that’s the future of work and it’s here to stay. We have to have decent work for people. These are things that the ministries and private companies will have to sit down at the table to negotiate.”
