200 Facebook, TikTok & ChatGPT Moderators Unite In Nairobi Meeting

Moderators covering 14 different African languages came together to vote on the formation of the union that aims to address the mistreatment of workers by the tech giants.

Facebook and Meta logos on display. /NEWSWEEK

200 Kenya-based content moderators working for Facebook, YouTube, TikTok and ChatGPT converged in Nairobi on Monday, May 1, for a meeting to push for better working conditions at the social media companies.

The Labour Day meeting saw the group resolve to register a workers' union dubbed the Content Moderators Union, the first of its kind in Africa, and declare that it will welcome content moderators from any major tech firm.

Daniel Motaung, a former content moderator who was fired after trying to lead the unionisation efforts, warned that content moderation is in trouble and that "content moderators are paying for it with their lives". 

Content moderators based in Kenya vote in support of establishing their own union at a meeting held at Mövenpick Hotel in Nairobi on May 1, 2023. /DAILY NATION

"They serve as the first line of defence against harmful content, yet they face hazardous work conditions without hazard pay. Mental health support is severely lacking, job security is scarce, and some moderators feel silenced by strict non-disclosure agreements,” Motaung told the Nation.

The formation of the union comes amid a lawsuit against Meta, Facebook's parent company, filed by 184 Kenya-based Facebook content moderators who were sacked in January and are suing for unfair dismissal.

The tremendous growth of platforms like Facebook and TikTok has led to a series of side effects that have gone unaddressed. One critical issue is the harm that inadequate content moderation inflicts on users, and on entire societies, when it fails.

The incitement of violence online can lead to the loss of lives offline, as seen in the Tigray conflict in Ethiopia. Additionally, the spread of political falsehoods, as was the case during the Azimio la Umoja protests, has the potential to compromise the integrity of elections, even in countries like Kenya.

This problem is worsened by the tech giants' lack of investment in what they refer to as "rest of world" countries. For example, a 2021 investigation by the Wall Street Journal uncovered that Meta's Facebook at the time spent 87 per cent of its misinformation resources on the United States and Western Europe, leaving the rest of the world vulnerable to the spread of false information.

In 2019, content moderators at Sama, the Nairobi-based firm responsible for Facebook's content moderation, attempted to unionise over low pay and poor working conditions. They claim that Sama and Facebook ignored their demands, instead breaking up the union and forcing Motaung, who led the effort, to leave Kenya.

A 2022 Time Magazine exposé that lifted the lid on the exploitation of African Facebook moderators in Nairobi kickstarted a wave of legal action and organising, culminating in two judgments by Kenyan courts against Meta.

Motaung filed the first case early last year, accusing Meta and Sama of exploitation, union-busting and wage theft. A second case filed late last year claimed that Facebook’s moderation failures have caused death and mayhem in the Ethiopian war and across the African continent.

Facebook and its outsourcers, however, retaliated, announcing in January the mass sacking of all 260 moderators at its Nairobi hub.

A third case, filed in March this year, sued Meta and its outsourcers for sacking the entire workforce and blacklisting the laid-off workers. The court thereafter issued an order stopping Meta, which also owns Instagram and WhatsApp, from switching suppliers to Majorel, as the case argues the switch is being carried out in a discriminatory way.

Meta challenged the order, but the court dismissed the challenge and extended the injunction until the legality of the redundancies is determined.

Meta had in January tried to have the case struck out, arguing that the local Employment and Labour Relations Court had no jurisdiction over it because the company is neither based in nor trades in Kenya. However, the court held that Meta can be sued in Kenya and declined to strike the tech giant from the case.

According to the lawsuit, Sama began dismissing employees without justifiable cause and failed to hire replacements despite the acknowledged demand for more content moderators.

The suit further underscored the mental health toll of content moderation work, the inadequate support provided to workers, and inconsistencies in pay, themes that have repeated themselves across content moderation facilities set up by tech giants around the world.

Samasource offices in Nairobi. /TIME MAGAZINE