Facebook Content Moderators Suing Meta for Harm Suffered
The job of a content moderator for Facebook is not for the faint of heart. Trevin Brownie’s first day on the job, working out of a subcontractor’s nondescript office in the Kenyan capital Nairobi, is etched in his memory: the first video he saw showed a man taking his own life while a two- or three-year-old child played beside him. For three years, Brownie watched hundreds of violent, hateful videos every day and removed them from Facebook.
Brownie and more than 180 of his former colleagues are now suing Meta, Facebook’s parent company, for the harm they suffered, in the first major class action over content moderation since 2018. The moderators worked in Nairobi for Sama, a Californian company subcontracted by Meta to moderate Facebook content for sub-Saharan Africa between 2019 and 2023. Sama has since announced the closure of its content moderation hub in Nairobi, which employed people from a number of African countries, recruited in particular for their knowledge of local languages.
Today, Brownie is involved in one of three cases against Meta in Kenya related to content moderation. He and another 183 sacked Sama employees are contesting their “unlawful” dismissal and seeking compensation, saying their salaries failed to account for the risks they were exposed to and the damage to their mental health.
Testimonies collected by AFP in April from several former Sama content moderators support Brownie’s account of the job’s horrific nature. The moderators say an “average handling time” of 55 to 65 seconds per video was imposed on them, which works out to between 387 and 458 “tickets” viewed per day. If they worked too slowly, they risked a warning or even termination.
Moderators claimed the support Sama offered through “wellness counsellors” was not up to par: interviews were vague, follow-up was scarce, and they worried about the confidentiality of their exchanges. Despite their traumas, those employed by Sama say they stayed on because they needed the money.
The moderators turned to “coping mechanisms,” with some using drugs such as cannabis, according to those who spoke to AFP. Brownie immersed himself in horror films, saying it was a way to blur reality. “But one of the biggest coping mechanisms was that we are convinced that this job is so important,” he says.
The job of a content moderator may be important, but it comes at a steep cost. “It is damaging and we are sacrificing (ourselves) for our community and for the world… We deserve better treatment,” says one moderator. None of them said they would sign up for the job again. “My personal opinion is that no human should be doing this. This job is not for humans,” says Brownie.