Content Moderators Describe Traumatizing Work for Facebook and Twitter

Casey Newton of The Verge spoke with content moderators who are employed by Cognizant but working on Facebook’s behalf:

Collectively, the employees described a workplace that is perpetually teetering on the brink of chaos. It is an environment where workers cope by telling dark jokes about committing suicide, then smoking weed during breaks to numb their emotions. It’s a place where employees can be fired for making just a few errors a week — and where those who remain live in fear of the former colleagues who return seeking vengeance.

It’s a place where, in stark contrast to the perks lavished on Facebook employees, team leaders micromanage content moderators’ every bathroom and prayer break; where employees, desperate for a dopamine rush amid the misery, have been found having sex inside stairwells and a room reserved for lactating mothers; where people develop severe anxiety while still in training, and continue to struggle with trauma symptoms long after they leave; and where the counseling that Cognizant offers them ends the moment they quit — or are simply let go.

The moderators told me it’s a place where the conspiracy videos and memes that they see each day gradually lead them to embrace fringe views. One auditor walks the floor promoting the idea that the Earth is flat. A former employee told me he has begun to question certain aspects of the Holocaust. Another former employee, who told me he has mapped every escape route out of his house and sleeps with a gun at his side, said: “I no longer believe 9/11 was a terrorist attack.”

A heads-up: as you might imagine, this is a troubling read.

A couple of years ago, two filmmakers created a short documentary about moderators in India doing the same disturbing job for low wages. These workers see the worst things that people share online, from a post that refers to a person by a racial epithet — allowed, by the way, if the epithet is used in a positive context — to videos of murder and sexual assault. According to Newton, contractors are paid $15 per hour at Cognizant’s facility in Arizona.

Perhaps part of the blame can be assigned to Cognizant. They are, after all, the employer. But it’s entirely possible that they were just one of several bidders for a Facebook contract, and it’s plausible that offering employees a lower wage helped them win it. That’s a crappy reason, but it is a reason.

J. Sack shared their perspective as a former moderator at Twitter. It’s short, but it’s also worth reading. I bet it’s just as hard at YouTube; as of four years ago, three hundred hours of video were uploaded every minute, overwhelming their moderation teams at the time.

A nagging feeling that sits in my brain is that these platforms seem designed to encourage users to share lots of stuff and to share it more often, with the implicit assumption that all the baby pictures we share will drown out a small amount of unsavoury and disturbing stuff that could easily be taken care of by a smaller group of staff. Facebook has over two billion active users. They’re moderating the planet; if that’s not impossible, it’s damn near close, and it is ruining the lives of the people who are doing their best.

These platforms are putting the best and worst of humanity on an equal plane, with disastrous results.