Content note: this story mentions self-harm, anxiety and PTSD and discusses some graphic content.
Because people are terrible, awful stuff is uploaded to Facebook every day. From racist rants to countless videos of real-life murders, the platform would be covered with blood, guts and who knows what else on a daily basis if it weren’t for the 15,000 content moderators working around the world to pull it down.
It’s the world’s worst game of whack-a-mole.
A new report from tech news site The Verge interviews several former employees of one of Facebook’s content moderation contractors, Cognizant, and delves into what it’s really like to work at the call centre-like Arizona campus.
And it sounds like a nightmare:
“[A] workplace that is perpetually teetering on the brink of chaos. It is an environment where workers cope by telling dark jokes about committing suicide, then smoke weed during breaks to numb their emotions. It’s a place where employees can be fired for making just a few errors a week — and where those who remain live in fear of the former colleagues who return seeking vengeance.”
Even just the training has apparently seen people develop anxiety problems and secondary PTSD from watching videos of violent murders; one former moderator describes being haunted by a video in which a man his own age, being stabbed to death, cries out for his mother in his final moments.
being a Facebook content moderator (sorry, “process executive”) sounds like a form of psychological torture that we should make illegal https://t.co/JBtazZ2qp6 pic.twitter.com/27nLAWOa3B
— karl polanyeet (@said_mitch) February 25, 2019
So it’s no wonder they’re finding themselves “trauma bonding” with coworkers and sneaking off to have sex anywhere they can: “the bathroom stalls, the stairwells, the parking garage, and the room reserved for lactating mothers”.
“A former moderator named Sara said that the secrecy around their work, coupled with the difficulty of the job, forged strong bonds between employees. “You get really close to your coworkers really quickly,” she says.
“It feels like an emotional connection, when in reality you’re just trauma bonding.”
The sex details might be salacious and wild, but the report is actually incredibly distressing.
Moderators are exposed to so much conspiracy-theory content that they start to be persuaded by the arguments of flat-earthers, 9/11 truthers and Holocaust deniers. Most of them have watched "hundreds of suicides". One former manager is now so paranoid that he sleeps with a gun and sweeps his house for intruders with it every morning.
He says it’s basically impossible to do the job and not end up with some form of PTSD – and the former moderator suing Facebook after she developed the disorder would probably agree.
There are counsellors on site, but the minute moderators can't stand the job any more and quit, they're on their own mental health-wise.
In short: it’s not what you think working for a tech giant is going to be like.
So next time you get an Insta post with a little bit too much sideboob taken down, spare a thought for the faceless moderator somewhere in the world for whom it was one quick decision in a day of hundreds: traumatised, exhausted, and quite possibly high.
Using poorly paid contract labour is extremely profitable for Facebook, and allows it to expand rapidly into new markets.
"The median #Facebook employee earns $240,000 annually in salary, bonuses, and stock options. A content moderator.. on the other hand will earn just $28,800 per year" pic.twitter.com/HHdCTrPwpX
— Michael Kelly (@michaelkelly68) February 25, 2019