Facebook’s traumatized former content moderators are finally receiving a settlement for the psychological damage caused by having to view extremely disturbing content to keep it off the platform.
The settlement is costing the company $52 million. Each of the more than 10,000 content moderators in four states receives a one-time payment of $1,000. Workers who seek psychological help and are diagnosed with conditions related to their jobs will also have that treatment paid for by Facebook, plus up to $50,000 each in additional damages, awarded on a case-by-case basis.
Going forward, Facebook will make one-on-one mental health counseling available to content moderators, will attempt to screen future candidates for emotional resiliency, and will give moderators the ability to stop seeing specific types of reported content.
According to NPR, Steve Williams, a lawyer for the content moderators, said, “We are so pleased that Facebook worked with us to create an unprecedented program to help people performing work that was unimaginable even a few years ago. The harm that can be suffered from this work is real and severe.”
Honestly, this job is not for the faint of heart, to say the least. Like the hardworking but hardly unshakeable detectives on Law & Order: SVU, content moderators see the worst of humanity, and it takes a toll on the psyche. They are only human, after all. For a living, these workers moderated every conceivable (and inconceivable) type of disturbing content posted to a social media platform with more than 2 billion users. Some did it for $28,800 a year.
I wouldn’t last five minutes in this role. It is painful even to read about what these content moderators witnessed for eight hours a day, five days a week. Facebook refuses to admit any wrongdoing as part of the agreement, but come on. Graphic and disturbing content that upset someone enough to report it to Facebook is what these people viewed all day, every day. It sounds almost like a blueprint for creating trauma.
This settlement surely sets a precedent for more class action lawsuits from traumatized content moderators on other social media platforms. It also exposes this business model for what it is: flawed and unsustainable. It’s sickening to think there are people out there posting such heinous acts, and I am grateful the platform removes them.
However, these companies have to come up with a better way. Facebook employs thousands upon thousands of people who are brilliant at computer technology, and Twitter, YouTube, and similar platforms do, too. They need a plan that doesn’t depend on traumatizing these unfortunate souls. I don’t know what that will look like, but with Facebook’s sky-high piles of money and access to so many brilliant minds, they can figure it out. Something’s got to give. Please figure it out.
Joleen Jernigan is an ever-curious writer, grammar nerd, and social media strategist with a background in training, education, and educational publishing. A native Texan, Joleen has traveled extensively, worked in six countries, and holds an MA in Teaching English as a Second Language. She lives in Austin and constantly seeks out the best the city has to offer.
