Moderating illicit content on Facebook is an extremely demanding job, and, sadly, it isn’t getting any easier despite increased scrutiny from lawmakers and mental health advocates alike.
Facebook moderators are tasked with addressing everything from non-compliant images and videos (content that, while legal, violates Facebook’s terms of use) to real-time depictions of abuse, crime, and other dark material that would make even the most seasoned Redditor shudder. It’s a thankless job that, according to former moderators, has left many workers with PTSD.
Unfortunately, the dark side of any social media network is that anything can be uploaded, and, in the “right” environment, such as a quasi-community of like-minded users, that same content can flourish until a moderator steps in. No pressure, of course: these contractors only have to wade through an unending tidal wave of content while making split-second decisions about whether each piece is “bad enough” to warrant moderation.
To make matters worse, attempts at AI moderation have been lackluster at best, according to Slate. And even if AI were advanced enough to make the crucial judgment calls Facebook trusts its moderators to make every day, Slate reminds us that “a move to fully automated moderation has long been the nightmare of many human rights and free expression organizations” due to the potential for actual censorship of free speech.
But between the sheer volume of content moderators have to sift through and the traumatic nature of so much of it, it’s no surprise that prominent figures such as NYU’s Paul Barrett are getting involved, and they want change sooner rather than later.
Chief among the aspects of content moderation that require reform is the practice of outsourcing the work, a strategy that creates a “marginalized class of workers,” Barrett argues. Moderators receive low pay, no benefits, and little support, while full-time employees of Facebook and similar social media companies enjoy pay, benefits, and support in spades.
In fact, many of Facebook’s content moderators were, until recently, employed as subcontractors through Cognizant, a consulting company that exited the content moderation business in October 2019. This arrangement often paid workers less than $30,000 per year with few, if any, health benefits.
This lack of health benefits, coupled with the sheer trauma inherent in content moderation, may be what led content moderators to win a $52 million settlement from Facebook this year. Many of these moderators had been diagnosed with PTSD brought on by the stress of the job.
“Content moderation isn’t engineering, or marketing, or inventing cool new products. It’s nitty-gritty, arduous work, which the leaders of social media companies would prefer to hold at arm’s length,” Barrett adds in an interview with the Washington Post. Such distancing, he posits, gives the companies in question “plausible deniability” when harmful content slips through, and Facebook is no exception.
But maintaining distance from moderated content shouldn’t be Facebook’s concern: the NYU report argues that doubling the number of moderators would provide the coverage needed to keep Facebook clean (well, relatively) while giving the moderators themselves a much-needed break.
The plan also calls for training teams in every country, having moderators work in shifts to mitigate the effects of exposure to traumatizing content, and making counseling services immediately available to those who need them rather than funneling requests through the bureaucratic equivalent of a thimble.
Unsurprisingly, moderators have expressed an inability to advocate for themselves on this issue. In response to the Facebook employee walkouts of the past few weeks, they wrote in an open statement on Medium that “We know how important Facebook’s policies are because it’s our job to enforce them…We would walk out with you—if Facebook would allow it.”
Facebook moderators protect all of us from people who seek to expose us to frightening, dehumanizing content–and often advocate for the victims of that content in the process. It’s our responsibility to protect them from unfair working conditions and life-long trauma.
Jack Lloyd has a BA in Creative Writing from Forest Grove's Pacific University; he spends his writing days using his degree to pursue semicolons, freelance writing and editing, Oxford commas, and enough coffee to kill a bear. His infatuation with rain is matched only by his dry sense of humor.
