Social media giant Facebook is under fire yet again, this time for failing to report widespread illegal drug trafficking.
This is hardly Facebook’s first brush with accusations of negligence. In years past, the company has faced criticism for failing to adequately address scams, abuse recorded via livestream, and even wildlife trafficking. This most recent exposé, however, reveals something far more sinister than diet Tiger King drama: opioids.
According to the Washington Post, a large group of moderators-turned-whistleblowers first flagged the problem when it became clear that the tech giant’s focus was on “graphic content,” not illicit drug sales. Worse, employees who sought to report drug sales to Facebook Pay operators found no efficient channel through which to do so.
This highlights a serious disconnect between Facebook’s moderation team and the inner workings of Facebook’s infrastructure–a disconnect that, left unchecked, could spell disaster for countless victims of online crime.
Interestingly enough, this isn’t even Facebook’s first blunder in the narcotics department. In 2013, several tech firms–Google, eBay, and Craigslist among them–pledged to crack down on sales of OxyContin and its knockoffs on their platforms. Facebook, despite confirmation that OxyContin sales were rampant on its site, declined to join the initiative.
Anyone who has spent substantial time on Facebook knows that, sooner or later, you’re bound to stumble across an illicit deal of some sort, be it drugs or counterfeit Furbies (it’s a thing). The widespread nature of this trade, coupled with Facebook’s deliberate blind eye, is what makes it so concerning.
If tech giants can be complicit in large-scale drug trafficking–arguably one of the less disturbing forms of trafficking found on social media–who can hope to hold them accountable?
Fortunately, the answer to that question is simple: the SEC. Should the SEC find sufficient evidence that Facebook knowingly ignored drug trafficking on its platform while assuring investors otherwise, the company would face hefty fines.
The crux of this issue–that Facebook moderators have neither the time nor the venue to report these infractions–is likely to be swept under the rug in favor of the big, flashing “Facebook Becomes De Facto Cartel” headlines you’ll see in the coming weeks. So let’s address it here.
Employees who moderate Facebook content need not only access to immediate, on-demand counseling, but also the resources to report ALL misconduct discovered on the platform in a timely manner. Affording them anything less is a humanitarian disservice, and to hold Facebook to any lower standard is to admit complicity in that disservice.