Meta’s Oversight Board is calling for new and improved guidelines regarding nudity across the company’s platforms, specifically citing clauses about breasts that it considers outdated and overly reliant on a binary view of gender.
According to The Hustle, Meta’s content moderation systems have been inconsistent in enforcing nudity policies across platforms, and user reporting has proved to be unreliable given the presence of “bad actors” who may flag posts that do not violate the terms as written.
Meta’s current nudity policy permits bare male chests but forbids “uncovered female nipples” unless they appear in the context of maternity (specifically breastfeeding or childbirth), a medical procedure, or a protest.
Predictably, trans and nonbinary users occupy a gray area that has reportedly left content moderation systems “utterly confused” and resulted in “wrongfully removed content,” including top surgery posts on Instagram that, despite falling into the category of gender-affirming surgery (which, in turn, constitutes acceptable use), were flagged as “sexual solicitation.”
It’s this disparity that led one couple, flagged on Instagram under that same policy while fundraising for top surgery for one of them, to file the appeal that prompted the board to question Meta’s “convoluted and poorly defined” moderation standards.
The board instead requests that Meta conduct a human rights impact assessment to inform an updated moderation strategy and a clearer definition of prohibited content. It also asked for more public-facing transparency about why posts are removed, and for the content moderators themselves to have access to those same guidelines.
It’s a start, but experts also acknowledge that the human element (people who wrongly flag posts for nudity, for whatever reason) could still undermine any changes Meta makes.
AI doesn’t handle this kind of ambiguity particularly well, nor can it reliably identify body parts, often mistaking sand dunes for bums, for example. And Meta’s human moderation team is notoriously overworked, to the point that expecting it to serve as a mitigating factor is unreasonable, especially as more and more moderators are quietly laid off.
Meta, for its part, may take up to two months to consider the board’s request before responding to the finding that its current policies impose, in the board’s words, “greater barriers to expression” on women and LGBTQ+ users.