Facebook group admins will now see “auto-reported by Facebook”

Last month, quietly and without fanfare, Facebook began using an algorithm to automatically report suspicious content within a group.

Snitches get stitches

If you use Facebook, and especially if you administer a Facebook Group, you are probably well aware that users can flag one another’s posts for inappropriate or offensive content. When a member of a group flags someone else’s post, the post is sent to the group administrator, who then assesses the post and decides whether it needs to be removed.

Pretty simple, right? But last month, quietly and without fanfare, Facebook began using an algorithm to automatically report suspicious content before it gets published. If you administer a Group, you may have already received posts “auto-reported by Facebook,” with the option to publish the post or delete it, as well as the option to block the group member who made the offending post. Facebook’s watchdog robot automatically detects inappropriate content, such as pornography, then reports it to group administrators.
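To make the flow concrete, here is a minimal sketch in Python of the moderation queue described above: member reports and auto-reports land in the same admin queue, and the admin can publish the post, delete it, and optionally block the author. Every name in it is hypothetical; it only illustrates the workflow, not Facebook’s actual code or API.

from dataclasses import dataclass, field

# Hypothetical model of the workflow described above -- not Facebook's code
# or API. A post gets flagged (by a member or by the auto-reporter), lands
# in the admin queue, and the admin publishes it, deletes it, and can also
# block the author.

@dataclass
class FlaggedPost:
    author: str
    text: str
    reported_by: str  # e.g. "member" or "auto-reported by Facebook"

@dataclass
class GroupQueue:
    pending: list = field(default_factory=list)
    published: list = field(default_factory=list)
    blocked_members: set = field(default_factory=set)

    def flag(self, post: FlaggedPost) -> None:
        """Member reports and automatic reports end up in the same queue."""
        self.pending.append(post)

    def review(self, post: FlaggedPost, approve: bool, block_author: bool = False) -> None:
        """Admin decision: publish or delete, optionally blocking the member."""
        self.pending.remove(post)
        if approve:
            self.published.append(post)
        if block_author:
            self.blocked_members.add(post.author)

# Example: a false positive gets approved by the admin.
queue = GroupQueue()
queue.flag(FlaggedPost("some_member", "Selling a TV stand, $40", "auto-reported by Facebook"))
queue.review(queue.pending[0], approve=True)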


Still in need of some tuning

At this time, it appears that the auto-reporting robot only works in Groups, not on the site in general or on Pages, but that could change. Some administrators are celebrating the change, saying it will save them lots of time moderating, since they won’t have to go hunting for offensive posts. FBtutorial.com, which reported the change after receiving auto-reported posts in its own Facebook Group, says that auto-reporting is “definitely well needed in Facebook Groups and is a time-saver.”

Others, however, complain that the auto-report feature actually wastes their time, because Facebook often reports posts that are completely acceptable. After receiving many questions and complaints, Facebook added an explanation of the feature to its Help page and also attempted to fine-tune the algorithm.

Facebook still has the right to remove content

Facebook says that if administrators continue to approve similar auto-reported posts, the robot will eventually learn not to flag such posts. However, any posts approved by administrators must still comply with Facebook’s Community Standards, and the company retains the right to remove offensive content, even if group administrators approved it.

Note from the Editor: We’ve seen several “auto-reported by Facebook” posts in various groups we run on Facebook, but many aren’t NSFW; they’re more in line with posts we’ve deleted in the past after members reported them to us as admins. It is our belief that the algorithm is smarter than just “oh, that’s porn,” and is based more on the administrators’ history of deleted and/or reported posts. This feature could save us a lot of time in our rowdier and more heavily populated groups.
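If the editor’s hunch is right, the mechanism would behave something like a per-group feedback loop, where repeated admin approvals push similar posts below the reporting threshold and deletions push them back above it. The toy Python below only illustrates that idea; the features, weights, and threshold are invented and do not reflect Facebook’s actual algorithm.

from collections import defaultdict

# Toy illustration of the feedback loop described above: the threshold,
# features, and weights are all invented, and nothing here reflects
# Facebook's actual algorithm. The point is only that repeated admin
# approvals push similar posts below the reporting bar, while deletions
# push them back above it.

FLAG_THRESHOLD = 0.5

def features(text):
    """Stand-in feature extractor: just the lowercase words in the post."""
    return set(text.lower().split())

class AutoReporter:
    def __init__(self):
        # Per-feature suspicion score, learned separately for each group.
        self.weights = defaultdict(lambda: 0.6)

    def should_flag(self, text):
        feats = features(text)
        score = sum(self.weights[f] for f in feats) / max(len(feats), 1)
        return score >= FLAG_THRESHOLD

    def admin_feedback(self, text, approved):
        """Approving lowers the score of matching features; deleting raises it."""
        delta = -0.1 if approved else 0.1
        for f in features(text):
            self.weights[f] = min(1.0, max(0.0, self.weights[f] + delta))

bot = AutoReporter()
post = "selling a tv stand"
print(bot.should_flag(post))             # True: flagged at first
for _ in range(3):
    bot.admin_feedback(post, approved=True)  # the admin keeps approving
print(bot.should_flag(post))             # False: similar posts stop being flagged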

What about your Facebook Group? Have you received any auto-reported posts? Has it saved you time, or is it a nuisance?

#AutoReport

Ellen Vessels, a Staff Writer at The American Genius, is respected for their wide range of work, with a focus on generational marketing and business trends. Ellen is also a performance artist when not writing, and has a passion for sustainability, social justice, and the arts.

10 Comments

  1. FB Tutorial

    May 11, 2016 at 5:40 pm

    We hope more media outlets cover Facebook’s new auto reporting in Groups, while also highlighting the pros and cons associated with it.

    — FBtutorial.com

    • Lani Rosales

      May 12, 2016 at 9:43 am

      You guys were the first, and it appears we’re the only others. We’re baffled as to why this has gone completely overlooked. It speaks to Facebook’s shift toward automation, and as you noted, we MUST examine whether that is positive or negative.

      • FB Tutorial

        May 17, 2016 at 4:53 pm

        @Lani — you are absolutely correct on the fact that (major) media outlets are not really reporting this new automated policing by Facebook, especially the new controversial robot in groups.

        With Facebook being at the forefront of “artificial intelligence”, it is our opinion that many tasks on Facebook will soon be automated. We’ll keep reporting our discoveries, so do check our website often for updates.

        Cheers,
        FBtutorial.com team

  2. BRUCE SMITH

    May 17, 2016 at 9:58 pm

    I am an Admin for several groups. We are having major problems with auto-report not allowing even face photos. When I try to approve an auto-reported photo, it never posts. For example, a girl in modest shorts and a top was auto-reported to me; I try to approve it over and over and it still doesn’t post. I wish we could TURN OFF AUTO-REPORT UNTIL IT WORKS CORRECTLY.

  3. Pingback: Grytics weekly review (10 June, 2016) - Grytics.com

  4. Christina

    August 23, 2016 at 6:13 pm

    I’m an administrator of a buy/sell type group. I keep getting notifications of posts that have been auto-reported, and they are far from offensive or inappropriate. One was someone selling a TV stand just a few minutes ago… it’s really annoying, and there is no explanation as to what about the posts is against the rules…

  5. Tamara

    September 8, 2016 at 9:52 pm

    I’ve just started having this show up in one of the buy and sell groups I admin, and so far, it seems to be a real pain. It’s reported two posts: an ad for a local window and door company, and just today, a sale post for a car. Neither goes against the group rules. I think this is just another example of facebroke screwing things up instead of fixing what should be fixed.

  6. Brynn

    October 4, 2016 at 1:38 am

    I admin a group, and while I haven’t seen any auto-reports at all, I mostly post in the group and find it annoying that for any event I make, the pictures seem to go missing, or I cannot invite people to the event unless they are my personal friends, which is hard to come by in a group where other people have added their own friends. I think it’s another way for Facebook to have their hands in everything. Not a help at all.

  7. Daniel

    December 6, 2016 at 12:30 am

    I think this auto-report is a waste of time for admins. I’m the admin of my own group, and every day something gets reported, and it’s dumb. We admins should be able to run our own groups the way we want, not have a babysitter watching us. Another Facebook fail. Thanks.

  8. Pingback: Facebook’s new suicide prevention algorithm will literally save lives - The American Genius

Zillow launches real estate brokerage after eons of swearing they wouldn’t

(MEDIA) We’ve warned of this for years, the industry funded it, and now Zillow Homes brokerage has launched; there are serious questions at hand.

Zillow Homes, a Zillow-licensed brokerage that will be fully operational in 2021 in Phoenix, Tucson, and Atlanta, was announced today.

Whoa, big huge yawn-inducing shocker, y’all.

We’ve been warning for more than a decade that this was the end game, and the company blackballed us for our screams (and other criticisms, despite praise when merited here and there).

Blog posts were penned in fiery effigy calling naysayers like us stupid and paranoid.

Well, color me unsurprised: the game plan was clear as day all along over here, and the paid talking heads sent out to astroturf, gaslight, and threaten us are now all quiet.

Continue reading…

We watched The Social Dilemma – here are some social media tips that stuck with us

(SOCIAL MEDIA) Here are some takeaways from watching Netflix’s The Social Dilemma that helped me to eliminate some social media burnout.

Last weekend, I made the risky decision to watch The Social Dilemma on Netflix. I knew it was an important thing to watch, but the risk was that I also knew it would wig me out a bit. As much as I’m someone who is active “online,” the concept of social media overwhelms me almost more than it entertains (or enlightens) me.

The constant sharing of information, the easy access to information, and the endless barrage of notifications are just a few of the ways social media can cause overwhelm. The documentary went deeper than this surface-level content and got into the nitty-gritty of how people behind the scenes use your data and track your usage.

Former employees of high-profile platforms like Facebook, Twitter, Instagram, Google, and Pinterest gave their two cents on the dangers of social media from a technological standpoint. Basically, our data isn’t just being tracked to be passed along for newsletters and the like; rather, humans are seen as products that are manipulated to buy and click all day, every day, in order to make others money and perpetuate information that has astronomical effects. (I’m not nearly as intelligent as these people, so watch the documentary to get the in-depth look at how all of this operates.)

One of the major elements that stuck with me was the end credits of The Social Dilemma where they asked interviewees about the ways they are working to eliminate social media overwhelm in their own lives. Some of these I’ve implemented myself and can attest to. Here’s a short list of things you can do to keep from burning out online.

  1. Turn off notifications – unless there are things you need to know about immediately (texts, emails, etc.), turn them off. Getting 100 individual notifications within an hour from those who liked your Instagram post will do nothing but burn you (and your battery) out.
  2. Know how to use these technologies to change the conversation and not perpetuate things like “fake news” and clickbait.
  3. Uninstall apps that are wasting your time. If you feel yourself wasting hours per week mindlessly scrolling through Facebook but not actually using it, consider deleting the app and only checking the site from a desktop or Internet browser.
  4. Research and consider using other search tools instead of Google (one interviewee mentioned that Qwant specifically does not collect/store your information the way Google does).
  5. Don’t perpetuate by watching recommended videos on YouTube; those are tailored to try and sway or sell you things. Pick your own content.
  6. Research the many extensions that remove these recommendations and help stop the collection of your data.

At the end of the day, just be mindful of how you’re using social media and what you’re sharing – not just about yourself, but the information you’re passing along from and to others. Do your part to make sure what you are sharing is accurate and useful in this conversation.

WeChat ban blocked by California judge, but for how long?

(SOCIAL MEDIA) WeChat is protected by First Amendment concerns for now, but it’s unclear how long the app will remain as pressure mounts.

WeChat narrowly avoided a US ban after a California judge stepped in to temporarily block President Trump’s executive order. Judge Laurel Beeler cited the ban’s effects on US-based WeChat users and how it threatened those users’ First Amendment rights.

“The plaintiffs’ evidence reflects that WeChat is effectively the only means of communication for many in the community, not only because China bans other apps, but also because Chinese speakers with limited English proficiency have no options other than WeChat,” Beeler wrote.

WeChat is a Chinese instant messaging and social media/mobile transaction app with over 1 billion active monthly users. The WeChat Alliance, a group of users who filed the lawsuit in August, pointed out that the ban unfairly targets Chinese-Americans as it’s the primary app used by the demographic to communicate with loved ones, engage in political discussions, and receive news.

The app, along with TikTok, has come under fire as a means for China to collect data on its users. U.S. Department of Commerce Secretary Wilbur Ross has stated, “At the President’s direction, we have taken significant action to combat China’s malicious collection of American citizens’ personal data, while promoting our national values, democratic rules-based norms, and aggressive enforcement of U.S. laws and regulations.”

This example is yet another symptom of our ever-globalizing society where we are learning to navigate between connectivity and privacy. The plaintiffs also pointed out alternatives to an outright ban. One example cited was in Australia, where WeChat is now banned from government officials’ phones but not others.

Beeler said that the range of alternatives for preserving national security affected her decision to block the ban. She also explained that, with regard to national security, there is “scant little evidence that (the Commerce Department’s) effective ban of WeChat for all US users addresses those concerns.”
