Josh Constine of TechCrunch spoke to Facebook about what’s allowed under their “Community Standards” guidelines. They weren’t completely forthcoming, and some aspects of their policy are concerning:
Even a single report flag sends the content to be reviewed by Facebook’s Community Standards team, which operates 24/7 worldwide. These team members can review content whether it’s public or privately shared. The volume of flags does not have bearing on whether content is or isn’t reviewed, and a higher number of flags will not trigger an automatic take-down.
This probably prevents abuse, inasmuch as a mob can’t force the takedown of a post through excessive flagging. However, a high number of reports could also indicate something that ought to be addressed immediately; as it’s very hard to tell which is the case, Facebook’s policy is probably for the best.
There have been instances of Facebook posts mysteriously disappearing, and Facebook often blames it on a bug or an infrastructure problem. That seems fishy to me. It has long been speculated that a certain threshold of flags would get a post automatically pulled; as this no longer seems to be the case, I’m not sure what would get a post removed in a seemingly-automated way.
There is no option to report content as “graphic but newsworthy,” or any other way to report that content could be disturbing and should be taken down. Instead, Facebook asks that users report the video as violent, or with any of the other options. It will then be reviewed by team members trained to determine whether the content violates Facebook’s standards.
This is a serious omission. As I noted earlier today, there ought to be a mechanism for separating news items from the rest of the site; they clearly need to be treated differently.