Facebook’s Internal Rulebook for Moderators ⇥ theguardian.com
Nick Hopkins of the Guardian obtained copies of about one hundred training manuals used to guide Facebook’s moderation policies:
One document says Facebook reviews more than 6.5m reports a week relating to potentially fake accounts – known as FNRP (fake, not real person).
Using thousands of slides and pictures, Facebook sets out guidelines that may worry critics who say the service is now a publisher and must do more to remove hateful, hurtful and violent content.
Yet these blueprints may also alarm free speech advocates concerned about Facebook’s de facto role as the world’s largest censor. Both sides are likely to demand greater transparency.
I would wager that it’s impossible to come up with a single set of guidelines that can clearly govern moderation policy for two billion users spread across nearly every country on Earth. Even making the existing rulebook more widely known is unlikely to help: someone acting nefariously could use the rules as a guide to skirting them, while others will certainly see the rules as needlessly prohibitive and insist that Facebook shouldn’t censor any viewpoint, no matter how objectionable.
Facebook currently gets to decide its own level of squeamishness; it is, of course, a private company. But is there a size or scale at which it’s no longer okay for a company to provide its own oversight? Until now, no single company has ever connected a quarter of the world’s population. Is it okay for that many people in so many places to be communicating under a rulebook developed by twenty- and thirty-somethings in California?
See Also: The Moderators.