Moderating the Planet

Jason Koebler and Joseph Cox of Vice penned a blockbuster investigation into Facebook’s content moderation practices that’s worth your time. They interviewed “dozens” of sources and held several on-the-record conversations with the Facebook employees in charge of its moderation efforts:

The thing that makes Facebook’s problem so difficult is its gargantuan size. It doesn’t just have to decide “where the line is” for content, it has to clearly communicate the line to moderators around the world, and defend that line to its two billion users. And without those users creating content to keep Facebook interesting, it would die.

Size is the one thing Facebook isn’t willing to give up. And so Facebook’s content moderation team has been given a Sisyphean task: Fix the mess Facebook’s worldview and business model has created, without changing the worldview or business model itself.

“Making their stock-and-trade in soliciting unvetted, god-knows-what content from literally anyone on earth, with whatever agendas, ideological bents, political goals and trying to make that sustainable—it’s actually almost ridiculous when you think about it that way,” Roberts, the UCLA professor, told Motherboard. “What they’re trying to do is to resolve human nature fundamentally.”

In that sense, Facebook’s content moderation policies are and have always been guided by a sense of pragmatism. Reviewing and classifying the speech of billions of people is seen internally as a logistics problem that is only viable if streamlined and standardized across the globe.

Maya Kosoff, Vanity Fair:

The problem, of course, is Facebook’s tireless drive to expand. Until recently, for example, the company reportedly had few moderators who spoke Burmese, allowing the platform in Myanmar to be infiltrated by anti-Muslim hate speech. (Facebook’s hate-speech detecting A.I., it said, hadn’t yet learned Burmese.) But instead of treating the issue as the result of a choice to expand into a country where it knew it couldn’t adequately evaluate and police what was posted, Facebook viewed the issue as a failure of technology. “We still don’t know if it’s really going to work out, due to the language challenges,” Guy Rosen, V.P. of product management at Facebook, told Motherboard. “Burmese wasn’t in Unicode for a long time, and so they developed their own local font, as they opened up, that is not compatible with Unicode.” In the meantime, United Nations human-rights experts have cited Facebook’s struggle to remove hate speech as playing a role in a possible genocide in the country.
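An aside on the encoding problem Rosen describes: for years, most Burmese text online was typed in the Zawgyi font, which repurposes the code points of Unicode’s Myanmar block in incompatible ways, so software expecting standards-compliant Burmese effectively sees gibberish. Here is a minimal sketch of how one might detect and normalize Zawgyi text, assuming Google’s open-source myanmar-tools package and ICU’s built-in Zawgyi-to-Unicode transform are installed; the threshold is illustrative, and none of this reflects anything Facebook has said about its own pipeline.

# A sketch of Zawgyi detection and normalization, assuming the
# `myanmar-tools` package (import name `myanmartools`) and PyICU.
from myanmartools import ZawgyiDetector
from icu import Transliterator

detector = ZawgyiDetector()
# ICU ships a built-in Zawgyi -> Unicode transform (ICU 63 and later).
converter = Transliterator.createInstance('Zawgyi-my')

def normalize_burmese(text: str, threshold: float = 0.95) -> str:
    """Convert text to standard Unicode Burmese if it looks like Zawgyi."""
    # Probability from 0.0 to 1.0; -inf if the string has no Myanmar characters.
    score = detector.get_zawgyi_probability(text)
    if score > threshold:
        return converter.transliterate(text)
    return text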

Facebook may be a publicly traded company trying to do right by its shareholders, and the best thing for shareholders, it perceives, is conquering the world. But this is an abhorrent dereliction of ethical responsibility. Kosoff is entirely correct: it is a choice for Facebook to expand into places it doesn’t fully comprehend. That choice is arrogant, and it demonstrates a lack of sensitivity in attempting to reconcile American values with those of every region in which the company operates. I don’t think that’s possible.