Nearly Six Million High-Profile Facebook Users Are Effectively Exempt From Platform Rules

Jeff Horwitz, Wall Street Journal:

Mark Zuckerberg has publicly said Facebook Inc. allows its more than three billion users to speak on equal footing with the elites of politics, culture and journalism, and that its standards of behavior apply to everyone, no matter their status or fame.

In private, the company has built a system that has exempted high-profile users from some or all of its rules, according to company documents reviewed by The Wall Street Journal.

The program, known as “cross check” or “XCheck,” was initially intended as a quality-control measure for actions taken against high-profile accounts, including celebrities, politicians and journalists. Today, it shields millions of VIP users from the company’s normal enforcement process, the documents show. Some users are “whitelisted” — rendered immune from enforcement actions — while others are allowed to post rule-violating material pending Facebook employee reviews that often never come.

I do not think it is surprising that moderation of high-profile accounts is treated differently from that of average users, nor do I necessarily think it is wrong. Social media is all grown up, with celebrities and organizations treating it as an official broadcast system. The U.S. Securities and Exchange Commission treats Facebook posts as adequate investor disclosure.

But what Facebook has built, according to Horwitz, is not a system to protect the integrity and security of Facebook users with a large audience. It is an over-broad attempt to ward off what employees call “PR fires”, with the side effect that the users with the biggest megaphones are given another channel through which to spread whatever information they choose with little consequence.

Also, nearly six million users are enrolled in this thing?

Horwitz:

The documents that describe XCheck are part of an extensive array of internal Facebook communications reviewed by The Wall Street Journal. They show that Facebook knows, in acute detail, that its platforms are riddled with flaws that cause harm, often in ways only the company fully understands.

Moreover, the documents show, Facebook often lacks the will or the ability to address them.

This is the first in a series of articles based on those documents and on interviews with dozens of current and former employees.

I recently finished “An Ugly Truth”. If you have been paying attention to reporting on Facebook, you likely will not be surprised by its contents, but it is worthwhile to have so much encapsulated in a single work.

“An Ugly Truth” is a deliberate summary of roughly the last five years of Facebook’s internal practices and external controversies. In a way, that is fair: some of the most consequential decisions in the company’s history were made from the 2016 U.S. presidential election onward. But many of the problems raised by the book have their roots in decisions made years prior, when mainstream publications, like the one its authors work for, were more comfortable extolling the assumed virtues of connecting as many people as possible on a single discussion platform.

The outcome of that election caused many publications to question those assumptions, as the book’s authors acknowledge, and I think it allowed some investigations critical of Facebook to be dismissed as merely “anti-Trump”. As much as Trump singlehandedly tested the limits of platform moderation, that should not be the case. Privacy advocates had been raising similar concerns about Facebook for years before that election; when mainstream outlets got more involved, they brought greater resources to dig deeper.

Aside from any new information that may be uncovered in this Journal series, it may also be able to present its findings in a way that seems less politically charged. I welcome that.