In a remarkable and, I think, poetic coincidence, Facebook spent hours today being completely unreachable, just one day after the whistleblower exposing new information about the company’s wrongdoing went public and one day before she is set to testify before Congress. I really do think it was coincidental, for what it is worth. Facebook’s problems also brought Instagram and WhatsApp down, and all of these are critical infrastructure in different parts of the world by default. We should probably reconsider having mostly private and mostly American companies running the world’s internet, but that is a matter for another time.
At any rate, Facebook is back, so let’s talk about it.
Scott Pelley, correspondent for CBS News’ 60 Minutes:
Her name is Frances Haugen. That is a fact that Facebook has been anxious to know since last month when an anonymous former employee filed complaints with federal law enforcement. The complaints say Facebook’s own research shows that it amplifies hate, misinformation and political unrest—but the company hides what it knows. One complaint alleges that Facebook’s Instagram harms teenage girls. What makes Haugen’s complaints unprecedented is the trove of private Facebook research she took when she quit in May. The documents appeared first, last month, in the Wall Street Journal. But tonight, Frances Haugen is revealing her identity to explain why she became the Facebook whistleblower.
Frances Haugen: The thing I saw at Facebook over and over again was there were conflicts of interest between what was good for the public and what was good for Facebook. And Facebook, over and over again, chose to optimize for its own interests, like making more money.
“The Social Dilemma” may have been oversimplified, but these documents and interviews with Haugen indicate its broad strokes are closer to the truth than not. Facebook has historically optimized for engagement metrics and, as also reported by Karen Hao for MIT Technology Review earlier this year, changes that reduce engagement are kneecapped internally. During the 2020 U.S. election, Facebook adjusted its News Feed algorithm to prioritize links to reputable news sources over bullshit, but it rolled back that change shortly afterward. Kevin Roose of the New York Times, who first reported the rollback, noted that this reversal was likely because prioritizing newsworthiness either hurt partisan publishers or reduced key usage figures.
The documents sourced by Haugen seem to reinforce this narrative. One interpretation is that engagement is so deeply ingrained in Facebook’s culture that it robs the company of its sense of social responsibility. I think this is very possible — likely, even.
But Hanlon’s Razor instructs us not to assume malicious intent when ignorance or incompetence is an adequate explanation — or, perhaps, fear. That is more or less what Times reporter Kevin Roose argues these documents illustrate:
It has become fashionable among Facebook critics to emphasize the company’s size and dominance while bashing its missteps. In a Senate hearing on Thursday, lawmakers grilled Antigone Davis, Facebook’s global head of safety, with questions about the company’s addictive product design and the influence it has over its billions of users. Many of the questions to Ms. Davis were hostile, but as with most Big Tech hearings, there was an odd sort of deference in the air, as if the lawmakers were asking: Hey, Godzilla, would you please stop stomping on Tokyo?
But if these leaked documents proved anything, it is how un-Godzilla-like Facebook feels. Internally, the company worries that it is losing power and influence, not gaining it, and its own research shows that many of its products aren’t thriving organically. Instead, it is going to increasingly extreme lengths to improve its toxic image, and to stop users from abandoning its apps in favor of more compelling alternatives.
The thing is that Facebook, the company, may be “for old people”, as a kid put it in the company’s internal research. But older people are still people, and much of the world’s communications still depend on the stability of Facebook as a company. I do not think it is as fragile as Roose believes, but it is awfully defensive and sensitive for being one of the most valuable companies ever to exist.