Facebook is having a bit of a rough week. On Monday, Michael Nunez of Gizmodo published a report in which a former Facebook employee alleged that the company actively suppressed conservative topics:
This individual says that workers prevented stories about the right-wing CPAC gathering, Mitt Romney, Rand Paul, and other conservative topics from appearing in the highly-influential section, even though they were organically trending among the site’s users.
Several former Facebook “news curators,” as they were known internally, also told Gizmodo that they were instructed to artificially “inject” selected stories into the trending news module, even if they weren’t popular enough to warrant inclusion—or in some cases weren’t trending at all.
This has made some waves, to put it mildly. There are two different stories here: that news stories of interest to conservatives are deliberately being omitted, and that non-trending stories are manually being included. Happily, they can largely be addressed together thanks to a great scoop for the Guardian’s Sam Thielman:
Leaked documents show how Facebook, now the biggest news distributor on the planet, relies on old-fashioned news values on top of its algorithms to determine what the hottest stories will be for the 1 billion people who visit the social network every day. […]
[The] documents show that the company relies heavily on the intervention of a small editorial team to determine what makes its “trending module” headlines – the list of news topics that shows up on the side of the browser window on Facebook’s desktop version. The company backed away from a pure-algorithm approach in 2014 after criticism that it had not included enough coverage of unrest in Ferguson, Missouri, in users’ feeds.
Facebook’s statement that they “do not insert stories artificially into trending topics, and do not instruct our reviewers to do so” is, therefore, wrong. According to this document, there is near-constant manual human intervention to blend similar topics together, add topics to the list, remove topics when they become stale, and make all kinds of adjustments.
But are they suppressing conservative news? The document obtained by the Guardian doesn’t say that, but it does clarify a topic’s eligibility for newsworthiness:
You should mark a topic as ‘National Story’ importance if it is among the 1-3 top stories of the day […] We measure this by checking if it is leading at least 5 of the following 10 news websites: BBC News, CNN, Fox News, The Guardian, NBC News, The New York Times, USA Today, The Wall Street Journal, Washington Post, Yahoo News or Yahoo.
That list of websites is pretty old-guard, and there’s nothing to suggest that it’s deliberately omitting conservative viewpoints or news. As Fusion’s Kashmir Hill says, it might simply be a quality barrier:
Regarding these particular topics being omitted by curators, New York Magazine’s Brian Feldman writes, “Given that list of overlooked topics, which range from IRS conspiracy theories, to an unreliable news aggregator, to a brutally unfunny conservative comedian, can you blame them?” […]
[What] that suggests is that Facebook preferred that news come from non-biased sources. Which is not a crazy thing to do. And it suggests that the bias might exist for news from the other side of the aisle as well, but it seems that Gizmodo didn’t rigorously assess whether liberal news and news sources were ignored by curators.
This story has understandably riled up reliably conservative media personalities, but it’s only newsworthy if Nunez did his due diligence and examined whether conservative stories were singled out.
Whatever the case, the question that remains unanswered is Facebook’s responsibility for its new status as one of the top referrers to any major news site. Brian Stelter for CNN:
Facebook has a unique ability to turn on a firehose of traffic — and the ability to turn it off. Publishers may not live or die by Facebook alone, but they certainly thrive or struggle based on the company’s decisions.
So Gizmodo’s recent reports about the production of Facebook’s “trending” stories have gained a ton of attention. Journalists, academics and some average users want to understand how and why Facebook does what it does.
This isn’t exclusive to Facebook; Google, Twitter, and Apple all have their own ways of displaying ostensibly algorithmically determined trending news stories. All of them also lack transparency in how these stories are chosen.
Update: Facebook has officially published the 2016 version of the document leaked to the Guardian. There are some redactions, but they appear to be nothing more than internal points of contact. Meanwhile, Sarah Emerson of Vice ponders whether Facebook lied in its earlier press statement.