Meta Says Threads Users Will Not Be Recommended ‘Political’ Posts From Accounts They Do Not Follow axios.com

Sara Fischer, Axios:

Meta will not “proactively recommend political content from accounts you don’t follow” on Threads, the company said in a statement provided to Axios.

[…]

Users who post political content can check their account status to see whether they’ve posted too much of it to be eligible for recommendation.

Fischer says Meta is using the same policies it has used for Facebook and Instagram, which Meta describes in vague terms:

As part of this, we aim to avoid making recommendations that could be about politics or political issues, in line with our approach of not recommending certain types of content to those who don’t wish to see it.

Still, there is a fundamental question: what are “political issues”? I looked through Meta’s documentation without finding a good answer. Does any topic which has been politicized count? Are all posts about global warming, trans rights, healthcare, and intellectual property law considered political, or only those which advocate for a particular position? If advocacy is demoted, it likely benefits the status quo and creates a conservative bias by definition. Surely the answer is not to restrict people from telling stories which could be construed as politically motivated. Would a post weighing the advantages and disadvantages of different types of stoves be demoted depending on which specific disadvantages are listed for gas stoves? Is this a policy Meta is able to apply consistently across different regions?

An investigation last year by Jeff Horwitz, Keach Hagey, and Emily Glazer of the Wall Street Journal illuminated the consequences of testing different policy adjustments on Facebook in 2021–2022:1

Just testing the broad civic demotion on a fraction of Facebook’s users caused a 10% decline in donations to charities via the platform’s fundraising feature. Humanitarian groups, parent-teacher associations and hospital fundraisers would take serious hits to their Facebook engagement.

An internal presentation warned that while broadly suppressing civic content would reduce bad user experiences, “we’re likely also targeting content users do want to see….The majority of users want the same amount or more civic content than they see in their feeds today. The primary bad experience was corrosiveness/divisiveness and misinformation in civic content.”

The actual problem Meta has is that users move between the funhouse mirror maze of its recommendations and the real world. Politicians operating in bad faith treat nuanced issues circulating on social media platforms as opportunities to create publicity materials that, in turn, perform well on those same platforms. Ideological loyalties transform fact-based policy debates into thoughtless dunking.

Meta is in this mess because it believes it should recommend things to users it thinks they might find interesting. It transformed its platforms from opt-in systems, where users affirmatively agree to see posts from particular users and topics, into places where they must take steps to exclude unwanted things from their feed. This puts Meta in a position where it influences what users see, something which many people find objectionable when it does not comport with their values.

As a reminder, Meta says this only affects the recommendations of posts from accounts users do not follow. You should still see posts from the reporters, pundits, and activists you follow which, if you follow any, are likely ideologically consistent. Oh, and Meta still sells a way for people and organizations to advocate directly.


  1. Another tidbit from this story:

    The company is still debating whether it should also restrict how it promotes other types of content. When newsfeed dialed back the reward for producing inflammatory posts on politics and health, some publishers switched to more sensationalistic crime coverage.

    I wonder if this partly explains that wave of panicked shoplifting stories. ↥︎