Twitter Begins Limiting the Spread of QAnon Conspiracy Theories, Bans Seven Thousand Harassing Accounts
Ben Collins and Brandy Zadrozny, NBC News:
Twitter will stop recommending accounts and content related to QAnon, including material in email and follow recommendations, and it will take steps to limit circulation of content in features like trends and search. The action will affect about 150,000 accounts, said a spokesperson, who asked to remain unnamed because of concerns about the targeted harassment of social media employees.
The spokesperson said that as part of its new policy, the company had taken down more than 7,000 QAnon accounts in the last few weeks for breaking its rules on targeted harassment.
The sweeping enforcement action will ban QAnon-related terms from appearing in trending topics and the platform’s search feature, ban known QAnon-related URLs and prohibit “swarming” of people who are baselessly targeted by coordinated harassment campaigns pushed by QAnon followers.
If you have mercifully avoided the world of the QAnon conspiracy theory, know that it is stupid and absurd even by the low standards of conspiracy theories. It is truly distressing to see that it has crossed over from a fringe internet thing to a real-world violent extremist movement. But Twitter doesn’t ban people simply for posting stupid and absurd things, and it hasn’t banned QAnon topics overall. Rather, it is reducing the artificially high impact of accounts associated with the theory.
We will permanently suspend accounts Tweeting about these topics that we know are engaged in violations of our multi-account policy, coordinating abuse around individual victims, or are attempting to evade a previous suspension — something we’ve seen more of in recent weeks.
Twitter also says that it will not recommend QAnon-related accounts and topics in its algorithmic features. Many of these users coordinate on message boards and in Discord servers to flood Twitter, thereby brute-forcing their way into its trending topics feature, which effectively advertises the theory to a broader audience. It isn't clever, and these people are not smart; it is simply an abuse of Twitter's systems. Twitter's policy isn't limited to QAnon, either, but Twitter is only one platform.
Julia Carrie Wong, reporting last month for the Guardian:
Moreover, Facebook is not merely providing a platform to QAnon groups. Its powerful algorithms are actively recommending them to users who may not otherwise have been exposed to them.
The Guardian did not initially go looking for QAnon content on Facebook. Instead, Facebook’s algorithms recommended a QAnon group to a Guardian reporter’s account after it had joined pro-Trump, anti-vaccine and anti-lockdown Facebook groups. The list of more than 100 QAnon groups and accounts was then generated by following Facebook’s recommendation algorithms and using simple keyword searches. The Instagram accounts were discovered by searching for “QAnon” in the app’s discovery page and then following Instagram’s algorithmic recommendations.
Receiving QAnon recommendations from Facebook does not appear to be that uncommon. “Once I started liking those pages and joining those groups, Facebook just started recommending more and more and more and more, to the point where I was afraid to like them all in case Facebook would flag me as a bot,” said Friedberg. Erin Gallagher, a researcher who studies social media extremism, said she was also encouraged to join a QAnon group by Facebook, soon after joining an anti-lockdown group.
Spokespeople for social media platforms insist that software-based recommendations are not sending people down pathways to extremism, but reporters keep finding that they do.
Update: A tangentially related story from Rachel E. Greenspan at Insider:
A national organization fighting to end human trafficking says the believers in the unfounded Wayfair human trafficking conspiracy theory are overwhelming the organization with reports and making it harder to do its work.
Believers in the conspiracy theory think that the furniture company is selling human children who have gone missing by disguising them as pillows and other goods. The theory went viral in the last few weeks after being spread by QAnon believers on Twitter and Facebook, though both platforms told Insider they had removed certain posts containing this misinformation.
These idiots live in the kind of corrupted universe where they won't listen to actual authorities on human trafficking, but are instead convinced by a few tweets.