Nuance in the YouTube Rabbit Hole

Shira Ovide, New York Times:

A group of academics found that YouTube rarely suggests videos that might feature conspiracy theories, extreme bigotry or quack science to people who have shown little interest in such material. And those people are unlikely to follow such computerized recommendations when they are offered. The kittens-to-terrorist pipeline is extremely uncommon.

That doesn’t mean YouTube is not a force in radicalization. The paper also found that research volunteers who already held bigoted views or followed YouTube channels that frequently feature fringe beliefs were far more likely to seek out or be recommended more videos along the same lines.

“Nuance” is used in the headline of Ovide’s article, and I think that is a good way of framing this research. Just as it was never the case that YouTube’s recommendations always pushed people toward extremism, it is also not the case that they never do; this research does not automatically disprove past studies or articles about extremist pipelines on YouTube.

Zeynep Tufekci wrote in 2018 about her experience of seeing extremist videos recommended by YouTube after watching, for example, a Trump rally. I remember when YouTube used to recommend the same Ben Shapiro and Jordan Peterson clips on its homepage, no matter whether I was browsing normally or in a private window. But in 2019, Google made changes so YouTube would less frequently recommend fringe videos and conspiracy theories.

This new study (PDF), from Annie Chen et al., suggests those changes may have worked. Their participants browsed YouTube between July and December 2020:

Using data on web browsing, we provide behavioral measures of exposure to videos from alternative and extremist channels on YouTube. Our results indicate that exposure to these videos after YouTube’s algorithmic changes in 2019 is relatively uncommon and heavily concentrated in a small minority of participants who previously expressed high levels of hostile sexism and racial resentment. These participants frequently subscribe to the channels in question and reach the videos that they produce via external links. By contrast, we find relatively little evidence of people falling into so-called algorithmic “rabbit holes.” Recommendations to videos from alternative and extremist channels on YouTube are very rare when respondents are watching other kinds of content and concentrated among subscribers to the channels in question.

The last part of this paragraph is, I think, still concerning. On page 20, the researchers show that recommendations typically match the type of material users are already watching. So if someone saw a video from a mainstream media channel, they got mostly mainstream media recommendations. Similarly, someone watching videos from an extremist channel would see their recommendations fill with other extremist media. To me, this appears to be an acknowledgement that YouTube’s recommendations can serve to deepen a hole the company began digging many years ago, even if it is mostly sequestering those users into their own bubble. I am not sure that is a good thing — is it good for society that YouTube automatically encourages some people to binge-watch David Duke’s bile and spew? It seems more responsible to remove videos from these kinds of channels from everyone’s recommendations.

Notably, the study found that there is still a small pipeline from dreadful but not extremist YouTube channels to more extreme videos. Compare the list of what the researchers refer to as “alternative channels” on page 7 against the referral chart shown on page 17. Perhaps just as significant is the “off-platform referrer” chart shown on page 18, which indicates that “alternative social” media is the biggest external referral source for extremist videos.