Day: 22 April 2022

Sam Biddle and Jack Poulson, the Intercept:

Anomaly Six software lets its customers browse all of this data in a convenient and intuitive Google Maps-style satellite view of Earth. Users need only find a location of interest and draw a box around it, and A6 fills that boundary with dots denoting smartphones that passed through that area. Clicking a dot will provide you with lines representing the device’s — and its owner’s — movements around a neighborhood, city, or indeed the entire world.

[…]

To fully impress upon its audience the immense power of this software, Anomaly Six did what few in the world can claim to do: spied on American spies. “I like making fun of our own people,” Clark began. Pulling up a Google Maps-like satellite view, the sales rep showed the NSA’s headquarters in Fort Meade, Maryland, and the CIA’s headquarters in Langley, Virginia. With virtual boundary boxes drawn around both, a technique known as geofencing, A6’s software revealed an incredible intelligence bounty: 183 dots representing phones that had visited both agencies potentially belonging to American intelligence personnel, with hundreds of lines streaking outward revealing their movements, ready to track throughout the world. “So, if I’m a foreign intel officer, that’s 183 start points for me now,” Clark noted.

Clark was able to show up to a year’s worth of location history, according to Biddle and Poulson, for each of those nearly two hundred devices. Any of those devices could easily be de-anonymized because, well, Anomaly Six had their entire location history. It is worth being cautious about these claimed capabilities given the self-promotional context, but multiple experts told the Intercept they found the claims believable.
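What makes this so alarming is how mundane the underlying query is. Here is a minimal sketch, in Python, of the kind of geofence lookup the Intercept describes, assuming a broker-style table of pseudonymous location pings; the coordinates, field names, and helper functions are hypothetical illustrations, not Anomaly Six’s actual implementation:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Ping:
    device_id: str  # pseudonymous advertising identifier
    lat: float
    lon: float
    ts: int         # Unix timestamp

Box = tuple[float, float, float, float]  # (south, west, north, east)

# Rough, hypothetical bounding boxes around each agency's campus;
# a real tool would likely support arbitrary polygons.
FORT_MEADE: Box = (39.100, -76.780, 39.120, -76.755)
LANGLEY: Box = (38.945, -77.160, 38.960, -77.140)

def in_box(p: Ping, box: Box) -> bool:
    south, west, north, east = box
    return south <= p.lat <= north and west <= p.lon <= east

def devices_in(pings: list[Ping], box: Box) -> set[str]:
    """IDs of every device with at least one ping inside the geofence."""
    return {p.device_id for p in pings if in_box(p, box)}

def dual_visitors(pings: list[Ping]) -> set[str]:
    """Devices seen inside both geofences: the demo's "183 start points"."""
    return devices_in(pings, FORT_MEADE) & devices_in(pings, LANGLEY)

def history(pings: list[Ping], device_id: str) -> list[Ping]:
    """One device's full movement trail, in chronological order."""
    return sorted((p for p in pings if p.device_id == device_id),
                  key=lambda p: p.ts)
```

The point of the sketch is that none of this is technically sophisticated: once the raw pings sit in a broker’s database, the “incredible intelligence bounty” amounts to two bounding-box filters and a set intersection, and with a year of history per device, inferring a home address, a workplace, and therefore a name is almost trivial.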

Byron Tau of the Wall Street Journal has previously reported on Anomaly Six’s capabilities, which are derived from the inclusion of its SDK in third-party apps, as well as from the broader data broker economy. That economy is potentially open to buyers from other countries, given the United States’ almost non-existent protections for personal data privacy. Much of the world’s tech industry is also based in the U.S., and those companies’ privacy policies often state that U.S. jurisdiction applies.

Not only does the American military-industrial complex have the ability to spy on the world’s devices, but adversarial nations could build similar capabilities — again, partly thanks to the weak privacy protections afforded by U.S. law and the concentration of tech companies within its borders.

It does not really matter how well-educated you are as a consumer or user. Short of not owning anything that connects to the internet, there is no reliable way of opting out of surveillance by a company nobody really thinks about. The only way this improves is by minimizing data generation and collection, and by passing stricter privacy laws. Perhaps that is one reason why American lawmakers have been reluctant to pass such laws.

Shira Ovide, New York Times:

A group of academics found that YouTube rarely suggests videos that might feature conspiracy theories, extreme bigotry or quack science to people who have shown little interest in such material. And those people are unlikely to follow such computerized recommendations when they are offered. The kittens-to-terrorist pipeline is extremely uncommon.

That doesn’t mean YouTube is not a force in radicalization. The paper also found that research volunteers who already held bigoted views or followed YouTube channels that frequently feature fringe beliefs were far more likely to seek out or be recommended more videos along the same lines.

“Nuance” is used in the headline of Ovide’s article, and I think that is a good way of framing this research. Just as it was never the case that YouTube’s recommendations always pushed people toward extremism, it is also not the case that they never do; this research does not automatically disprove past studies or articles about extremist pipelines on YouTube.

Zeynep Tufekci wrote in 2018 about her experiences seeing extremist videos recommended by YouTube after watching, for example, a Trump rally. I remember when YouTube used to recommend the same Ben Shapiro and Jordan Peterson clips on its homepage, whether I accessed it normally or in a private browsing window. But in 2019, Google made changes so YouTube would less frequently recommend fringe videos and conspiracy theories.

This new study (PDF), from Annie Chen et al., suggests those changes may have worked. Their participants browsed YouTube between July and December 2020:

Using data on web browsing, we provide behavioral measures of exposure to videos from alternative and extremist channels on YouTube. Our results indicate that exposure to these videos after YouTube’s algorithmic changes in 2019 is relatively uncommon and heavily concentrated in a small minority of participants who previously expressed high levels of hostile sexism and racial resentment. These participants frequently subscribe to the channels in question and reach the videos that they produce via external links. By contrast, we find relatively little evidence of people falling into so-called algorithmic “rabbit holes.” Recommendations to videos from alternative and extremist channels on YouTube are very rare when respondents are watching other kinds of content and concentrated among subscribers to the channels in question.

The last part of this paragraph is, I think, still concerning. On page 20, the researchers show that recommendations typically match the type of material users are already watching. So if someone watched a video from a mainstream media channel, they got mostly mainstream media recommendations; similarly, someone watching videos from an extremist channel would fill their recommendations with other extremist media. To me, this appears to be an acknowledgement that YouTube’s recommendations can serve to deepen a hole the company began digging many years ago, but that it is mostly sequestering those users into their own bubble. I am not sure that is a good thing — is it good for society that YouTube automatically encourages some people to binge-watch David Duke’s bile and spew? It seems more responsible to remove videos from these kinds of channels from everyone’s recommendations.

Notably, the study found that there is still a small pipeline from dreadful but not extremist YouTube channels to more extreme videos. Compare the list of what the researchers refer to as “alternative channels” on page 7 against the referral chart shown on page 17. Perhaps just as significant is the “off-platform referrer” chart shown on page 18, which indicates that “alternative social” media is the biggest external referral source for extremist videos.