Harnessing Our Existing Surveillance Capitalist Infrastructure for Good Instead of Evil (idlewords.com)

Natasha Singer and Choe Sang-Hun, New York Times:

As countries around the world race to contain the pandemic, many are deploying digital surveillance tools as a means to exert social control, even turning security agency technologies on their own civilians. Health and law enforcement authorities are understandably eager to employ every tool at their disposal to try to hinder the virus — even as the surveillance efforts threaten to alter the precarious balance between public safety and personal privacy on a global scale.

Yet ratcheting up surveillance to combat the pandemic now could permanently open the doors to more invasive forms of snooping later. It is a lesson Americans learned after the terrorist attacks of Sept. 11, 2001, civil liberties experts say.

Maciej Cegłowski:

The most troubling change this project entails is giving access to sensitive location data across the entire population to a government agency. Of course that is scary, especially given the track record of the Trump administration. The data collection would also need to be coercive (that is, no one should be able to opt out of it, short of refusing to carry a cell phone). As with any government surveillance program, there would be the danger of a ratchet effect, where what is intended as an emergency measure becomes the permanent state of affairs, like happened in the United States in the wake of the 2001 terrorist attacks.

But the public health potential of commandeering surveillance advertising is so great that we can’t dismiss it out of hand. I am a privacy activist, typing this through gritted teeth, but I am also a human being like you, watching a global calamity unfold around us. What is the point of building this surveillance architecture if we can’t use it to save lives in a scary emergency like this one?

There is no legal separation between the genuinely useful aspects of the universal tracking we all endure and its usual application in targeted advertising, or its potential to power a dystopian police state. That failure is massively consequential. It might have been possible to gain acceptance for this moderate intrusion on privacy if some framework of trust were in place.

Alas, no such assurance exists, and users' trust has been badly abused, so it's understandable that so many are treating this as a horrible idea.