Choices, Choices

This Washington Post article by Geoffrey Fowler about privacy policies caught my eye. Researchers acknowledge we do not read them, so Fowler argues we should reform laws to do away with the make-work practice of clicking “I Agree” every time we sign up for anything. I thought this was a promising perspective, closely aligned with my own thoughts. But the more I re-read Fowler’s argument, the more I realized how inconsistent this article is because it fails to grapple with some core truths.

Privacy policies, as Fowler writes, often harbour vague and innocent-seeming language that can give companies wide latitude over the use of private data. Fowler suggests a law of some kind to encourage a reduced level of data collection:

But to protect our privacy, the best place to start is for companies to simply collect less data. “Maybe don’t do things that need a million words of explanation? Do it differently,” said Slaughter. “You can’t abuse, misuse, leverage data that you haven’t collected in the first place.”

Apps and services should only collect the information they really need to provide that service — unless we opt in to let them collect more, and it’s truly an option.

I’m not holding my breath that companies will do that voluntarily, but a federal privacy law would help. […]

Sounds great. There absolutely should be a law that notifies people of how their data will be collected and allows more granular control. Only thing, though — here is what Fowler writes a couple dozen paragraphs earlier in this same article:

Some government efforts have made things worse. Thanks to a recent European law, lots of websites also now ask you to “opt in” to their use of tracking technology, throwing a bunch of dials on the screen before you can even see if it’s worth looking at.

I do not see how Fowler’s proposal, quoted before, materially differs from GDPR, which mandates a similar notify-and-consent policy and encourages data minimization.

GDPR benefits websites which do not collect identifiable information because they do not need to ask for permission. But the ad tech machine is a needy beast, and many websites could not shake their practices, so they gave visitors a choice instead. Fowler:

Many people, including a generation setting up their first tablets and smartphones, just click “agree” to everything because they think privacy is a lost cause. “We’re teaching everyone the wrong thing,” said Mika Shah, co-acting general counsel of the tech nonprofit Mozilla.

There is nothing to suggest another piece of legislation will be any different if its main goal is to allow people to pick and choose what they opt into. That many of the most common kinds of cookie consent forms are not compliant with GDPR is almost beside the point — again, there is no reason to believe a similar American law would be followed more meticulously.

That is not to say that such a law would be useless. European users have their location collected half as often by real-time bidders compared to American users. Should we be happy about that? Sure; I would welcome halving the data available to the ad tech industry worldwide. But it is not as effective as Fowler makes it out to be, and it makes for an internally inconsistent argument.

Fowler’s other suggestion is similarly flawed:

Second, we need to replace the theater of pressing “agree” with real choices about our privacy.
Even better, technology could help us manage our choices. Cranor suggests data disclosures could be coded to be read by machines. Companies already do this for financial information, and the TLDR Act would require consistent tags on privacy information, too. Then your computer could act kind of like a butler, interacting with apps and websites on your behalf.

Picture Siri as a butler who quizzes you briefly about your preferences and then does your bidding. The privacy settings on an iPhone already let you tell all the different apps on your phone not to collect your location. For the past year, they’ve also allowed you to ask apps not to track you.

As has been repeatedly reported, App Tracking Transparency has been less effective than it initially seemed. Developers flout Apple’s rules and, even when the rules are followed perfectly, the policies do not prohibit aggregate tracking.

Fowler writes about other possible solutions for addressing privacy policies. Standardized labels and tables, like those used for App Store apps but legally binding, are one such idea. But that merely permits continued unnecessary data collection, obscured behind simplified language and diagrams.

As long as it is left up to companies to negotiate individually with users and nominally inform them about their privacy sacrifices, this problem will never go away. There is just too much money encouraging the worst behaviour. The only way to stop it is to require this industry to rebuild, from scratch, with privacy at the core — and the only way that happens is to destroy, with law, any incentives for collecting, merging, and sharing incidental or behavioural user data. Only then can we consider ending the ridiculous practice of requiring a separate, wildly permissive contract for every digital product and service we use.