Julia Angwin, The Markup:
The idea is elegant in its simplicity: Google and Apple phones would quietly in the background create a database of other phones that have been in Bluetooth range — about 100 to 200 feet — over a rolling two-week time period. When users find out that they are infected, they can send an alert to all the phones that were recently in their proximity.
The broadcast would not identify the infected person; it would just alert the recipient that someone in his or her recent orbit had been infected. And, importantly, the companies say they are not collecting data on people’s identities and infection status. Nearly all of the data and communication would be stored on users’ phones.
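The decentralized design described above can be illustrated with a toy model. This is a minimal sketch, not the actual Apple–Google protocol — the real system derives rotating identifiers from daily cryptographic keys — and names like `Phone` and `was_exposed` are illustrative, not from any real API:

```python
import os
from collections import deque

RETENTION_SECONDS = 14 * 24 * 3600  # rolling two-week window

class Phone:
    """Toy model: each phone broadcasts random identifiers and remembers
    the identifiers it hears over Bluetooth; all storage is on-device."""

    def __init__(self):
        self.broadcast_ids = []   # identifiers this phone has sent out
        self.heard = deque()      # (timestamp, identifier) pairs overheard nearby

    def next_identifier(self):
        # A fresh random identifier carries no information about its owner,
        # so overhearing it does not identify the person.
        rpi = os.urandom(16)
        self.broadcast_ids.append(rpi)
        return rpi

    def record(self, identifier, now):
        self.heard.append((now, identifier))
        # Drop anything older than the two-week window.
        while self.heard and now - self.heard[0][0] > RETENTION_SECONDS:
            self.heard.popleft()

    def was_exposed(self, published_ids):
        # Matching happens on-device against identifiers published by
        # diagnosed users, so no central party learns who met whom.
        published = set(published_ids)
        return any(ident in published for _, ident in self.heard)
```

The privacy claim rests on where the matching happens: a diagnosed user publishes their own identifiers, and every other phone checks its local log against them, so the contact graph itself never leaves anyone’s device.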
But building a data set of people who have been in the same room together — data that would likely be extremely valuable to both marketers and law enforcement — is not without risk of exploitation, even stored on people’s phones, security and privacy experts said in interviews.
I appreciate any commentary on the privacy design of contact tracing that acknowledges the existence of an entire unregulated industry already trafficking in this information. The current effort is noble but fundamentally ridiculous, like painting bike lanes on the interstate.
I wish people had picked any other moment than the start of a pandemic to come to terms with the implications of living in a surveillance society. But by the iron law of 2020, the dumbest thing has to happen, and that means a principled debate about fictional forms of privacy.
Zack Whittaker and Darrell Etherington, TechCrunch:
TechCrunch joined a media call with Apple and Google representatives, allowing reporters to ask questions about their coronavirus tracing efforts.
The companies said only public health authorities will be allowed access to the contact tracing API.
Access to the API will be restricted in the same spirit that individual healthcare is restricted to licensed medical professionals like physicians: only public health organizations authorized by whatever government is responsible for designating such entities for a given country or region will be able to use it. In some cases there could be disputes about what constitutes a legitimate public health agency, and conceivably even disagreements between national and state authorities, so this is a place where friction could occur, with Apple and Google on tricky footing as platform operators.
Third, the companies said they would prevent abuse of the system by routing alerts through public health agencies. (They are also helping those agencies, such as Britain’s National Health Service, build apps to do just that.) While the details are still being worked out, and may vary from agency to agency, Apple and Google said they recognized the importance of not allowing people to trigger alerts based on unverified claims of a COVID-19 infection. Instead, they said, people who are diagnosed will be given a one-time code by the public health agency, which the newly diagnosed will have to enter to trigger the alert.
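The one-time-code step described above is what keeps unverified self-reports from triggering alerts. Here is a minimal sketch of that gate; `HealthAuthority`, `issue_code`, and `submit_keys` are hypothetical names for illustration, not part of any published API:

```python
import secrets

class HealthAuthority:
    """Toy model of the verification step: an alert can only be
    triggered with a one-time code issued alongside a diagnosis."""

    def __init__(self):
        self.unredeemed = set()    # codes handed out but not yet used
        self.published_keys = []   # identifiers of confirmed cases

    def issue_code(self):
        # Given to a patient by the public health agency at diagnosis.
        code = secrets.token_hex(8)
        self.unredeemed.add(code)
        return code

    def submit_keys(self, code, keys):
        # Reject any upload that lacks a valid, unused code; this is
        # what stops unverified claims from triggering alerts.
        if code not in self.unredeemed:
            return False
        self.unredeemed.remove(code)   # codes are single-use
        self.published_keys.extend(keys)
        return True
```

Because the code is single-use and issued only by the agency, a prankster cannot flood the system with false positives, and a stolen code cannot be replayed.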
Fourth, the companies promised to use the system only for contact tracing, and to dismantle the network when it becomes appropriate. Some readers have asked me whether the system might be put to other uses, such as targeted advertising, or whether non-governmental organizations might be given access to it. Today Apple and Google explicitly said no.
Aside from bad-faith arguments from those who assume that this framework will be implemented in the dumbest possible way, I think the answers provided by Apple and Google representatives ought to assuage overall worries about the design of this contact tracing system. I am emphatically not saying that there are no further criticisms that can be levelled at it, only that they should be far more precise. It can be true that a reckoning is needed to correct the privacy failures of the past twenty years and also that this is a particularly unwelcome time to suddenly have that realization.