The Office of the Privacy Commissioner of Canada has been investigating Clearview’s behaviour since Kashmir Hill of the New York Times broke the story a little more than a year ago. In its overview, the Office said:
Clearview did not attempt to seek consent from the individuals whose information it collected. Clearview asserted that the information was “publicly available”, and thus exempt from consent requirements. Information collected from public websites, such as social media or professional profiles, and then used for an unrelated purpose, does not fall under the “publicly available” exception of PIPEDA, PIPA AB or PIPA BC. Nor is this information “public by law”, which would exempt it from Quebec’s Private Sector Law, and no exception of this nature exists for other biometric data under LCCJTI. Therefore, we found that Clearview was not exempt from the requirement to obtain consent.
Furthermore, the Offices determined that Clearview collected, used and disclosed the personal information of individuals in Canada for inappropriate purposes, which cannot be rendered appropriate via consent. We found that the mass collection of images and creation of biometric facial recognition arrays by Clearview, for its stated purpose of providing a service to law enforcement personnel, and use by others via trial accounts, represents the mass identification and surveillance of individuals by a private entity in the course of commercial activity. We found Clearview’s purposes to be inappropriate where they: (i) are unrelated to the purposes for which those images were originally posted; (ii) will often be to the detriment of the individual whose images are captured; and (iii) create the risk of significant harm to those individuals, the vast majority of whom have never been and will never be implicated in a crime. Furthermore, it collected images in an unreasonable manner, via indiscriminate scraping of publicly accessible websites.
The Office said that Clearview should entirely exit the Canadian market and remove data it collected about Canadians. But, as Kashmir Hill says, it is not a binding decision, and it is much easier said than done:
The commissioners, who noted that they don’t have the power to fine companies or make orders, sent a “letter of intention” to Clearview AI telling it to cease offering its facial recognition services in Canada, cease the scraping of Canadians’ faces, and to delete images already collected.
That is a difficult order: It’s not possible to tell someone’s nationality or where they live from their face alone.
The weak excuse for a solution that Clearview has come up with is to tell Canadians to individually submit a request to be removed from its products. To be removed, you must give Clearview your email address and a photo of your face. Clearview assumes it is entitled to run facial recognition on every single person whose images are available unless they manually opt out. It insists that it does not need consent because the images it collects are public. But, as the Office correctly pointed out, the transformative use of these images requires explicit consent:
Beyond Clearview’s collection of images, we also note that its creation of biometric information in the form of vectors constituted a distinct and additional collection and use of personal information, as previously found by the OPC, OIPC AB and OIPC BC in the matter of Cadillac Fairview.
In our view, biometric information is sensitive in almost all circumstances. It is intrinsically, and in most instances permanently, linked to the individual. It is distinctive, unlikely to vary over time, difficult to change and largely unique to the individual. That being said, within the category of biometric information, there are degrees of sensitivity. It is our view that facial biometric information is particularly sensitive. Possession of a facial recognition template can allow for identification of an individual through comparison against a vast array of images readily available on the Internet, as demonstrated in the matter at hand, or via surreptitious surveillance.
The Office also found that scraping online profiles does not match the legal definition of “publicly available”.
This is such a grotesque violation of privacy that there is no question in my mind that Clearview and companies like it cannot continue to operate. United States law has an unsurprisingly permissive attitude towards this sort of thing, but its failure to legislate at a national level should not be exported to the rest of the world.
Unfortunately, this requires global participation. Every country must have better regulation of this industry because, as Hill says, there is no way to determine nationality from a photo. If Clearview is outlawed in the U.S., what is to stop it from registering in another jurisdiction with similarly weak regulation?
Clearview is almost certainly not the only company scraping the web with the intent of eradicating privacy as we know it. Decades of insufficient regulation have brought us to this point. We cannot give up on the basic right to privacy. But I fear that it has been sacrificed to a privatized version of the police state.
If a government directly created something like the Clearview system, it would be seen as a human rights violation. How is there any moral difference when it is instead created by private industry?