Kashmir Hill, New York Times:
For $29.99 a month, a website called PimEyes offers a potentially dangerous superpower from the world of science fiction: the ability to search for a face, finding obscure photos that would otherwise have been as safe as the proverbial needle in the vast digital haystack of the internet.
A search takes mere seconds. You upload a photo of a face, check a box agreeing to the terms of service and then get a grid of photos of faces deemed similar, with links to where they appear on the internet. The New York Times used PimEyes on the faces of a dozen Times journalists, with their consent, to test its powers.
PimEyes found photos of every person, some that the journalists had never seen before, even when they were wearing sunglasses or a mask, or their face was turned away from the camera, in the image used to conduct the search.
You do not even need to pay the $30 per month fee. You can test PimEyes’ abilities for free.
PimEyes disclaims responsibility for the results of its search tool through some ostensibly pro-privacy language. In a blog post published — according to metadata visible in the page source — one day before the Times’ investigation, it says its database “contains no personal information”, like someone’s name or contact details. The company says it does not even have any photos, storing only “faceprint” data and URLs where matching photos may be found.
Setting aside the question of whether a “faceprint” ought to be considered personal information — it is literally information about a person, so I think it should — perhaps you have spotted the sneaky argument PimEyes is attempting to make here. It can promote the security of its database and its resilience against theft all it wants, but its real privacy problems are created entirely by the features it markets on the front end. If its technology works anywhere near as well as advertised, a search will lead to webpages that do contain the person’s name and contact details.
PimEyes shares a problem common to all of these people-finding tools, no matter their source material: none seems dangerous in isolation, but each can coalesce and correlate different data points to create a complete profile. Take a picture of anyone, then dump it into PimEyes to find their name and, perhaps, a username or email address correlated with the image. Use a different people-based search engine to find profiles across the web that share the same online handle, or accounts registered with that email address. Each of those searches will undoubtedly lead to greater pools of information, and all of this is perfectly legal. The only way to avoid being a subject is to submit an opt-out request to services that offer it. Otherwise, if you exist online in any capacity, you are a token in this industry.
PimEyes users are supposed to search only for their own faces or for the faces of people who have consented, Mr. Gobronidze said. But he said he was relying on people to act “ethically,” offering little protection against the technology’s erosion of the long-held ability to stay anonymous in a crowd. PimEyes has no controls in place to prevent users from searching for a face that is not their own, and suggests a user pay a hefty fee to keep damaging photos from an ill-considered night from following him or her forever.
This is such transparent bullshit. Gobronidze has to know that not everybody using the service is searching for pictures of themselves or of people who have consented. As Hill later writes, PimEyes requires more stringent validation for a request to opt out of its results than for a request to search.
Update: On July 16, Mara Hvistendahl of the Intercept reported on a particularly disturbing use of PimEyes:
The online facial recognition search engine PimEyes allows anyone to search for images of children scraped from across the internet, raising a host of alarming possible uses, an Intercept investigation has found.
It would be more acceptable if this service were usable only by a photo subject or their parent or guardian. As it is, PimEyes stands by its refusal to gate image searches, permitting any creep to search for images of anyone else through facial recognition.