Worldcoin’s First Half-Million Test Users Were Treated More Like Non-Consenting Subjects

Eileen Guo and Adi Renaldi, MIT Technology Review:

Gunungguruh was not alone in receiving a visit from Worldcoin. In villages across West Java, Indonesia — as well as college campuses, metro stops, markets, and urban centers in two dozen countries, most of them in the developing world — Worldcoin representatives were showing up for a day or two and collecting biometric data. In return they were known to offer everything from free cash (often local currency as well as Worldcoin tokens) to AirPods to promises of future wealth. In some cases they also made payments to local government officials. What they were not providing was much information on their real intentions.

This left many, including Ruswandi, perplexed: What was Worldcoin doing with all these iris scans?

This is a distressing read. It seems that Worldcoin, based in San Francisco, recruited people — primarily in developing countries like Indonesia and Kenya — to scan the irises of hundreds of thousands of others without their full understanding or consent. The company says its privacy bona fides will improve as it grows, but it is providing little information about how it is treating the sensitive data it has collected so far, excusing these practices by pointing to its small size:

“I’m not sure if you’re aware of this,” he [Worldcoin CEO Alex Blania] said, “but you looked at the testing operation of a Series A company. It’s a few people trying to make something work. It’s not like an Uber, with like hundreds of people that did this many, many times.”
By the time we spoke to Blania in March, Worldcoin had already scanned 450,000 eyes, faces, and bodies in 24 countries. Of those, 14 are developing nations, according to the World Bank. Eight are located in Africa. But the company was just getting started — its aim is to garner a billion sign-ups by 2023.

If you are planning to scale from hundreds of thousands to a billion people in a year — a laughable goal, but bear with me — you cannot hide behind the excuse of being an early-stage startup. Exploiting poor people for their biometric data with financial incentives is scummy enough; treating privacy as a problem for later is inexcusable.