Worldcoin is scanning eyeballs to build a global ID and finance system. Governments are not impressed
Millions of people worldwide are lining up to stare into a silver sphere about the size of a bowling ball so their irises can be scanned in exchange for online identity verification and “free” cryptocurrency.
The silver spheres, known as “Orbs”, are part of the Worldcoin platform, which officially launched in July 2023 after an 18-month testing phase. Led by Sam Altman (chief executive of OpenAI, the company behind ChatGPT) and entrepreneur Alex Blania, Worldcoin offers users a “digital passport” known as World ID and small allocations of a cryptocurrency token also called Worldcoin (WLD), “simply for being human”.
Worldcoin aims to provide a “proof of personhood” to distinguish humans from artificial intelligence (AI) systems online.
However, critics say the company is essentially bribing people to hand over highly sensitive biometric data. Governments are taking note: the Worldcoin platform has already been suspended in Kenya, and is under investigation in several other countries.
Gaze into the Orb
Users can download the World App to their mobile phone, then find their “nearest Orb”. The Orb uses iris scans to uniquely identify a person.
Once the person has their iris scanned, they receive a World ID which will function as an online ID much like a Google or Facebook login. World ID is meant to be different because it can prove the user is human – and more private, because it does not link to other personal information about the user.
Despite the “digital passport” label, World ID is not intended to reveal or verify a user’s identity in the conventional sense. It merely establishes the user as “a unique and real person”, rather than a bot.
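To make that distinction concrete, here is a minimal sketch in Python of the difference between a conventional login, which can expose personal details, and a bare proof-of-personhood check, which exposes only a single “this is a unique human” claim. It is an illustration only, not the World ID API, and every name in it is hypothetical.

```python
from dataclasses import dataclass

# Conceptual sketch only -- this is not the World ID API. It illustrates the
# difference between a conventional login, which exposes personal details,
# and a bare "proof of personhood" check. All names here are hypothetical.


@dataclass
class IdentityLogin:
    """Roughly what a Google- or Facebook-style login can reveal about a user."""
    name: str
    email: str
    date_of_birth: str


@dataclass
class PersonhoodProof:
    """Roughly what a World ID-style check is meant to reveal: one claim only."""
    is_unique_human: bool


def allow_posting(proof: PersonhoodProof) -> bool:
    """Gate an action on humanness alone, without touching personal details."""
    return proof.is_unique_human


print(allow_posting(PersonhoodProof(is_unique_human=True)))  # True
```

The point of the design is that a service can refuse bots without ever learning who the human behind the proof is.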
In most countries, the user is also entitled to units of WLD cryptocurrency once their iris scan is complete.
The Worldcoin website currently lists 60 Orb locations worldwide, mostly in Europe, Asia, North America and South America, and notes there will also be Orb “pop-ups”.
At the time of writing, there appear to be no Orb locations in Australia, so people in Australia cannot earn WLD tokens “for being human”. But they can purchase WLD via certain cryptocurrency exchanges and download the World App, which also functions as a cryptocurrency wallet.
Cash for eyeballs jeopardises human rights
Altman is a key player in the AI boom that supposedly makes Worldcoin necessary, so critics have suggested he is “simply profiting from both AI’s problem and solution”.
When the Worldcoin platform officially launched, after signing up some 2 million users in a testing phase, Altman said the Orbs were scanning a new user every eight seconds.
In Kenya, the launch saw “tens of thousands of individuals waiting in lines over a three-day period to secure a World ID”, which Worldcoin attributed to “overwhelming” demand for identity verification.
Independent reporting suggests the promise of “free” cryptocurrency was a more common motive. In most locations, Worldcoin offers a “genesis grant” of 25 units of its WLD cryptocurrency when users scan their irises. (The value of WLD fluctuates, but the grant has been worth around US$50, or A$75, over the past month.)
People queuing for the Orb in Kenya told the BBC, “I want to register because I’m jobless and I’m broke,” and, “I really like Worldcoin because of the money. I’m not worried about the data. As long as the money comes.”
Orb operators are also paid for each user they sign up.
Critics have labelled this strategy of paying people to have their irises scanned as dystopian and equivalent to bribery.
Offering money for sensitive data arguably makes privacy – a human right – a luxury only the wealthy can afford. People experiencing poverty may risk future harms to meet their immediate survival needs.
‘Cataloguing eyeballs’: the risks of using biometric data
Worldcoin uses irises for verification because every iris is unique and therefore difficult to fake. But the risks of handing over such data are very high. Unlike a driver’s licence or a passport, you cannot replace your iris if the data is compromised.
Surveillance whistleblower Edward Snowden has criticised Worldcoin for “cataloguing eyeballs”, and tweeted about the unacceptable risks:
Don’t use biometrics for anything. […] The human body is not a ticket-punch.
Worldcoin claims the iris scans are deleted after being converted into a unique iris code, which becomes the user’s World ID. The World ID is then stored on a decentralised blockchain, with the aim of preventing fakes or duplicates.
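To make the idea of a “unique iris code” concrete, here is a minimal sketch in Python of the general deduplication pattern described above: derive a fixed, non-reversible code from a biometric template, store only that code, and reject repeat enrolments. It assumes a plain cryptographic hash and an in-memory set purely for illustration; Worldcoin’s actual iris-encoding model, blockchain storage and proof system are not detailed in this article, and every name below is hypothetical.

```python
import hashlib

# Conceptual sketch only: a plain SHA-256 hash and an in-memory set stand in
# for Worldcoin's iris-encoding model and decentralised storage.
# All names are hypothetical.

registry: set[str] = set()  # stands in for the shared store of existing iris codes


def derive_iris_code(iris_template: bytes) -> str:
    """Turn a raw iris template into a fixed-length, non-reversible code."""
    return hashlib.sha256(iris_template).hexdigest()


def enrol(iris_template: bytes) -> str | None:
    """Enrol a person, refusing a second ID for an already-registered iris."""
    code = derive_iris_code(iris_template)
    if code in registry:
        return None            # duplicate iris -> no second World ID
    registry.add(code)
    return code                # the derived code, not the raw scan, is stored


first = enrol(b"raw-iris-template-of-one-person")   # returns a new code
second = enrol(b"raw-iris-template-of-one-person")  # returns None: duplicate
```

In practice, two scans of the same iris are never bit-identical, so real iris-recognition systems compare codes within a similarity threshold rather than by exact hash match; the sketch glosses over that, but the storage pattern of keeping only a derived code rather than the raw scan is the one Worldcoin describes.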
That deletion is not automatic, however: the raw iris scan is only deleted if the user opts for the “Without Data Storage” option (which may mean they need to return to an Orb to re-verify in the future). If the user selects the “With Data Storage” option, Worldcoin states the iris scan is sent via encrypted communication channels to its distributed data stores, where it is encrypted at rest.
In either case, the user must simply trust the company to delete the biometric data, or appropriately secure it against misuse.
There have been many instances in which Silicon Valley companies have promised to secure data and to strictly limit its use, only to break those promises by disclosing the data to other companies or government agencies or failing to secure it against attack.
Journalist Eileen Guo also points out that Worldcoin has not yet clarified whether it still uses stored biometric data to train AI models and whether it has deleted biometric data collected during its test phase.
And despite the supposed security of biometric scanning, there have already been reports of fraudulent uses of the Worldcoin system. For example, black market speculators are alleged to have persuaded people in Cambodia and Kenya to sign up for Worldcoin and then sell their World IDs and WLD tokens for cash.
Regulatory action
Regulators in several countries are taking action. The Kenyan government has now suspended Worldcoin’s activities, stating regulatory concerns surrounding the project “require urgent action”.
The Communications Authority of Kenya and Office of the Data Protection Commissioner say they are concerned about the offer of money in exchange for consent to data collection; how securely the data are stored; and “massive citizen data in the hands of private actors without an appropriate framework”.
The German privacy watchdog is investigating Worldcoin’s business practices with support from the French privacy regulator, which called Worldcoin’s data practices “questionable”. The UK Information Commissioner’s Office has announced it will investigate Worldcoin, referring to the high risk of processing special category biometric data.
While there are no Orbs in Australia yet, the federal privacy regulator has previously found some companies in breach of privacy law for failing to obtain valid consent for the use of biometric data and collecting it when it was not reasonably necessary.
Katharine Kemp, Associate Professor, Faculty of Law & Justice, and Deputy Director, Allens Hub for Technology, Law & Innovation, UNSW Sydney
This article is republished from The Conversation under a Creative Commons license. Read the original article.