Worldcoin: saviour of the Internet or simply a bad idea?

Job losses due to AI, disinformation sustained by bot networks, monopolisation of the internet: all of these issues have recently gained traction in the media. In particular, the corporate "AI arms race" provoked by ChatGPT, and the resulting problem of identifying real people on the web, have caused increasing concern among politicians, activists and human beings in general. The prospects are grim: Goldman Sachs estimates that around 300 million full-time jobs are at risk of being lost or diminished due to AI automation. And between Russian bot farms, deepfakes and Sybil attacks, it seems that the advances democracy has made over the last decades through freely spread information on the web might be lost to misinformation. But not all hope is lost: Tools for Humanity, a San Francisco-based startup, might have a solution. It is called Worldcoin, and it promises not only to create a way to securely identify unique humans on the web but also to establish the largest, most accessible, publicly owned financial network. Today we will take a deep dive into the world of Worldcoin, discussing how it is supposed to work, the criticism it has attracted, and whether the network will finally give people the freedom promised by Web3 or whether it is ultimately bound to fail, burning through 250 million investor dollars along the way.

Tools for Humanity, the developer behind Worldcoin, was first announced to the public in October 2022. Even at that point it was clear that the company was a big deal. It was backed by Sam Altman, the current Silicon Valley whiz and, ironically, the recently fired ex-CEO of OpenAI, as well as Andreessen Horowitz, Bain and the at that point still unconvicted Sam Bankman-Fried. The startup promised a solution to a problem that has bothered Web3 enthusiasts since the movement's inception: it aimed to create an Ethereum-based identity and financial network enabling efficient and secure proof-of-personhood, based on biometrics. This concept is fundamentally important to any Web3 project. The core idea of decentralised networks such as Worldcoin is that they are owned by their users. Logically, these networks should then be governed by the users' decisions rather than by some central authority like Google or Meta. However, the other core value of Web3, anonymity, creates a conflict: if you do not know whether a user is human and unique, how can you count their vote in the decision-making process? A reliable proof-of-personhood would solve this problem, and that is exactly what Tools for Humanity aims to achieve. In addition, the network is to contain a financial component: the Worldcoin token. This cryptocurrency would be the main method of exchange for users of the network, with a total of 10 billion Worldcoin tokens planned for release over the next 15 years. According to the company, every unique human being is eligible for a share of Worldcoin simply by being human. Additionally, as Altman claims, if the network is successful enough, it could make reliable financial instruments more widely available and prevent fraud in the distribution of social resources such as a universal basic income. Programs like this have already been successful.
An example is India's Aadhaar biometric system, which partially inspired Worldcoin and allowed the country to prevent more than $500 million in COVID relief funds from ending up in the hands of bad actors.

All of this sounds fabulous in theory, but how does it actually work? Worldcoin rests on three pillars: the WorldApp, the WorldID and the token itself. The WorldApp is the user-facing interface of the network. It stores the user's crypto wallets and WorldID, and is used to perform identification on other parts of the network. After downloading it, you receive a personalised QR code that identifies you. To use the app any further, you have to create a WorldID, the actual proof that you are unique and human. For that you need to find a device called an Orb. The Orb scans your iris, checks that the iris is real, and verifies that you have not registered on the network before. After that, a unique hash called the IrisHash is assigned to your WorldID and you officially become a member of the network. From then on, whenever you need to prove your "existence" on the web, you can do so with your private key. The proof itself is a so-called zero-knowledge proof, which means proving a fact without revealing any further information. So, in theory, proving your "existence" is completely anonymous: no one on the network knows which user you are, only that you are a real, existing human being. Finally, as already mentioned, the token itself is just a means of executing transactions; it is nothing more than another digital currency.
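The enrollment flow described above can be sketched as a toy model. To be clear, the names and the hashing step below are hypothetical simplifications, not the actual Worldcoin pipeline (the real Orb derives a noise-tolerant iris code before hashing), but they illustrate the core idea: one IrisHash per unique iris, and duplicates are rejected.

```python
import hashlib


class ToyOrb:
    """Toy sketch of Orb enrollment: admit a user only if their
    IrisHash has never been seen before. Hypothetical model."""

    def __init__(self):
        self.registered = set()  # IrisHashes seen so far

    def enroll(self, iris_code: bytes):
        # In reality the Orb reduces the scan to a fuzzy iris code
        # robust to lighting and noise; here we hash raw bytes.
        iris_hash = hashlib.sha256(iris_code).hexdigest()
        if iris_hash in self.registered:
            return None  # duplicate: this iris already has a WorldID
        self.registered.add(iris_hash)
        return iris_hash  # becomes part of the user's WorldID


orb = ToyOrb()
first = orb.enroll(b"alice-iris-template")   # new iris -> IrisHash
second = orb.enroll(b"alice-iris-template")  # same iris -> rejected
print(first is not None, second)  # True None
```

The interesting property is that the set only ever stores hashes: uniqueness can be enforced without keeping the raw biometric around, which is essentially what Worldcoin promises to do once its algorithm is fully trained.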

Right now, the network already has over 2.4 million unique signups, and Orbs are available in over 30 countries. So far it is governed by the Worldcoin Foundation; in the future, however, it is supposed to become fully decentralised, in the sense that users will be able to make decisions on a one-person-one-vote basis.
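The one-person-one-vote idea can be sketched with a toy nullifier scheme, a common Web3 pattern (this is an illustration of the concept, not Worldcoin's actual protocol). Each person derives a per-ballot "nullifier" from a private secret; the tally accepts one vote per nullifier, so double voting is blocked while the tally never learns which WorldID cast which vote. In the real system, a zero-knowledge proof would additionally show the nullifier belongs to some registered WorldID.

```python
import hashlib


def nullifier(secret: bytes, ballot_id: str) -> str:
    # Deterministic per (person, ballot): voting twice reproduces
    # the same nullifier; different ballots yield unlinkable ones.
    return hashlib.sha256(secret + ballot_id.encode()).hexdigest()


class Ballot:
    """Toy one-person-one-vote tally keyed by nullifiers."""

    def __init__(self, ballot_id: str):
        self.ballot_id = ballot_id
        self.seen = set()  # nullifiers already spent
        self.tally = {}    # option -> vote count

    def vote(self, secret: bytes, option: str) -> bool:
        n = nullifier(secret, self.ballot_id)
        if n in self.seen:
            return False   # second vote by the same person rejected
        self.seen.add(n)
        self.tally[option] = self.tally.get(option, 0) + 1
        return True


b = Ballot("proposal-1")
b.vote(b"alice-secret", "yes")   # counted
b.vote(b"alice-secret", "no")    # rejected: same nullifier
b.vote(b"bob-secret", "no")      # counted
print(b.tally)  # {'yes': 1, 'no': 1}
```

Note that the ballot stores only hashes of secrets, never identities, which is what makes anonymous-but-unique voting plausible in the first place.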

This certainly sounds good: humanity gets a more accessible way to execute transactions, the network will be communally owned, and at the same time it will solve the problem of secure identification online. So where is the catch? Well, there is more than one issue.

First and foremost, there is the obvious concern about data security. The company gathers enormous amounts of biometric information when identifying its users, as well as contact information such as email addresses and phone numbers. This has already prompted several countries, including Germany and France, to launch investigations into the company. Others, such as Kenya, have prohibited Worldcoin from signing up any more people in their jurisdiction. For a project that has been operational for only a little over a hundred days, that is an awful lot of regulatory attention. One cause for concern is, of course, the iris scans themselves. The company claims they will be deleted once they have sufficiently trained its AI algorithm, but it is unclear when that will happen. The company also promises that once the algorithm is complete, all new iris scans will be deleted immediately after the person has been verified as human and unique. Even if we take the Worldcoin whitepaper at its word, though, the Orb operators could still present an issue. They are private contractors with physical control over the Orb, and so far it is unclear whether a bad-faith operator could export the scans before deletion through some sort of backdoor. Finally, the company may already have broken the European Union's General Data Protection Regulation (GDPR) when assembling the first 500 thousand users for its beta test. During that period Worldcoin's data consent form contained language such as "we have not adopted a board-approved data privacy and security policy describing the means and the methods by which we plan to protect your Data to meet the standards prevalent in the GDPR", among many other similarly vague statements. Since Tools for Humanity has a subsidiary based in Germany, it is subject to the GDPR and therefore legally required to protect data in accordance with the regulation.
This violation alone could result in fines of up to 20 million euros or 4% of global revenue. Quite a blow for a young company. 

The second big concern is the set of practices the company deployed in order to gain users. An MIT Technology Review investigation revealed that the company was, and might still be, engaged in dubious marketing schemes and consent manipulation. During the beta test of the Orb and the network as a whole, the company signed up a disproportionately large number of users from developing countries. The participants, often people who had lost their jobs or source of income during the pandemic, were approached by Orb operators and offered financial assistance in return for submitting their biometric data and thereby joining the network. The assistance came in different forms and amounts: some people received Worldcoin or other cryptocurrencies, others received cash. In one instance, after struggling to explain digital currencies to people in Sudan who had no email addresses, the company simply ran an Apple AirPods giveaway among the participants. A further issue is that participants often did not realise the collection of biometric data was unrelated to their government. Signups often took place in the same areas where the government distributed COVID-related assistance, and local officials were frequently present. In some instances, those officials are said to have received money for "coffee and cigarettes", a local expression for paying public servants to facilitate certain actions. Furthermore, many participants were unclear about why their data was being collected at all; many did not know it would be used to train an AI algorithm. There is more than one report of the consent form not being available in the country's native language, so that only a small proportion of the population could understand it. Finally, at the time of the beta test, Worldcoin was impossible to trade: until July 2023, not a single major cryptocurrency exchange had listed it.
Those unlucky enough to receive Worldcoin as compensation were therefore essentially paid in a useless IOU. This is made particularly wretched by the company's risk page, which contains the following formulation: "WLD tokens may in the future be usable as payments and do not convey any ownership rights to you." In its rebuttal, the company claimed that all of the incidents were isolated and could be attributed to the actions of individual Orb operators. The operators, for their part, counter that the company provided barely any information about how the data collection should be handled, leaving them to rely solely on their own marketing skills. Even if Worldcoin's claim is true, then, there was more the company could have done from the beginning.

The third concern is that the network is ultimately bound to be centralised and therefore defies core Web3 values. Firstly, the entire identity network is built on a piece of hardware, the Orb, which is so far produced in a single factory somewhere in Germany. How can a decentralised network function if it is bound to a centrally produced piece of hardware? The company's answer is that in the future there can be other manufacturers of the Orb. And this is true: since all of the Orb's hardware is open source, it could be manufactured by other companies. However, the majority of the software behind the Orb is not yet open source, so other companies would essentially just be making a chrome ball. Even if we assume the company publishes its entire software stack, there is still the issue of licensing: how do we make sure a given Orb manufacturer has not tampered with the software, and who will be checking? This brings us back to the centralisation issue. Finally, the AI iris-detection algorithm is a sword of Damocles hovering over the entire network. Let's imagine for a second that Tools for Humanity shares its algorithm with Orb manufacturers. Couldn't this algorithm then be used to build an AI that generates iris images? The answer is yes, and since Worldcoin now sits on the largest privately owned iris database, the generated images could be indistinguishable from the real thing. There are already reports of people creating fake accounts using generated iris images. Imagine what would happen if Worldcoin's algorithm became public: any individual in possession of it could immediately take control of the entire network.

In conclusion, it seems that Sam Altman has tried to solve an issue that, to some extent, he created. To that end he took an idea dear to all cypherpunks, zero-knowledge proof-of-personhood, and tried to bring it to life. The result is a flawed, centralised, hardware-based monstrosity of a network that is more reminiscent of a dystopian science-fiction novel than of something that could change the world for the better. In my opinion, the biggest issue the company has had since its inception is a lack of understanding of how fundamental the challenge ahead is. This is clearly visible in the company's goals: Tools for Humanity hoped to have over a billion unique signups on its network by 2023; in case you have already forgotten, it has barely 2.4 million users. The company did not grasp the difficulty of balancing the Web3 values it promised to uphold against a pragmatic approach to the problem. This was worsened by its, to say the least, doubtful marketing and consent strategies during the beta test, which produced a deep sense of distrust, not only from the crypto and tech communities, but more importantly from the people it exploited: the very people the company originally planned to help by making financial instruments more accessible. If the company survives the legal challenges it currently faces, I have no doubt it will be able to secure more funding. But honestly, to what end?

P.S. For readers interested in learning more about proof-of-personhood, I can recommend this blog entry by Ethereum's creator, Vitalik Buterin.

About this article

Written by:
  • Mikhail Muradov
| Published on: Nov 18, 2023