Worldcoin has the potential to receive as much press coverage as ChatGPT, both of them linked to one of the founders of OpenAI, Sam Altman. More and more people around the world are scanning their irises in exchange for cryptocurrency and, although Worldcoin maintains that all data protection laws are respected, there is a question that goes beyond this: the ethical one.
To understand the risks of this “eye for an eye and iris for crypto,” we spoke with Luis Corrons, cybersecurity expert at Avast.
"What Worldcoin is proposing is to create a 'global ID,'" he explains to us over the phone. "You scan your iris to uniquely identify yourself. As an incentive to attract users, they give you cryptocurrency that only you can access. In my opinion, people are registering on the platform and handing over their iris all too happily."
Here lies one of the keys to this dilemma: would so many people be willing to hand over information as private as their iris if there were nothing in exchange, in this case cryptocurrency? The reality is that collecting biometric data, especially the iris, which is unique to each individual and does not change over time, poses privacy and security risks. Corrons mentions some of them:
"First, data storage and management: how this sensitive data is stored, protected and managed to prevent unauthorized access or leaks. There is also the use of the data itself: concerns about how it could be used in the future, beyond the project's original purposes. And finally, informed consent, that is, the importance of ensuring that participants fully understand what the data collection entails and how the data will be used."
Worldcoin maintains that the scanned iris is used only to generate a unique code for that person, and that it is this code, not the iris image itself, that is handled afterwards. But things are rarely that simple when money is involved.
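To make the claim concrete, here is a minimal sketch in Python of the general idea of turning a biometric template into an irreversible identifier. The function name, the placeholder bytes and the use of a plain salted hash are illustrative assumptions, not Worldcoin's actual iris-code pipeline.

```python
import hashlib

def derive_identifier(iris_template: bytes, salt: bytes) -> str:
    """Derive an irreversible identifier from a biometric template.

    Illustrative only: real deployments use specialised iris-code
    algorithms and fuzzy matching, not a plain salted hash.
    """
    # A one-way hash means the raw template cannot be recovered
    # from the stored identifier.
    return hashlib.sha256(salt + iris_template).hexdigest()

# Hypothetical usage with placeholder bytes standing in for a real scan.
fake_template = b"\x01\x02\x03\x04"
print(derive_identifier(fake_template, salt=b"per-deployment-salt"))
```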
"There are also ethical implications," Corrons adds. "The idea of 'selling' or exchanging biometric data for cryptocurrency raises ethical questions. It stands to reason that many of the participants are motivated by economic need, which could influence their ability to make fully informed and voluntary decisions."
But what is the problem with our iris if we already use biometric data, such as our fingerprint, to unlock our phones or to open applications like our bank's? Again, it is not that simple.
"It is true that we use biometric identification every day on our smartphones," Corrons concludes, "whether through fingerprint recognition, face recognition and so on. But the big difference with what Worldcoin proposes is that in those cases the biometric information never leaves our device, so the privacy of our data is safeguarded."
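To illustrate the distinction Corrons draws, here is a minimal sketch, assuming a hypothetical app: on-device authentication compares a fresh scan against a locally stored template and only releases a pass/fail result, whereas a centralised scheme would have to transmit the biometric data itself. The names and the matching step are illustrative, not any vendor's real API.

```python
import hashlib
import hmac

# Hypothetical on-device flow: the enrolled template never leaves the phone.
ENROLLED_TEMPLATE = b"stored-locally-in-secure-hardware"  # placeholder bytes

def authenticate_locally(fresh_scan: bytes) -> bool:
    """Compare a fresh scan against the locally stored template.

    Only the boolean result (or a signed token) leaves the device;
    the biometric data itself stays local.
    """
    # Real systems use fuzzy matching on templates, not exact equality;
    # a constant-time comparison stands in for that step here.
    return hmac.compare_digest(
        hashlib.sha256(fresh_scan).digest(),
        hashlib.sha256(ENROLLED_TEMPLATE).digest(),
    )

# A centralised scheme, by contrast, would have to send `fresh_scan` to a
# server, which is exactly the exposure Corrons is pointing at.
```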
Today there may be no consequences and, as Corrons says, many people make the decision driven by economic necessity. But the fact that a private company has a way to uniquely identify millions of people around the world gives it enormous power.