Lucía Camacho: ‘In the iris business, it’s no coincidence that World Foundation has focused on Latin America’
The public policy coordinator at the NGO Derechos Digitales discusses the decisions made by regulatory authorities in Colombia, Brazil, and Chile regarding the company linked to Sam Altman. She asks: who controls the proof that we are human, for what purposes, and under what safeguards?

Over the past three years, several Latin American countries have witnessed the arrival of the Orb, a futuristic-looking spherical device used to read irises and capture biometric data. This striking technology was developed by World Foundation, the organization created by Sam Altman, a leading figure in artificial intelligence and CEO of OpenAI, together with its operational partner, Tools for Humanity. The devices have been installed in shopping malls, gas stations, and other locations in Colombia, Chile, and Brazil.
Thousands of Latin Americans have lined up in front of that giant eye to hand over their personal data in exchange for cryptocurrencies or tokens, an operation that is generating massive biometric databases. Regulatory authorities in Chile, Brazil, and Colombia have responded with decisions that clash, however, with a complex business model.
At the end of 2025, Colombia, through its Superintendency of Industry and Commerce (SIC), ordered the immediate and definitive closure of the operations of World Foundation and Tools for Humanity. The ruling has become a point of reference for biometrics experts. Lucía Camacho, a Colombian public policy coordinator at the NGO Derechos Digitales (Digital Rights), warns about the scale of the iris scanning business and the lack of controls in the region. “They present it as something harmless, but it’s a phenomenon that forces us to ask who controls the proof that we are human, for what purposes, and under what safeguards.”
Question. How do these companies land in Latin America?
Answer. They arrive through local logistics operators. They hire individuals or companies based in the country, hand them the Orb, and give them very general instructions: ask people to register, find a well-lit space. There is no adequate training on key issues such as consent. Operators are not trained to verify that the person registering is of legal age, something especially sensitive because minors have been registered, as happened in Chile. Nor are they taught how to explain what consent entails, what the compensation consists of, how to withdraw data, or how to exercise rights over that personal information.
Q. Where does the scanned data go?
A. The Orb is one of the most sophisticated facial and iris recognition technologies on the market. It captures the data and transmits it to remote servers; according to the company, that transmission is fast and secure, but flaws have already been identified. The iris image ceases to exist as such: algorithms transform it into long strings of fragmented pattern code that are stored on servers belonging to actors considered “trusted,” primarily in the United States. The company’s argument is that, having applied a technological process, “the data is no longer personal; therefore, we no longer have obligations in this matter.” This is the major problem.
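To make that argument, and the regulators’ objection to it, concrete: below is a minimal Python sketch of the kind of one-way transformation Camacho describes. World’s actual pipeline is not public, so the hashing step, the fragmentation scheme, and every name here are hypothetical stand-ins, not the company’s real system.

```python
import hashlib
import secrets

def iris_to_code(iris_template: bytes) -> str:
    """Derive an irreversible code from a biometric template.

    In a real system, feature extraction would make repeated scans of the
    same eye yield matching templates; here a SHA-256 hash of a template
    stands in for that whole pipeline (an assumption, not World's method).
    """
    return hashlib.sha256(iris_template).hexdigest()

def fragment(code: str, parts: int = 3) -> list[str]:
    """Split the code into shards, mimicking the idea of pieces held by
    several "trusted" servers (again, a hypothetical scheme)."""
    step = len(code) // parts
    return [code[i * step:(i + 1) * step] for i in range(parts)]

template = secrets.token_bytes(64)  # stand-in for an extracted iris template
code = iris_to_code(template)

# The image can be discarded, but the code is stable and unique per template:
# the same input always yields the same code. That persistence is why
# regulators can still treat the output as personal data after the transform.
assert iris_to_code(template) == code
print(fragment(code))
```

The sketch shows where the dispute lies: the image is gone, yet the derived code still singles out one person.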
Q. You say that’s the toughest nut to crack in data protection. Why?
A. Many Big Tech companies are using technologies that allow them to transform personal data into other types of information, which, in the long run, makes it easier for them to evade data protection authorities. Their argument is that without personal data, there are no longer any affected individuals, and therefore they no longer have to comply with the law.
Q. What happens if someone wants their data deleted?
A. In Chile, some parents, upon learning that their children had submitted their iris scans to this company, requested protective measures and demanded the deletion of the data. They initiated a legal battle in which they were told, for example, that the company is not based there. Furthermore, before the Brazilian authorities and in multiple documents, the company stated that it cannot delete the collected information.
Q. The other problem is the compensation offered in exchange for the iris.
A. Yes. People who register are asked to download an app, because World Foundation not only operates the Orb but also offers a digital wallet through which it distributes cryptocurrency as compensation. They call it an incentive, not payment. This point was especially problematic in Brazil, where the authorities argued that compensation cannot be given in exchange for personal data, because that implies commodifying identity, which is prohibited. World’s defense was that the “incentive” is not mandatory: those who don’t want to receive it don’t have to download the app. However, when people in line were asked why they were there, the answer was clear: they were there for the incentive.
Q. What is the scope of Brazil’s decision?
A. The Brazilian data authority ordered Worldcoin (a company created by Altman) to suspend cryptocurrency distribution through iris scanning, stating that, given the vulnerability of the target population, the incentives become lifelines that ultimately exploit the data of those with no other economic alternative. Without this “incentive,” as they call it, the company would not have obtained the iris scans of more than 10 million people worldwide by January 2025. People are not participating to contribute to training an algorithm or because they believe in Worldcoin’s good intentions and the need to refine technologies that benefit humanity, but rather for the money.
Q. In Chile, a gender gap was also evident.
A. An investigation by the NGO Amarantas found that single mothers with economic needs made up the majority of those who sold their biometric data, because the $50 or $70 they received covered their expenses for a day. Minors also did so, seeking money to spend on video games.
Q. Is it a business that is based on inequality?
A. Social elites and those in advantageous economic positions haven’t even heard of this. Worldcoin’s target audience is found in public transportation stations or working-class shopping malls. They target very specific audiences, who are drawn in because the Orb is eye-catching. In addition, they advertise using cryptocurrency language, which appeals to many people.
Q. Why did they choose Latin American countries?
A. Firstly, in our region there is no friction when it comes to identification: we are used to handing over our ID, fingerprints, or other data to verify who we are or access services, unlike in Europe and countries like the United States or the United Kingdom, where there isn’t even a centralized identification system. Secondly, Latin America faces weaknesses in the rule of law when it comes to enforcing decisions by authorities against foreign companies. Europe doesn’t have this problem: there, large technology companies have been required to establish headquarters within its territory to guarantee clear rules on consumer protection, data protection, and intellectual property. It’s no coincidence, then, that World has focused on our region.
Q. How do the authorities in the region exercise control?
A. The main official reactions against the company have come from the Global South — from Kenya and Latin America — while in Europe only a few countries, such as Germany, Portugal, and Spain, have opened specific investigations. The authorities in the region are doing what they can with the resources they have, and they are acting on their own timelines. But their decisions are noteworthy.
Q. Why is the decision by the Colombian authority so noteworthy?
A. The SIC’s decision sets a significant precedent. In addition to suspending World’s operations for violating national data protection legislation, it imposed the obligation to delete all personal information in its custody, both in the virtual wallet and on the Orb, where everything remains a mystery. It also raises questions that transcend technology and become profoundly political: What price are we willing to pay to prove our existence? And above all, what is the ethical and legal responsibility of those who generate inscrutable technological “solutions” to problems they have partly helped create?
Q. Can they really implement those measures?
A. The dilemma shared by the authorities in our countries is how to act in the face of non-compliance with their decisions. World admitted to Brazil that it cannot delete the data. So there are open questions: how will the company guarantee that the information, even the portion that has been anonymized, has been completely removed from the datasets used to train its algorithms? And how will it commit to ensuring that no records of the hash code remain in the hands of “trusted third parties”?
Q. You claim that in the case of Colombia, World’s strategy was to dispute the proceedings and feign ignorance.
A. As can be inferred from the proceedings, the response from World Foundation and Tools for Humanity was predictable: alleging due process violations, specifically supposed errors in notifying World Foundation of the authority’s decisions when Tools for Humanity was the one responsible for responding, or vice versa. Yet over a year and a half, as the proceedings make clear, they acknowledged receipt of every communication and request from the authority. The case raises questions about how far authorities can go and how to ensure companies fulfill their legal responsibilities.
Q. Have you identified a pattern for this company in the region?
A. The toolbox they have for exerting pressure in our countries is very diverse. It ranges from legislative lobbying and public relations investment to claims that these are positive measures for the country, to judicial intimidation. The operations of these transnational companies have been designed in such a sophisticated and well-oiled manner, with such complex technologies, that they hinder the practical exercise of a right we consider crucial today: the right to consent. Not only to provide data, but also to have it withdrawn when citizens decide to do so.
Q. What can this data actually be used for?
A. The company uses that information to train a sophisticated digital identification algorithm, which they call “proof of humanity.” But they also plan to sell that identification algorithm to anyone who wants or needs to use it.
Q. Who can buy it?
A. They’re already running pilot programs, as in South Korea, where they’re integrating World into services like Tinder to address the problem of fake accounts. The underlying goal isn’t to save the internet, but to refine a biometric technology, the Orb and its algorithm, to sell it at scale, even to governments.
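As a rough illustration of how such an integration could work, here is a hypothetical Python sketch of a platform blocking duplicate accounts with a personhood proof. The names, fields, and flow are invented for this example; World’s real interface is not documented here.

```python
# Hypothetical sketch: a platform enforcing "one human, one account" using an
# opaque personhood identifier. All names are invented for illustration.

used_proofs: set[str] = set()  # proof identifiers already bound to an account

def register_account(username: str, personhood_proof: str) -> bool:
    """Accept a new account only if this proof hasn't been used before.

    The proof is assumed to be an opaque, per-person token issued after an
    iris scan: it reveals nothing about the iris itself, but the same person
    always presents the same token, so a second account is rejected.
    """
    if personhood_proof in used_proofs:
        return False  # same human attempting a duplicate account
    used_proofs.add(personhood_proof)
    return True

print(register_account("ana", "proof-7f3a"))    # True: first account
print(register_account("ana_2", "proof-7f3a"))  # False: duplicate human
```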
Q. In what type of state processes could it be used?
A. World could present itself to governments for electoral processes, offering the Orb as an easily accessible identification technology that has been trained on more than seven billion records. This technology is trained on enormous volumes of data, which are now the main asset of AI, and needs to continue capturing data over time to adapt to physical changes such as aging or eye diseases.
Q. Are they already making money from this iris data?
A. The company is absorbing large venture capital investments without immediate returns: the strategy is to monetize once the technology is offered on the market to those who need to solve the increasingly critical problem of identity, especially the protection of children. That problem is driving demand for age and identity verification systems on platforms, a scenario in which World presents itself as the one that “holds the key.”
Q. You have warned that they can also be used for surveillance without transparency.
A. Today, Big Tech companies are cooperating with states in unprecedented ways, especially with authoritarian regimes, which allow them to operate without oversight or accountability. Consider cases like Meta, which has begun handing over to governments the private social media messages of women seeking abortions. Therefore, caution is necessary both with the state, given its history of abuses and increasingly sophisticated mass surveillance, and with the private sector, which currently inspires little confidence as well.