‘What seemed like science fiction is already here’: why it’s important to talk (seriously) about neurorights
In Paris, UNESCO gathered a group of experts to discuss the challenges and opportunities for humanity regarding neurotechnology
Recovering the ability to communicate after having lost it due to a degenerative disease. Turning switches on or off with your mind. Writing on the computer by just thinking about the words. All of this may sound like science fiction, but it is already reality. Humanity has had access to these abilities for several years, thanks to brain implants.
Scientists, government representatives and senior UN officials gathered this past Thursday in Paris to address the ethical problems arising from these significant advances in neurotechnology, while trying to draw up a roadmap to regulate this branch of engineering as soon as possible.
“UNESCO’s interest is to build the basis of an understanding about the challenges of neurotechnology, from an ethical perspective,” says Gabriela Ramos, deputy director-general of Social and Human Sciences at UNESCO. The specialized UN agency organized the meeting.
The biggest debate of all – according to the experts who spoke throughout the day – is the dichotomy between the technology’s development and respect for human rights. There’s an active search for a balance to guarantee that freedom of thought and privacy are respected, without stopping scientific research that can benefit humanity.
“What seemed like science fiction is already here. If we don’t act quickly, the same thing will happen as with the internet, social media or artificial intelligence, which got out of control,” explains Spanish neuroscientist Rafael Yuste, a professor at Columbia University. In addition to being an expert in deciphering the secrets of the human brain, Yuste is a pioneer in safeguarding the human rights that this technology can compromise. In 2017 – years before managing to manipulate the behavior of mice by intervening directly in their brains – the Spaniard created the Neurorights Foundation, to promote five fundamental principles: mental privacy, personal identity, free will, equal access to life-improving technologies and protection from biases.
The risk, explains Yuste, is that the same tools which – in medicine – can help improve people’s lives can also be used to exploit the information stored in the brain. “Even if the intended path is beneficial, these technologies are neutral and can be used for better or worse,” he notes. This isn’t only about securing personal data, such as shopping habits, a home address, or which political party one supports – it also involves things as intimate as memories and thoughts. And, in the not-so-distant future, even the subconscious.
“The same thing happens with foreign languages – at first, it’s easier to read them than to speak them. We’ve been reading the brain for a long time… there are about five or 10 years left until we’re also capable of manipulating it,” Yuste adds.
Regulation to protect neurorights
Back in 2021, Chile was the first country to spearhead neurorights, after the government introduced a constitutional amendment to protect brain activity. The reform – which was approved unanimously – recognizes the need to carry out any type of scientific and technological development “with respect for life and the physical and mental integrity of people.” Meanwhile, the Congress of Chile is continuing to work on a bill that codifies neurorights, according to the recommendations of a group coordinated by Yuste and made up of 25 international specialists in neuroscience, law and ethics.
“We understand that the regulation of these technologies cannot be based on fear of technological development, but rather on an unwavering commitment to human rights. And that’s a complex task, because the legislation has to leave enough space for innovation,” explains Carolina Gainza, the Chilean government’s undersecretary for Science and Technology. “This is why it’s important to promote a discussion informed by evidence and ethical awareness. [We need to] have an open mind to be able to imagine new possibilities that, until now, we haven’t even considered.”
Gabriela Ramos – who recently moderated the debate among the main international actors in this field – acknowledges that the Chilean model works, since it begins from a very basic concept: the idea that neural information shouldn’t be commercialized. “If we have a solid regulatory framework, with transparency and accountability, there are no reasons to be afraid of this [technological] revolution,” affirms the deputy director from UNESCO.
Currently, Yuste’s group is working in Brazil – the second country to introduce a constitutional amendment similar to the Chilean one. It will be voted on in the coming months by the Senate. Spain, for its part, published a Charter of Digital Rights – the first document of its kind in Europe – which came to light after more than a year of work. Multiple experts participated in this project, coordinated by Carme Artigas, secretary of state for Digitalization and Artificial Intelligence. “There are two aspects that concern me the most in this field. First, I think it’s important that we don’t make the same mistake as with artificial intelligence, when we let ourselves be guided by the industry, instead of the academic world,” the secretary acknowledged during her speech at the Paris forum. “Second, the potential benefits of this research must be made accessible, so that everyone can take advantage of these advances when it comes to health.”
The Spanish document – which isn’t yet legally applicable – sets out some initial bases that will guide future policy on tech. “The guidelines are a good place to start talking about [this subject] – there are many countries and international organizations that are doing it. However, they don’t solve the problem. What you have to do is really take charge of it and change the constitution to protect the citizenry,” Yuste emphasizes.
A market in private hands
The strong component of private investment in this type of technology is one of the factors that worries experts the most. A market analysis by Yuste’s Neurorights Foundation has calculated that more than $33 billion is currently invested in private neurotechnology projects – an exorbitant figure compared to the $10 billion invested globally in all publicly-funded research projects that examine the brain.
A fundamental role in this area is played by Milena Costas, a UN human rights expert. Together with her team, she’s working on a study into the impact, opportunities and challenges of neurotechnology, based on a questionnaire that is being supplied to various governments and international organizations. “The opportunities are endless. Especially in the medical field, when we talk about applications to make diagnoses and determine treatments for neurological diseases,” Costas explains. “What may be more problematic is the rapid commercialization of these technologies, which are already available on the market.”
The scenario is even more disturbing if we look at the results of another study that Yuste presented during the UNESCO conference. It will be published in September. After analyzing the consumer contracts of the 24 largest neurotechnology companies in the world – most of which are located in the United States and Canada – the researchers have been able to verify that all companies, without exception, take control of all user neural data. “Not only do they have [the data], but they can do whatever they want with it. This information can be destroyed, decoded, sold. And half of the companies make users pay to consult their own data,” Yuste laments. He didn’t want to provide the names of the firms in question.
Another of the emerging concerns from the scientific community is the difficulty of monitoring non-invasive technology, such as video-game headsets or fitness trackers, which already have access to a lot of information about the lives of their users. “From the point of view of patients, this is perhaps the most relevant revolution. But the fact that these are external devices, rather than implants, makes their regulation feel less urgent… this is a mistake,” Yuste warns. In fact, the progress of more daring projects that plan to implant chips in the brain – such as the case of Neuralink, Elon Musk’s company, which has been waiting for years to test its implants in humans – is slow, since activity depends on permission being granted by regulatory agencies.
With these questions in mind, Costas insists on the need to regulate this technology, thinking – above all – about the rights of the most vulnerable groups, such as children, people with disabilities and the elderly. “We must never forget that, despite the fact that advances may be advantageous, they cannot be accepted without [precautions] – [users shouldn’t have to] sacrifice mental privacy or freedom of thought. Trying to define the red lines more precisely isn’t a way of limiting the growth of this technology, but a boost for it to develop in the most useful way for humanity,” the expert affirms.