
Elon, the monkeys and you: the atrocity of giving up control of our thoughts

The owner of Neuralink recently announced that he has implanted one of his devices into a human, without offering more technical details beyond the fact that the subject has survived

Elon Musk next to the logo of Neuralink, his neurotechnology company that now implants chips in humans. DADO RUVIC (REUTERS)
Paloma Llaneza

We can hate others from the intimacy of our minds. We are the owners of our thoughts and our silence. We have the right not to testify against ourselves and to take advantage of the Fifth Amendment. Some of us think in words; others, in images. But we all take refuge in that hardware that is our brain, retreating there to criticize our boss, hate our roommate who doesn’t pay the bills, or lust after the neighbor’s wife. As long as we don’t act on those bad intentions, we’re protected by the confidentiality that the mind offers. And that’s very powerful… so powerful that it guides revolutions, drives hostile takeovers, invents realities — such as states and laws — and disrupts social peace. Religions have even attempted to program us not to have impure thoughts and to cap our imaginations, lest the line between intention and action be lost.

Which ruler wouldn’t want to know what’s going on in their citizens’ heads? How many fights between couples would end if you could actually verify what a partner is thinking? Interesting people would be unmasked amidst their silence; police would read the minds of the people they interrogate.

In the face of everyone else, our thoughts are a space of rest. We are what happens in our heads: the conscious, the unconscious, our ego and our superego. That’s why two recent pieces of news have plunged me into unrest.

I’ve been ruminating on the first since Elon Musk decided to create a company to develop brain chips. I had my first shock when he killed a dozen monkeys that had been implanted with these devices. After that, he asked for human volunteers… and he got them.

Through his Twitter account (I refuse to call it “X”), Musk announced that his company had implanted one of its devices into a human, without giving more technical details other than noting that the subject had survived. We don’t know for sure whether the implant recipient has any disease related to motor skills, but what Musk has promised everyone is that we’ll be able to use our mobile phones with nothing but our minds.

Anyone who has observed how he’s been managing his social media site should have no doubt about what will happen to their identity, ideas and secrets if they decide to put them in the hands of a depraved oligarch.

The other piece of news that startled me was that the director of Spain’s Data Protection Agency recently told EL PAÍS that — driven by the legitimate mission of protecting minors from the evils that lurk behind the screens — “the agency is going to collaborate in the preparation of [legislation] for the comprehensive protection of minors on the internet, with the inclusion of so-called ‘neurorights.’ According to experts, young people are more vulnerable to the impacts of technology on their neurodevelopment, as their brains are still developing.”

Let’s stop for a moment here, because so much good will — mistaken in its objective and definition — requires clarification. Neurorights (free will, mental privacy, etc.) are designed based on neurotechnologies and their obvious dangers.

According to a report issued by the Office of Science and Technology of Spain’s Congress of Deputies, neurotechnology “allows a direct connection between a device and the nervous system (central and peripheral) to record or modify nervous activity. It combines neuroscience with other advances in artificial intelligence, robotics, or virtual reality, to modulate or measure various aspects of brain activity, including consciousness and thinking.”

Is the head of the Data Protection Agency assuming that children are going to connect something to their heads to enter TikTok, the metaverse (whatever that is), or a video game? Is such a device supposed to protect them?

Those of us who believe that brains are configured according to what happens to them in their formative years will want to ensure that the wiring is healthy. We don’t want to be on the side of the pseudoscientists who believe that brain-machine technologies should be consumer goods, like smartphones.

Giving up control of our thoughts seems atrocious to me. If anyone believes that whoever accesses our brain and collects our neural data isn’t going to misuse it, they’ve been living in a different dimension for the last 20 years… or they have some sort of financial interest in advocating for hooking technology up to our brains.

Brain implants can help those who suffer from paralysis as they attempt to recover mobility. However, this technology shouldn’t be handed out so that everyone can use it while playing Fortnite, changing TV channels with their mind, or answering emails with their thoughts. In such a scenario, climate change won’t kill us: comfort will. WALL-E isn’t a movie anymore — it’s a premonition.

It’s sad when those responsible for securing our data consider the battle to be lost. Members of the scientific community are working on the development of neurorights under the assumption that this neural data is going to be collected en masse. They seem to assume that the advancement of science — even when no one has asked for this giant step — is unstoppable. For them, resistance isn’t only futile — it’s a mistake.

Amidst the vastness of the internet, and given the limits of people’s economic, personal and technical knowledge, neurorights won’t be possible to guarantee. Are “data supervisors” really going to sanction the biggest video game companies in the world for collecting the thoughts of players and using them for their own benefit?

We already know what the cost of this erroneous thinking is. Our countries should have the same courage as the Supreme Court of Chile: there’s no need for universal neurorights if we control the manufacture, sale and distribution of brain-machine devices and regulate them as medical devices. Let’s use this technology in environments in which it can actually be beneficial to human beings, while prohibiting its general use. Otherwise, not only will it not be beneficial: we will be incapable of controlling it. No data is better protected than the data that simply isn’t collected.
