Translated from Spanish: How scientists want to prevent your brain from being hacked

“Our human brain is already in the cloud, but it remains to be seen whether our ideas or feelings can be transferred to a computer.”
The phrase sounds like the opening of an episode of “Black Mirror”, but it is said to BBC Mundo by Rafael Yuste, director of the Center for Neurotechnology at Columbia University in New York.
Yuste is one of the scientists determined to regulate the future use of neurotechnologies: tools being developed today to map and modify the activity of the human brain.
As the spokesperson for a group of 25 scientists and engineers, Yuste proposed in 2017 incorporating five inalienable neurorights into the charter of human rights: mental privacy, personal identity, free will, equitable access and non-discrimination.
In June he visited Santiago de Chile, where every year the Senate’s technology committee organizes the Congress of the Future, which brings together some of the world’s leading scientists and intellectuals.
Through that forum, the scientist began talks aimed at incorporating these neurorights into the Chilean constitution in the future.
“Chile would set a precedent,” he says. “I’m happy to be able to work on it.”
One hundred billion neurons
Yuste was one of the first advisers to the “Brain” project launched in 2013 by then-US President Barack Obama to drive and fund neurotechnologies capable of “mapping” the brain.
A year earlier, the Madrid-born researcher had been named one of the most influential scientists in the world by the British journal Nature.
“Obama launched the ‘Brain’ project the way the space race was launched,” Yuste recalls.
In this race, the United States is now joined by other countries: Japan, China, South Korea, Australia, Canada, Israel and Europe, which have their own versions of the project.
Yuste explains the scientific appeal of the project.
“The brain works electrically: we have 100 billion neurons inside the skull. The number of neurons and connections is astronomical; in our heads there are more connections and nodes than in the entire internet on Earth,” he explains.
“All of that complexity of neurons fires electrically, through processes we don’t understand. From it arise vision, sensation, behavior, ideas, memory, emotions, consciousness, the mind, everything we are. That’s why it’s so important to have neurotechnologies capable of mapping them.”
The risks of an “augmented” person
Neurotechnologies use optical, electronic, magnetic and nanotechnological techniques to understand these processes and, in the future, to “read and write” brain activity.
“It’s similar to what happened with deciphering the human genome: no one knows who’s going to get there first,” Yuste says. “But what’s certain is that someone will, opening up new opportunities. And also risks.”
“In the U.S., a flexible computer chip of two square centimeters and 100 microns thick is being manufactured to be implanted under the skull, on the brain,” Yuste says.
“After the operation, the person can wear a cap or helmet containing the electronic components that control the chip implanted in their brain.”
This neurotechnology is designed, for example, to connect a camera to a blind patient and transmit the images to their brain through the chip.
“We know that vision is generated in the cerebral cortex and that most blindness is caused by problems in the eye. In these blind patients, you could install a visual prosthesis connected to a camera. The camera would function as the eye and the cortex would receive the signals through the prosthesis, allowing the person to see,” the scientist tells BBC Mundo.
“But imagine installing the same prosthesis in a person who sees well, and connecting it not to a single camera but to a set of cameras capable of seeing in infrared, or to a camera installed elsewhere on the planet, or to a television screen from which the person could read information,” he adds.
“That person could perceive things that others can’t and would have access to information others couldn’t have. They would be an ‘augmented’ person… Combined with an artificial intelligence system, they could walk down the street looking at people and pulling up information about each of them. These kinds of uses of neurotechnology have to be regulated before it is too late.”
Another risk of neurotechnology, according to Yuste, is its military use: the same chip implanted in the brain that allows a person to receive information could also transmit it from the brain to a robotic arm, or to a tank.
Today there is no standard for these developing neurotechnologies: there are no laws that set priorities for their use, whether among patients with disabilities or healthy people who want to “augment” their abilities.
There are also no regulations on theft or manipulation of brain data.
“I have a very positive opinion of neurotechnologies, and I think it is essential to develop them in order to help patients with neurological or mental illnesses. But the same tools can be used for better or for worse,” the scientist warns.
Human Neurorights
Yuste describes to BBC Mundo each of the five neurorights with which he seeks to prevent the misuse of neurotechnologies and the inequalities they could generate.
The issue already concerns the scientific community. In addition to Yuste and his group, in 2017 the neuroethicist Marcello Ienca and the Switzerland-based human rights lawyer Roberto Andorno published another paper along the same lines, warning about the same risks.
The first neuroright is mental privacy. “We want it to be a fundamental human right: that the content of your mind cannot be extracted without your consent and that it has the same legal treatment as human organs,” Yuste explains.
Personal identity and free will are two other rights to be secured in a world where neurotechnologies will be able to act on cognitive abilities and individual decisions, the scientist explains.
“Imagine the case of a soldier who could be controlled from the outside by connecting his mind to a network through a prosthesis. That person’s identity could be totally dissolved, and so could his decision-making ability.”
The fourth right seeks to ensure equitable access to neurotechnologies.
“These technologies are going to be very expensive, and only certain social groups in certain countries will have access to them. In the case of neurotechnologies used to augment certain sensory or cognitive abilities, we want to avoid a social fracture in which some people have abilities superior to those of others.”
Yuste offers the example of organ transplants.
“Today, when several patients are waiting for an organ, the medical community decides who receives the transplant, based on medical and fairness criteria. The same criteria should govern the possibility of augmenting an ability through neurotechnology,” he says.
The fifth right is meant to protect people from the discriminatory biases of artificial intelligence algorithms.
“If we decide to use artificial intelligence algorithms that change the functioning of your brain from the outside, we must ensure that those algorithms do not project their biases onto your brain. Otherwise, there would be no way to move toward more just, more peaceful societies.”
“A New Renaissance”
Yuste has hopes both for neurotechnologies and for humanity’s ability to regulate them.
“These technologies will have an impact on society as a whole. They will allow us to treat patients, but they will also open up new fields of development for countries; they will allow us to change education and justice.”
“Today we educate children with methods we inherited from the past, but if we understood how the mind works, we could have a much more effective education,” he says.
“Today you catch a criminal and imprison him. But if we understood why he did what he did, that criminal would become a patient,” he adds.
“I think we are in a new Renaissance: in the first one, man began to understand his role in the world. Now we can understand ourselves from the inside and finally understand who we are.”
“First, it is up to us as a society to establish the rules so that these neurotechnologies are used for the common good. And the time to do that is now,” he concludes.

Original source in Spanish
