Neurorights against mental surveillance
Neurotechnology experts say that new human rights must be introduced quickly to protect our minds against the risks of cerebral intrusion and manipulation.
In my first article, I explained that neurotechnologies can record our brain activity and that artificial intelligence can analyze this data to decode our level of focus, our emotions and even the content of our thoughts. This represents a great opportunity for technology companies that thrive on collecting data from their users, especially to sell targeted ads to advertisers. In the age of surveillance capitalism, our personal data is already very valuable, but our neural data, rich in sensitive information, will be worth its weight in gold.
Many companies will soon rush to record as much brain data as they can in order to train artificial intelligence systems, run targeted advertising and neuromarketing campaigns, or sell the data to other companies and institutions. Alongside these intrusive but more or less freely consented-to commercial practices, armies, militias, terrorist groups, and surveillance and weapons companies will also use neurotechnologies to advance their goals.
Our mind is the last bastion of privacy in societies where mass surveillance is becoming ever more pervasive and invasive. This last line of defense is now under severe threat.
An increasingly widespread technology
The rapid development of technologies capable of sensing and modifying brain activity opens new possibilities in medicine, for example to assist people who are paralyzed or blind. But neurotechnologies won’t be used solely for therapeutic purposes. Devices equipped with brain sensors are already being distributed to employees of thousands of companies around the world (read The Battle for Your Brain by Nita Farahany, pp. 40-64) and even to young children in classrooms to monitor their concentration levels.
Brain sensors are also embedded in earbuds, for example Emotiv’s MN8 and perhaps soon Apple’s AirPods. The California-based company has filed a patent to equip its earphones with electrodes that can detect the brain’s electrical activity.
Consumer-grade devices are available as well, for example to facilitate meditation, improve sleep quality or play video games.
Neurotechnologies are also progressively being embedded in extended reality devices. Virtual reality headsets such as Apple’s Vision Pro are already equipped with multiple biometric sensors. Eye tracking, for example, allows users to interact with the device with their eyes. The multinational doesn’t intend to stop there and has filed a patent to use this feature to gather biofeedback about users, in particular their mental states.
It is very likely that most extended reality headsets and glasses will also be equipped with brain sensors in the future, whether embedded in the earphones or placed on the user’s head.
Risks related to neurotechnology
Neurotechnologies will progressively be added to many consumer devices and will thus affect the lives of many people. But the list of risks is long:
Mass surveillance by governments and private actors (Read the United Nations report “Impact, opportunities and challenges of neurotechnology with regard to the promotion and protection of all human rights”, p.18)
Manipulation of the brain processes involved in people’s intentions, emotions and decisions
Risks for mental integrity, human dignity, personal identity, freedom of thought and autonomy (Read the Unesco report “Unveiling the neurotechnology landscape”, p.27)
Cyberattacks against brain implants, which can have numerous consequences such as the theft, modification or erasure of memories. They can also impact a person’s physical autonomy or have significant psychological effects such as anxiety, depression or other mental health issues (Read the Unesco report “Unveiling the neurotechnology landscape”, p.27)
Crafting of weapons to disable and disorient the human brain
Sale of our brain data to third parties (Read the Neurorights Foundation report “Safeguarding Brain Data: Assessing the Privacy Practices of Consumer Neurotechnology Companies”, pp.51-57)
A United Nations report published in August 2024 summarizes the current situation in a few lines:
“The possibility that, in the coming years, those [neurotechnology] products with inadequate safety measures and unclear or underestimated human rights risks may be widely commercialized is real. They may become pervasive throughout daily life despite the fact that, in most countries, applicable regulations are unclear, weak or non-existent. Existing loopholes in regulations, lack of technical expertise and capacity and the absence of adequate oversight bodies are factors that will certainly be exploited by large companies seeking profits. The risk is that, without the necessary guardrails, the industry will continue growing unfettered in the same direction: prioritizing profitability and convenience over ethical and human rights considerations.”
- United Nations report “Impact, opportunities and challenges of neurotechnology with regard to the promotion and protection of all human rights”, p.4
In the face of these many dangers, neuroethicists and neurobiologists are mobilizing to protect our rights at the dawn of this new era of neurotechnologies.
Neurorights
In April 2017, Marcello Ienca, a biomedical ethics researcher at the University of Basel in Switzerland, and Roberto Andorno, a law professor and bioethics researcher at the University of Zurich in Switzerland, published an ethical-legal analysis of human rights in the age of neurotechnology, together with an ancillary article. In these papers, they argued that existing rights are insufficient to respond to the risks posed by neurotechnologies. They therefore advocated reconceptualizing certain human rights, or even creating new ones. The term neurorights was born. (Read On Neurorights by Marcello Ienca, p.2)
The aim of neurorights is to protect us from the dangers of neurotechnology by ensuring:
Cognitive liberty
It protects against the coercive and non-consented use of neurotechnology. It also protects our right to make free and competent decisions regarding our use of this technology. It is the main right from which the three following rights derive.

The right to mental privacy
It protects us against intrusion by third parties into our brain data, as well as the unauthorized collection of that data.

The right to mental integrity
This right, already recognized in the European Union’s Charter of Fundamental Rights, should be broadened to also protect against illicit and harmful manipulations of our mental activity enabled by neurotechnologies, according to Marcello Ienca and Roberto Andorno.

The right to psychological continuity
It preserves our personal identity and the continuity of our mental life from unconsented alteration by third parties.
The year 2017 was a pivotal one for neurorights. Only one month after the publications by Marcello Ienca and Roberto Andorno, neuroscientists, neurotechnologists, clinicians, ethicists and machine-intelligence engineers from several countries gathered for three days at Columbia University, in the United States, to discuss the ethics of neurotechnologies. The academic leaders, united as the Morningside Group, also came to the conclusion that the existing ethics guidelines were insufficient in this realm. They thus drew up recommendations to address this deficit, which they also called neurorights.
Based on the work of the Morningside Group, the Neurorights Initiative was launched in 2019 at Columbia University to serve as an advocacy organization for human rights and to develop further ethical guidance for neurotechnological innovations. It was then incorporated into the Neurorights Foundation, founded shortly after.
The Neurorights Foundation is one of the institutions at the forefront of the defense of human rights in the face of the dangers of neurotechnology. Led by neurobiologist Rafael Yuste, the foundation has been engaged in discussions for years with many governments, international institutions, regional organizations, companies and scientists to raise awareness of the ethical implications of neurotechnology. Its objective? That five neurorights be incorporated into international human rights law, national legal frameworks and ethical guidelines, so that people are protected from the potential misuse or abuse of neurotechnology.
Some of these rights are identical to those defended by Marcello Ienca and Roberto Andorno:
The right to mental privacy
It ensures that our brain data is kept private and cannot be decoded, transferred or sold without our consent.

The right to personal identity
It prohibits neurotechnologies from disrupting our sense of self.

The right to free will
It prohibits neurotechnologies from manipulating our perceptions, memories and behavior, so that we retain control over our own decision-making.

The right to equal access to mental augmentation
It guarantees all citizens equal access to neurotechnologies.

The right to protection from algorithmic bias
It protects us from prejudice and discrimination created or amplified by neurotechnologies.
The Neurorights Foundation specifies that it does not advocate for the creation of new human rights. The neurorights approach involves further interpreting and clarifying human rights law and amending existing laws.
Rafael Yuste, the chair of the Neurorights Foundation, is part of a documentary on neurotechnologies and the human mind directed by Werner Herzog.
Marcello Ienca, Roberto Andorno and Rafael Yuste are leading figures in the defense of human rights in the face of neurotechnologies, but they are not the only ones.
The battle for your brain
Nita Farahany is a law and philosophy professor at Duke University, in the United States. Author of The Battle for Your Brain, which defends the right to think freely in the age of neurotechnology, she has been exploring the ethical challenges of brain technologies for more than a decade. Her articles have been published in The New York Times, The Washington Post, The Wall Street Journal, Wired, BBC, CNN, Politico and The Atlantic, among others, and she has appeared on numerous shows and podcasts. Her two TED Talks on our right to privacy in the era of neurotechnology, When technology can read minds, how will we protect our privacy? and Your right to mental privacy in the age of brain-sensing tech, have garnered more than three million views.
Nita Farahany also asserts that the current human rights don’t protect individuals from the risks of neurotechnologies. To fix this problem, she promotes a right to cognitive liberty on an international level.
The neuroethicist defines cognitive liberty as a right to freedom from mental interference by others and a right to self-determination over our mental experiences. This implies the recognition of three interrelated human rights:
The right to mental privacy
It safeguards us from interference with our automatic mental reactions, our emotions and our thoughts.

Freedom of thought
It protects us from the interception, manipulation and punishment of our thoughts.

Self-determination
It secures self-ownership over our brain and mental experiences.
Nita Farahany considers cognitive liberty an update of liberty for the digital age. She believes this right can be recognized as already existing within the Universal Declaration of Human Rights, which should be adapted to specify that the right to privacy also includes mental privacy.
Steps in the right direction
The experts presented in this article all agree that our laws and human rights charters are not sufficient to protect us from the neurotechnologies that are quickly entering our lives. Fortunately, their recommendations for the introduction of neurorights are sometimes heard. Several governments and international institutions are indeed moving in the right direction so that we can benefit from updated rights for our digital age. But time is running out. In my next article about neurotechnologies, I will present an overview of the progress of neurorights around the world.
- Arnaud Mittempergher




