
Vol. 9 | Mental Privacy in the Age of Neurotechnology

Who do your thoughts belong to?

In the age of neurotechnology, who owns your thoughts?

The story we brought you in the last edition of Melon Mag, about the world’s first bilingual brain implant, could seem like a one-off from some distant white-walled laboratory. But the reality of mass-scale consumer neurotechnology is not as far off as we might imagine.

In just one example, Apple recently applied for a patent on sensors built into AirPods that could read a user’s brainwaves (Source: ABC). Of course, patents are often big ideas that may never materialise, but with a powerhouse like Apple behind them (and their recent partnership with OpenAI), the prospect of headphones that can track and decode your brainwaves is strikingly close.

Last time, we talked about how the combination of artificial intelligence (AI) and brain-computer interfaces (BCIs) is supercharging advancements in brain technology. This exciting field is growing at an incredible pace, with breakthroughs happening almost constantly (Source: Nature Reviews Bioengineering). We're seeing everything from non-invasive tech that you can wear, to more complex implants that can help with medical conditions. These technologies are changing the game when it comes to brain health, from catching diseases early on to helping people recover from injuries (Source: Neurology). In fact, some experts think it's our “moral imperative” to encourage these advancements and help those suffering from brain-related diseases and brain injury (Source: Neuron). However, as rapid advancements reach incredible new heights, and tech giants embark on a race to bring neurotech to the masses, how will our right to mental privacy be maintained? 

How do we protect that right? Well, with neurotechnology comes neuroethics: the subset of bioethics that seeks to ensure that any technologies directly affecting the brain are developed in an ethical manner (Source: Nature).

Our thoughts, emotional states, and cognitive processes have long been considered the final frontier of privacy. But what makes many tech and social media companies so profitable is the ability to collect (and even sell) their users’ data. While this is currently limited to the interactions a user makes with their devices, there is a not-so-distant future in which a user won’t need to physically interact with a device to control it, and can simply use their thoughts.

So, who owns your brain data? Well, according to many neuroethicists… we aren’t really sure. With the field moving so rapidly, many nations are left with policy gaps that can’t capture the complexity of what happens when big tech meets intimate brain data.

This edition, we’ll be smooshing a lot of words together as we dig into what brain implants mean for neuro-autonomy, the privacy perils posed by neurotechnology, and potential models for neuro-governance.

But first, a quick word about this week’s sponsor, Beehiiv!

If a BCI has the ability to process information to make decisions for you, does that mean you aren’t completely autonomous anymore?

It’s as much a philosophical question as it is a practical one. And it’s the question at the crux of the neuro-autonomy debate that neuroethicists are currently grappling with.

In recent years, there has been an explosion of BCI devices aimed at reducing health burdens and improving quality of life for people with health conditions. One such example is the case of ‘Patient 6’ (Source: Nature), who had an invasive BCI implanted to predict seizures. The BCI used AI to analyse large amounts of brain activity data to predict the onset of a seizure, successfully alerting Patient 6 before an event so that she could take anti-seizure medication. Patient 6 described a symbiosis between herself and the technology, stating that it felt like “part of me… it became me.”

This fascinating technology has changed lives for the better and will continue to do so. But what does it mean for a private corporation to be responsible for a device implanted in your brain? In the case of Patient 6, the company that implanted her BCI eventually went bankrupt and the device had to be removed. Patient 6 grieved the loss of the device, stating, “I lost myself.”

As BCIs combine our neural signals with AI algorithms to translate thoughts into actions, they create a kind of “shared agency”, where part of a decision comes from the user’s mind and part from the intelligent device’s predictions, blurring the line between “real” and “artificial.”

The idea of BCIs inserting data directly into neural circuitry takes this a step further: machines could influence the inputs to our thought processes themselves. With ‘closed-loop’ BCI and AI circuitry gaining traction (Source: Frontiers in Human Neuroscience), this is a very real possibility.

These examples highlight the importance of solid policy that can safeguard mental privacy while these technologies are still emerging.

“The age of brain surveillance has begun,” according to Nita Farahany, a lawyer and philosopher with a focus on neuroethics (Source: Harvard Business Review).

Beyond autonomy, keeping our thoughts and cognitive processes private is one of the new vulnerabilities we face. As neuroethicist Marcello Ienca warned, “brain information is probably the most intimate and private of all information” (Source: Nature). While it may sound like Orwellian ‘thought-police’ sci-fi, having our thoughts recorded and stored presents a very real risk of improper storage or use.

Farahany states that our neural data could be exposed through hacking or misused by the companies providing BCI services (Source: Harvard Business Review). She notes that these technologies have the potential to improve employee performance, wellbeing and productivity. But there are also wider societal risks: corporations could use BCIs and AI to decode and monetise our motivations, emotional vulnerabilities and decision-making patterns from this neural data trove.

Now, this is not at all to say that these technologies are bad. They are incredible feats of human ability and have the potential to change millions of lives for the better. But, as Ienca says, “when a technology is in its germinal stage, it’s… hard to predict the outcomes. But when the tech is mature… it can be too societally entrenched to improve it” (Source: Nature). Ensuring that privacy is safeguarded before these technologies become part of everyday life is therefore key to maintaining our mental privacy.

Neuroethicists caution that we are at a critical juncture where policies can still be put in place before our current (and somewhat limited) security practices become redundant and fail to protect mental autonomy and privacy. As Farahany states, “recognising cognitive liberty” will offer safeguards to mental privacy.

Voids in current regulatory frameworks allow unrestricted decoding of, and commerce in, neurodata, something the author of a paper in Nature Protocols argues must be rectified immediately. Yuste argues for a “technocratic oath” similar to the Hippocratic oath that all medical professionals must abide by.

The paper’s other recommendations for policy are:

  1. Implement ethical and human rights guidelines specific to neurodata

  2. Adopt technical solutions, such as data encryption, to ensure data protection

  3. Categorise all brain data as sensitive health data

  4. Apply medical regulations to all data gathered by existing neural devices

At the core of these principles is mental privacy as a human right: mandating personal ownership of neurodata, safeguarding against commercial exploitation without consent, regulating governmental use of brain data, and creating protocols to protect our minds from involuntary breaching or hacking by AI systems.

Thank you for joining us for another edition of Melon Mag!

If you’re still wrapping your head around the hypothetical dilemmas that come with the future of neurotech, we are right there with you. It’s an ever-changing space that deserves a lot of attention as wearable brain devices expand from clinical use into the consumer market.

For more info, Nita Farahany is an expert in the field and navigates these complexities in her book ‘The Battle for Your Brain: Defending the Right to Think Freely in the Age of Neurotechnology’ and her TED Talk ‘Your right to mental privacy in the age of brain-sensing tech’. Both are excellent starting points covering the mind-blowing benefits and potential hurdles the neurotech industry is facing.

Where do you stand on the BCI brain data debate? Let us know in the poll below!

Would you wear headphones that can read your brainwaves?

Tell us below.


We also love hearing your thoughts, so be sure to let us know what you want to hear more of by clicking the survey below. The survey takes less than 2 minutes to fill out, and helps us bring you the best brain science we can!

💙 The MM Team

REFERENCES

  • Belkacem, A. N., Jamil, N., Khalid, S., & Alnajjar, F. (2023). On closed-loop brain stimulation systems for improving the quality of life of patients with neurological disorders. Frontiers in Human Neuroscience, 17. https://doi.org/10.3389/fnhum.2023.1085173

  • Farahany, N. A. (2023a, March 14). Neurotech at work. Harvard Business Review. https://hbr.org/2023/03/neurotech-at-work

  • Farahany, N. A. (2023b). The battle for your brain: Defending the right to think freely in the age of neurotechnology. St. Martin’s Press.

  • Robinson, J. T., Rommelfanger, K. S., Anikeeva, P. O., Etienne, A., French, J., Gelinas, J., Grover, P., & Picard, R. (2022). Building a culture of responsible neurotech: Neuroethics as socio-technical challenges. Neuron, 110(13), 2057–2062. https://doi.org/10.1016/j.neuron.2022.05.005

  • Schalk, G., Brunner, P., Allison, B. Z., Soekadar, S. R., Guan, C., Denison, T., Rickert, J., & Miller, K. J. (2024). Translation of neurotechnologies. Nature Reviews Bioengineering. https://doi.org/10.1038/s44222-024-00185-2

  • Trilling, J. (2024, May 21). How do you feel about your earphones monitoring your thoughts, your pulse, your emotions? The reality is closer than you think. ABC Listen. https://www.abc.net.au/listen/programs/perth-drive/smart-ear-phones/103876336

  • Voigtlaender, S., Pawelczyk, J., Geiger, M., Vaios, E. J., Karschnia, P., Cudkowicz, M., Dietrich, J., Haraldsen, I. R. J. H., Feigin, V., Owolabi, M., White, T. L., Świeboda, P., Farahany, N., Natarajan, V., & Winter, S. F. (2024). Artificial intelligence in neurology: opportunities, challenges, and policy implications. Journal of Neurology. https://doi.org/10.1007/s00415-024-12220-8

  • Yuste, R. (2023). Advocating for neurodata privacy and neurotechnology regulation. Nature Protocols, 18(10), 2869–2875. https://doi.org/10.1038/s41596-023-00873-0