Neurotechnologies and Data Protection – Challenges to Consent

Thorsten Kranz

The emergence of neurotechnologies represents a groundbreaking intersection between science, engineering and medicine. Whilst their recent proliferation has been largely confined to the health and research sectors, their use may enter – and in some cases already has entered – our workplaces and our homes, for example under the guise of providing more personalised services.1

Neurotechnologies can be defined as devices and procedures, both invasive and non-invasive, that directly record and process neurodata with the aim of gathering data, controlling interfaces or devices, or modulating neural activity. They include implantable, semi-invasive and non-invasive (wearable) devices.

Essentially, these technologies can be grouped into the following divisions:

  • Read devices, such as a medical functional magnetic resonance imaging (fMRI) scanner, designed to image brain activation patterns, or an electroencephalography (EEG) device, which can detect the electrical activity of the brain.

  • Read-write devices, such as headsets designed to assist with mental health or wellbeing. The write aspect of these devices can be further broken down into two categories: neuromodulation and neurostimulation. Neuromodulation relates to processes seeking longer-term change in brain activity, such as the treatment of a neurodegenerative condition, whilst neurostimulation seeks to provide a shorter-term effect.

Neurotechnologies are already being adopted across a number of different sectors, with growing interest in new and novel uses. They could conceivably be embedded in the next generation of consumer devices and wearables such as earbuds, headphones and augmented reality headsets. If they were rolled out in consumer devices and became more affordable, other use cases would be likely to emerge, and we can anticipate their possible use in employment sectors ranging from office-based roles to high-risk environments such as the operation of heavy machinery.

While the accessibility of neurodata comes with positive uses, it also raises new ethical questions around human agency, human dignity and identity, and augmentation and enhancement, beyond privacy and consent. After all, what could be more intimate than our very minds and potentially our thoughts and autonomy? Effective data protection and privacy standards, intertwined with the need to respect other fundamental rights such as the right to mental integrity and human dignity, will be critical in preventing misuse of information that could lead to new forms of discrimination or reinforce existing ones, undermine our ability to provide meaningful consent, and possibly erode fundamental notions of personhood and identity.

The very specific and sensitive nature of these data, which may not by default qualify as health or biometric data under regimes such as the GDPR, raises the question of, first of all, whether, and then how, they should be used, and which specific safeguards should be put in place.

Data processing in the context of the deployment of these technologies needs to be assessed having regard, in particular, to the data protection principles of necessity and proportionality.

On 15 May 2025, the International Working Group on Data Protection in Technology (IWGDPT), known as the “Berlin Group”, published a working paper on data protection in connection with neurotechnologies. The paper explores questions relating to lawful bases and consent, the use of neurotechnologies on children, and security and technical requirements for neurodata.

The paper sets out high-level definitions of neurodata and neurotechnologies, brief overviews of key terminology, data flows and current uses, and an overview of the current regulatory context. Critical issues and challenges are discussed, and recommendations to different stakeholders are formulated. In the following, we highlight some challenges with consent specific to the context of neurotechnologies that have been identified by the Berlin Group.

Challenges to consent

The complexity of both neurotechnology and the data it processes creates intrinsic challenges to consent as a legal basis for processing. As set out under the GDPR and other regulatory frameworks, valid consent must be freely given, informed, specific, and unambiguous. Here, we want to stress challenges arising from the first three of these requirements.

Freely given: Consent must be voluntary, without coercion or negative consequences arising from refusal. Consent may only be freely given in circumstances where there is no power imbalance at play. While there are many considerations to keep in mind when assessing whether consent has actually been “freely given”, there are some circumstances in which “freely given” consent is nearly impossible. This, in addition to concerns about the necessity and proportionality of the interference of the data processing with the fundamental rights and freedoms of the person concerned, will largely rule out consent-based neurodata processing in scenarios such as employment, education, the military, or justice, to mention only some significant examples.

Informed: Data subjects must be fully informed about the purpose, duration, risks, and benefits of neurodata processing. They should understand how their data will be used, and inappropriate expectations should be avoided. But it is exceptionally difficult for data subjects to be fully informed when it comes to neurotechnology and neurodata processing. The technology itself is highly complex, and individuals without deep technical expertise are unlikely to fully grasp the nature of the technology or of the data flowing through it. In addition, experts in the field of neurotechnology are still determining what can be gleaned or inferred from neurodata, meaning the researchers, doctors, or companies behind the technology may themselves be incapable of fully disclosing the range of data that may be collected or derived from it. When the scope and possibilities are unknown even to experts, how can we expect data subjects to be fully informed of these matters? To address this challenge, it is recommended to perform user testing to receive feedback on the accessibility, understandability, and ease of use of the proposed transparency approach.

Specific: The specificity of consent refers both to the specific data being processed and to the specific purpose for which it may be processed. Specificity of data is particularly difficult to achieve given the nature of neurodata. Neurodata is involuntarily and subconsciously created by data subjects. Individuals may not be able to control which elements of neurodata will be picked up by neurotechnology. Crucially, data subjects may often be unaware of the volume and nuances of their neurodata. For example, suppose a data subject is using neurotechnology for gaming. They may believe they are consenting only to the collection of neurodata related to the desired in-game movements, reaction time, and so on. In reality, the neurodata may contain broad brain patterns and responses deriving from many other areas (focus, mental illness, emotional state, the physical health of the brain, etc.), as well as inferences that can be drawn from the neurodata to reveal much more sensitive and personal information. It is incredibly difficult to be specific about the scope and nuances of neurodata since our understanding of what it reveals, and can reveal, is constantly evolving. However, this very ambiguity must be described to data subjects to ensure they understand the scope of what they may be revealing with their consent. This can be partially addressed by mandating that neurodata controllers be extremely specific about what exactly they will seek to glean from the neurodata, exactly how those data will be used and with whom they will be shared, and that they strictly limit themselves to solely the processing activities clearly described and agreed upon within the consent. Given the very high risks for the rights and freedoms of the data subject, neurodata should only be collected for the functioning of the device (device functionality) and be deleted as soon as it is no longer needed for such functionality, in strict compliance with the principle of data protection by design and by default.2
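In engineering terms, such strict purpose limitation can be built into the processing pipeline itself. The following minimal Python sketch illustrates the idea under stated assumptions: it models a hypothetical consumer gaming device, and all names (ConsentRecord, NeurodataPipeline, device_functionality) are illustrative, not drawn from the working paper or from any real product or SDK.

from dataclasses import dataclass

@dataclass(frozen=True)
class ConsentRecord:
    """Explicit, specific consent: only the listed purposes are permitted."""
    data_subject: str
    permitted_purposes: frozenset  # e.g. frozenset({"device_functionality"})

@dataclass
class NeurodataSample:
    signal: list   # raw readings, illustrative placeholder for an EEG stream
    purpose: str   # the single purpose this sample was collected for

class NeurodataPipeline:
    """Sketch of data protection by design and by default: process only for
    consented purposes and delete raw neurodata immediately after use."""

    def __init__(self, consent: ConsentRecord):
        self.consent = consent

    def process(self, sample: NeurodataSample) -> str:
        # Refuse any processing activity not clearly described in the consent.
        if sample.purpose not in self.consent.permitted_purposes:
            raise PermissionError(f"no consent for purpose '{sample.purpose}'")
        command = self._device_functionality(sample.signal)
        # Storage limitation: the raw signal is discarded as soon as it is
        # no longer needed for device functionality.
        sample.signal.clear()
        return command

    def _device_functionality(self, signal: list) -> str:
        # Placeholder for the minimal computation the device actually needs,
        # e.g. translating a motor-intent pattern into a game command.
        return "move_left" if sum(signal) < 0 else "move_right"

pipeline = NeurodataPipeline(
    ConsentRecord("subject-001", frozenset({"device_functionality"})))
print(pipeline.process(
    NeurodataSample(signal=[-0.4, 0.1, -0.2], purpose="device_functionality")))
# Any undisclosed secondary use is rejected, e.g.:
# pipeline.process(NeurodataSample([0.3], purpose="emotion_inference"))  # PermissionError

The design point of the sketch is that deletion of the raw signal is not deferred to a later clean-up job but happens in the same step that consumes the data, mirroring the requirement that neurodata be deleted as soon as it is no longer needed for device functionality.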

Even more challenges to consent arise when focussing on neurodata processing specifically for neurostimulation and neuromodulation (collectively called neurofeedback). Furthermore, besides consent, other legal bases also need to be scrutinised. Examples of claimed legal bases for neurofeedback could include:

  • Vital interest: In cases where neurofeedback is crucial for an individual’s health (whether that of the data subject or of another person), for example.

  • Contractual necessity: If neurofeedback is part of a contract, for example, for psychological therapy.

  • Legal obligation: Compliance with legal obligations, for example, in the case of medical record-keeping.

  • Legitimate interest: This could be the case, for example, for certain types of scientific research projects.

Further work is required to explore the lawfulness of processing in these situations.

Dr. Thorsten Kranz, lawyer working at the Federal Commissioner for Data Protection and Freedom of Information, Bonn (Germany)

1 This article is based on the IWGDPT Working Paper on Emerging Neurotechnologies and data protection, https://www.bfdi.bund.de/SharedDocs/Downloads/DE/Berlin-Group/20250515-WP-Neurotechnologies.pdf?__blob=publicationFile&v=1

2 See EDPS/AEPD TechDispatch #1/2024 – Neurodata, at page 16, on conditions and limits for the processing of data such as ‘brain fingerprinting’, https://www.edps.europa.eu/data-protection/our-work/publications/techdispatch/2024-06-03-techdispatch-12024-neurodata_en