My views on the path to Neural Data regulations and the implications for GRC Programmes
Image source: www.easy-peasy.ai


A few weeks ago I came across a LinkedIn post by a contact who was mulling the privacy protections of Neural Data (ND), the data produced by the brain, because it is used to monitor fatigue in employees.

It turns out that ND is not strictly regulated, despite its proliferation in medical-grade wearable consumer technology, and it is far more widely deployed than I realised, e.g. for occupational safety.

Naturally the topic piqued my interest on a few levels, because it touches on my current and past focus areas like Governance, Risk & Compliance (GRC), Data Privacy Management and Supply Chain Digitisation.

Looking into the global ND discourse I came across the article “Is Your Neural Data Part of Your Mind? Exploring the Conceptual Basis of Mental Privacy” by Abel Wajnerman Paz, Assistant Professor at the Institute of Applied Ethics at one of Chile’s leading universities, Pontificia Universidad Católica de Chile.

Professor Paz made a compelling case for the special protection and designation of ND, which led me to a few insightful detours via the references for grappling with this data category.

This LinkedIn article is largely based on the aforementioned paper, adds my thoughts on it and on the regulatory developments around ND, and concludes with my principles-based suggestions for managing ND in GRC programmes.


The case for the stringent protection of Neural Data by the Morningside Group and Professor Abel Wajnerman Paz

Professor Paz starts with an explanation of the recent ND growth by highlighting that the public-private U.S. BRAIN Initiative funded research in neurotechnology and artificial intelligence.

This spurred similar initiatives in China, Korea, the European Union, Japan, Canada and Australia, which led to various consumer neurotechnologies for commercial, non-clinical uses that monitor brain activity, e.g. sleep and meditation apps, caps and devices like helmets that transcribe mental states.

Although this inherently personal and sensitive information is being processed, it is not explicitly protected by most privacy laws.

Neural Data falls through the cracks because it is considered to be electrical data instead of biological data in the US, for instance; the latter is accorded special protection through California’s Consumer Privacy Act (CCPA) and the EU's General Data Protection Regulation (GDPR).

Reading the article, a friendly juxtaposition of arguments for the stringent protection of ND crystallises, which I'll outline along with my views afterwards.

The Morningside Group’s case for Neural Data Protection: Classifying ND as organic tissue

The Morningside Group (MG) is a global consortium of interdisciplinary experts advocating for the ethical use of neurotechnology and artificial intelligence. It is led by Dr. Rafael Yuste, a neuroscientist and Professor of Biological Sciences and Neuroscience at Columbia University, and the initiator of the Neurorights Foundation.

Their argument is that ND constitutes a form of biological data, which should be covered by the physical privacy regulations associated with access to organs and blood samples, for instance.

This framework for the treatment of ND enables the Morningside Group to call for safeguards related to organ donation, such as:

  • Explicit opt-in without compulsion is required for collection
  • Prohibition of commercial transfer even if subjects consent
  • Donation is only allowed for altruistic purposes

Interestingly, the Morningside Group provided a steer on the approach to ND classification that made it into Chile’s Neuroprotection Bill, which is explored in the next section.

Professor Paz's case for Neural Data Protection: Uniquely personal data that affects psychological integrity

Some understanding of how the brain functions is in the realm of (contested) hypotheses. Therefore my précis of Professor Paz's article will attempt to distil the crux of the arguments for the classification and treatment of ND, limiting the interesting but dense technical discourse and terminology.

Professor Paz posits that ND is fundamentally personal information about neural states, structures and processes, interoceptive mechanisms for instance.

Interoceptive mechanisms enable the awareness of internal sensations in the body, from heart rate, respiration, hunger and pain to emotional sensations.

Crucially, the pathways along with the mechanisms for sending and responding to signals differ from person to person and are triggered by properties in the subject’s body, which makes it deeply personal data about the subject’s neurocognitive identity.

In the GDPR this data would likely fall under the “special categories of data”, because a data breach would risk undermining some of the fundamental rights highlighted by the UK’s Information Commissioner's Office (ICO), like bodily integrity or, more aptly here, psychological integrity.

To make the case for ND handling affecting psychological integrity, Professor Paz invokes the work of Dehaene, Changeux and colleagues on the Global Neuronal Workspace (GNW).

The GNW is a platform for sending neural signals to distant areas for exchanging information between processors, which become “the subject of a sentence, the crux of a memory, the focus of our attention, or the core of our next voluntary act” (Dehaene).

Dehaene also hypothesises that the GNW inputs constitute the neural basis for access to consciousness, which would heighten the need for ND protection should this become scientific consensus.

Importantly, the GNW chooses which signals to amplify and broadcast to brain systems for deliberate actions like forming a sentence.

Consequently, Professor Paz argues that access to the GNW inputs via ND is a form of intrusion, because the GNW's mechanism for amplifying signals is bypassed, akin to “brain-hacking” that violates psychological integrity and mental privacy (i.e. privacy of the mind/consciousness).


My view on both cases for ND protection:

Both the Morningside Group and Professor Paz make proactive and compelling cases for ND protection to avoid the “delay fallacy”: delaying regulation until the ND use cases are mature, by which point they are too entrenched for effective guardrails.

Equating ND with organic tissue, as the Morningside Group proposes, might be a path to consensus on the sensitivity of ND if it is accepted.

However it could also lead to arduous litigation on the essence of ND, which might extend the protection gap.

The approach by Professor Paz seems more straightforward, since there is already a legal framework for personal data protection, and the psychological integrity dimension is a way to elevate ND to a "special category", which is also covered in existing data privacy regulations like the GDPR.

To answer the rhetorical question in the title of Professor Paz's article: I am not sure whether ND is part of the subject’s mind, but it is certainly deeply personal data deserving of the tighter protections associated with personally identifiable information.


Chile & Colorado: Pioneering Neural Rights legislations in the Americas

Despite the aforementioned medical-grade consumer products that collect Neural Data, it is not covered by the rigorous data protection requirements of the Health Insurance Portability and Accountability Act (HIPAA) in the US.

Quite confoundingly, if an electroencephalogram (EEG) device is used for medical purposes, the data collected is covered by HIPAA; however, if the same EEG device is used for the cited commercial purposes, it is not, according to Jared Genser, Legal Counsel at the Neurorights Foundation.

To compound matters, a few policy networks aligned with technology companies lobbied against neural data regulations that enforce higher consumer protections.

The first US state to address this apparent data protection loophole with signed legislation is Colorado. On April 17, 2024, House Bill 24-1058 (HB 24-1058) was signed into law, adding neural data to the Colorado Privacy Act and demanding “...heightened protections for collected data about bodily or mental functions”.

Section 1(e) of the legislative declaration in Colorado’s HB 24-1058 caught my eye, because it follows the route Professor Paz took to argue for ND protection, classifying it as distinctive and sensitive personally identifiable information:

"Every human brain is unique, meaning that neural data is specific to the individual from whom it was collected. Because neural data contains distinctive information about the structure and functioning of individual brains and nervous systems, it always contains sensitive information that may link the data to an identified or identifiable individual."

Several years before the Colorado Neural Data bill, Chile became the first country to enshrine ND protection via a constitutional amendment.

Chile's constitutional ND protections incorporate the “5 neurorights” developed by Dr. Rafael Yuste via the Neurorights Foundation (see table below). It enshrines neurorights like “The Right to Mental Privacy” (Article 6 & 7) and “The Right to Personal Identity” (Article 4) as well as a provision to harness ND with “The Right to Equal Access to Mental Augmentation” in Article 10.

Interestingly, Article 7 takes the Morningside Group’s approach to ND protection by requiring compliance with organ donation provisions when handling this data category:

“...the collection, storage, treatment, and dissemination of neuronal data and the neuronal activity of individuals will comply with the provisions contained in Law No. 19.451 regarding transplantation and organ donation.”

Organisations that are using ND in their GRC programmes, manufacturing or supply chain processes could look to Chile for gauging the potential impact of global ND regulations, since they enshrined it in their constitution.

In addition, Chile upheld its constitutional neurorights stance in the case Emotiv Inc. v. Guido Girardi (a Chilean Senator at the time). The Chilean Supreme Court ruling forced the neurotechnology provider Emotiv Inc. to delete ND, because its measures pertaining to informed consent, commercialisation risks and re-identification were not satisfactory.

The current ND legislative developments in the state of California, Spain, Brazil and Mexico suggest that handling ND could become a challenge for GRC programmes that lack a plausible framework for it.


My non-lawyer take on the Chilean Supreme Court decision:

I am not a lawyer and don’t play one on LinkedIn, but I get the sense that treating ND like organs might have some unintended consequences.

With ND being equated to biological matter, you could probably never use it for the commercial applications cited, even if ND is donated for altruistic purposes.

The donation would inherently be in service of commercialisation by neurotechnology providers, because the subject pays for ND processing in some way, e.g. via a subscription, buying a device, or opting in to broad ND processing (ND becomes currency here).

So the approach taken for ND protection by the Chilean Supreme Court, which requires treatment in line with organ transplantation and donation, seemingly prohibits commercial use altogether in Chile.

This might counteract one of the neurorights in the constitutional amendment that will be significant for ageing populations: “The right to equal access to cognitive enhancement”.


From Kraftwerk's Man Machine scenarios to principles-based Data Privacy Management

One doesn’t have to go too far afield to conjure up sci-fi scenarios of fusions between humans and machines (efficiently) described in the Kraftwerk tune “Man Machine” (“The man machine, machine, machine, machine, machine...Superhuman being”).

In the Industrial Internet of Things (IIoT) context, machines and human beings already enter into data unions, as a working paper on the “Future of Manufacturing in Europe” by Eurofound and the European Commission outlines below:

“With IoT in the manufacturing industry, machines and devices will be connected to interact with each other and with humans. For that, sensors and actuators are essential…”

Human movement along with that of robots is routinely tracked in the manufacturing context, and with ND making it into production workflows the stakes are heightened in my estimation.

To me, ND in manufacturing throws up the following questions, informed by GDPR adherence considerations, which ought to be resolved, or at least explored, before its wide deployment:

  • Can a data subject freely consent to ND collection for measuring mental states aiding the secure operation of machines, if the data controller is the employer?

Implications to explore: GDPR Recital 43 states that consent for personal data processing needs to be freely given. If there is a clear power imbalance between data subject and controller, the data subject’s volition needs to be probed. Recital 43 suggests providing the data subject with the option to consent to “...different personal data processing operations...” (i.e. not ND in this case) to address the power imbalance and eliminate (soft) coercion.


  • Are data subjects in IIoT scenarios giving informed consent for limited ND processing or consenting to other uses (e.g. legal proceedings)?

Implications to explore: Once there is a data union between the machine and the humans operating it, it becomes part of the IIoT data infrastructure with all that entails. For instance, if ND needs to be assessed to determine the mental state of the operator in legal proceedings against a machinery vendor, ND can become part of invasive discoveries and subpoenas by third parties; a data subject/employee needs to be informed of this possibility before giving consent to ND processing.
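To make the purpose-limitation point concrete, here is a minimal sketch in Python of a consent record that only permits ND processing for purposes the data subject explicitly opted into. All names and purpose labels are hypothetical, for illustration only:

```python
from dataclasses import dataclass, field

@dataclass
class NeuralDataConsent:
    """Hypothetical consent record for ND processing (illustrative only)."""
    subject_id: str
    # Purposes the data subject explicitly opted into, e.g. "fatigue_monitoring".
    consented_purposes: set = field(default_factory=set)

    def permits(self, purpose: str) -> bool:
        # Purpose limitation: any use outside the consented purposes is refused,
        # including secondary uses such as disclosure in legal proceedings.
        return purpose in self.consented_purposes

consent = NeuralDataConsent("employee-42", {"fatigue_monitoring"})
assert consent.permits("fatigue_monitoring")
assert not consent.permits("legal_discovery")  # secondary use not consented to
```

The design choice here mirrors the Recital 43 concern: consent is recorded per processing operation, so a subject can agree to fatigue monitoring without implicitly agreeing to every downstream use of the same data.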


Suggestions for principles-based Neural Data Management

As mentioned in the previous section, when ND makes it into supply chain and/or operational processes, implications for GRC programmes arise that organisations ought to prepare for.

The cited ND regulation wave in the Americas and the UN Resolution A/HRC/RES/51/3 on "Neurotechnology and human rights" heighten the need for an ND strategy.

Underneath are a few suggestions for managing this novel terrain with applicable insight from GRC initiatives.


Start with a Framework in the absence of ND Regulations:

Besides Chile and Colorado, ND regulations are just starting to take shape.

That notwithstanding, dealing with the ethical considerations of ND will be paramount to manage the trust of stakeholders like employees, clients, investors, partners and regulators.

In the absence of regulations outside of Chile and Colorado, the 5 neurorights by the Neurorights Foundation could provide guidance for ethically handling ND and getting ahead of likely regulations.

Image Source: the Neurorights Foundation


Determine ND Exposure with Impact Assessments:

According to a Harvard Business Review article, more than 5,000 companies already use brainwave monitoring across industries like mining, construction, trucking and aviation to ensure that their employees are awake.

So an ND Impact Assessment that borrows from similar GDPR initiatives, like Privacy Impact Assessments, would be prudent for ascertaining ND exposure.

Taking an inventory of critical processes and creating assessments to determine where ND is processed could help map that exposure.

Instead of creating new assessments, one could also simply add questions on ND to existing Data Privacy Impact Assessments for GDPR/CCPA compliance to extend their scope.

Crucially, ND Impact Assessments should extend to suppliers and be repeated regularly to monitor the exposure, since ND in IIoT might morph into an emerging data and compliance risk category over time.

The 5 neurorights could be turned into a series of questions that address personal identification, consent and social disadvantage from ND privacy measures, like “How is Mental Privacy safeguarded?” or “What are your measures for enforcing a narrow use of ND?”.
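The mapping from neurorights to assessment questions can be sketched as a simple questionnaire structure. The two quoted questions come from the text above; the remaining question wordings are my own illustrative suggestions, not the Neurorights Foundation's:

```python
# Hypothetical mapping of the Neurorights Foundation's five neurorights to
# ND Impact Assessment questions (most question wordings are illustrative).
NEURORIGHT_QUESTIONS = {
    "Mental privacy": "How is Mental Privacy safeguarded?",
    "Personal identity": "Could ND processing profile or alter the subject's identity?",
    "Free will": "Could ND-driven feedback influence the subject's decisions?",
    "Equal access to mental augmentation": "Does ND use create or widen social disadvantage?",
    "Protection from bias": "What are your measures for enforcing a narrow, bias-free use of ND?",
}

def build_questionnaire(extra_questions=()):
    """Return the neuroright questions, optionally extended with existing DPIA items."""
    return list(NEURORIGHT_QUESTIONS.values()) + list(extra_questions)

# Extending an existing DPIA with a supplier question, per the section above.
questions = build_questionnaire(["Which suppliers process ND on your behalf?"])
assert len(questions) == 6
```

Keeping the questions in one structure makes it easy to bolt them onto an existing GDPR/CCPA assessment rather than maintaining a separate ND questionnaire.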



Add ND risks to Business Continuity Management:

The Chilean Supreme Court ruling that forced a neurotechnology provider to delete ND will have had business continuity management (BCM) implications for the company.

If the ND Impact Assessment reveals that ND is in far wider use than expected or affecting critical processes, it would be prudent to extend BCM with ND considerations.

A Business Impact Assessment (BIA) is used to prepare for disruptions and line up approaches for business continuity.

Evaluating how continuity is impacted by the presence or absence of ND within the organisation or supply chain would be a question to explore in BIAs.

GRC programmes could introduce questions on the proliferation of ND in critical processes and prepare alternative processes, e.g. instead of tracking brain waves for occupational safety, (possibly) less invasive signals like head movement, blinking rates or gait could be tracked with consent.

This way organisations could avoid ND becoming an operational risk that materially impacts resilience if a workflow is disrupted due to the unavailability of ND for a host of reasons, e.g. court rulings or misalignment with the ND framework after a review.
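The BCM review above can be sketched as a small check over a process inventory: flag critical processes that depend on ND and list the (possibly) less invasive fallback signals mentioned earlier. The process names, record fields and fallback catalogue are all hypothetical:

```python
# Illustrative catalogue of less invasive fallback signals for ND-dependent
# processes (hypothetical names, based on the alternatives cited above).
FALLBACK_SIGNALS = {
    "brainwave_fatigue_monitoring": ["head_movement", "blink_rate", "gait"],
}

def bcm_review(processes):
    """Flag critical ND-dependent processes and attach candidate fallbacks.

    processes: list of dicts with 'name', 'uses_nd' and 'critical' keys.
    """
    findings = []
    for p in processes:
        if p["uses_nd"] and p["critical"]:
            findings.append({
                "process": p["name"],
                "risk": "ND unavailability disrupts a critical workflow",
                "fallbacks": FALLBACK_SIGNALS.get(p["name"], []),
            })
    return findings

findings = bcm_review([
    {"name": "brainwave_fatigue_monitoring", "uses_nd": True, "critical": True},
    {"name": "badge_access", "uses_nd": False, "critical": True},
])
assert len(findings) == 1 and "blink_rate" in findings[0]["fallbacks"]
```

A GRC team would of course run this kind of check inside its existing BIA tooling; the point of the sketch is only that ND dependency is a queryable attribute of a process, not a new discipline.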


Annotated Sources:

• “Is Your Neural Data Part of Your Mind? Exploring the Conceptual Basis of Mental Privacy” by Abel Wajnerman Paz, Pontificia Universidad Católica de Chile:

https://link.springer.com/article/10.1007/s11023-021-09574-7

• A LinkedIn post by Julian Conway, Associate at Bortstein Legal Group, on the implications of neural data for complying with the spirit of privacy laws, despite ND not (currently) being a distinct category in most privacy regulations:

https://www.dhirubhai.net/feed/update/urn:li:activity:7214951197265330176/

• The Neurorights Foundation is the organisation initiated by Rafael Yuste, Professor of Biological Sciences and Neuroscience at Columbia University, along with Jared Genser, Legal Counsel and managing director of the law firm Perseus Strategies, LLC:

The Neurorights Foundation aims to "...engage the United Nations, regional organizations, national governments, companies, entrepreneurs, investors, scientists, and the public at large to raise awareness about the human rights and ethical implications of neurotechnology."

https://neurorightsfoundation.org

• Colorado House Bill 24-1058 protects Neural Data by expanding the definition of sensitive data; it was signed on April 17, 2024 and will be effective August 7, 2024:

https://leg.colorado.gov/bills/hb24-1058

• California Senate Bill 1223 from February 15, 2024, which aims to protect Neural Data. The bill hadn't been signed into law at the time of this article:

https://leginfo.legislature.ca.gov/faces/billTextClient.xhtml?bill_id=202320240SB1223

• UN Resolution A/HRC/RES/51/3 proposes that "...the impact, opportunities and challenges of neurotechnology with regard to the promotion and protection of all human rights..." be studied:

https://documents.un.org/doc/undoc/gen/g22/525/01/pdf/g2252501.pdf?token=WsdZC81P32c0tyRTWp&fe=true

• "Industrial internet of things: Digitisation, value networks and changes in work", a working paper by the European Commission and Eurofound on the potential of the Industrial Internet of Things for the EU economy:

https://www.eurofound.europa.eu/system/files/2019-12/wpfomeef18006.pdf

• Science Friday podcast episode with Jared Genser, Legal Counsel of the Neurorights Foundation, discussing neural data and his advocacy for regulation to protect privacy:

https://pod.link/73329284/episode/34cf88f1177c73816d5fef053018bdcc

• The Information Commissioner's Office guidance on ‘special categories of personal data’, which informs the classification of neural data and points to the GRC implications:

https://ico.org.uk/for-organisations/uk-gdpr-guidance-and-resources/lawful-basis/special-category-data/what-is-special-category-data/#scd1

• GDPR Recitals offer support for implementing GDPR; I point to Recital 43, which highlights the imbalance between subject and controller that can affect whether consent is freely given:

https://ukgdpr.fieldfisher.com/recitals/

• Ministry of Science, Technology, Knowledge, and Innovation Law No. 21.383, from the Official Journal of the Republic of Chile, which protects neural data:

https://static1.squarespace.com/static/60e5c0c4c4f37276f4d458cf/t/6182c0a561dfa17d0ca34888/1635958949324/English+translation.pdf

• Future of Privacy Forum (FPF) piece on the rise of neurorights. The FPF supports "privacy leadership and scholarship, advancing principled data practices in support of emerging technologies":

https://fpf.org/blog/privacy-and-the-rise-of-neurorights-in-latin-america/

• An article by María Isabel Cornejo-Plaza, Researcher at Universidad Autónoma de Chile (Autonomous University of Chile), that analyses the Chilean Supreme Court ruling in the neural data case between Emotiv Inc. and Guido Girardi:

https://www.frontiersin.org/journals/psychology/articles/10.3389/fpsyg.2024.1330439/full

• A Harvard Business Review article by Nita A. Farahany, Law Professor and Tech Ethicist, that lays out the spread of neural data at work and across industries:

https://hbr.org/2023/03/neurotech-at-work


