ICO Novartis Privacy Sandbox Report Offers Insights for All Controllers and Processors

The ICO's report on the Novartis Sandbox offers important lessons for controllers and processors in the digital space. Some key takeaways:

Roles of the Parties: Sometimes, you're neither controller nor processor.

  • Although providers may not always have a legal obligation to consider data protection, where a product is intended to process personal data on deployment, every organization in the supply chain should build compliance into the design. This not only ensures that the rights of individuals are considered from the early stages of product design; it may also give providers an edge when competing for contracts over organizations that have not thought about data protection, as controllers will be assured that they are meeting their obligations.
  • Although the product design may be seen as a joint venture between Novartis and the relevant NHS body, given the significant engagement between them, a business relationship between two parties does not necessarily indicate a joint controller relationship as defined by the GDPR. This is because the Digital Solution aims to facilitate and improve NHS services, offering a more efficient 'means' of carrying out patient care and treatment: processing already carried out by the NHS and its employees.
  • The relevant NHS body will be ultimately responsible for deciding whether to implement the solution, and clinicians will be required to exercise their professional judgement in deciding which patients should be allowed access to the portal and in making clinical decisions about the patients’ care and treatment as a result.
  • Therefore, the NHS body determines both the purposes and the means by which patient data is processed, in line with its statutory obligations in the context of patient care, and can be considered the controller in this context. The NHS body will also be responsible for determining its Article 6 lawful basis for processing and the Article 9 conditions for processing any special category data, and for deciding on retention of the data, amongst other controller obligations.
  • Once deployed, Novartis will have no access to patient data acquired and processed through the Digital Solution; it is therefore unlikely to be considered a processor and will have no data protection role.
  • Nonetheless, Novartis needs to ensure that the design and manufacture of the solution is carried out in a manner which takes account of the GDPR, and incorporates data protection by design and default.
  • Regardless of your role in relation to the end Digital Solution, where there is an ultimate aim to process personal data, you and each organization involved in the supply chain should consider Article 25 of the GDPR, 'data protection by design and default', and Recital 78, in developing your products and services.
  • Where a vendor that would usually be considered a processor collects and processes data for its own purposes beyond those instructed by the controller, e.g. for model training, it becomes a controller in its own right and must comply with all of the GDPR's controller obligations, including having an appropriate lawful basis.

Biometric and Third Party Data: Not all speech data is biometric data.

  • Whether an individual’s speech data is considered as biometric data would depend on what the technology provider was ‘doing’ with that audio data, and how it was being technically processed.
  • If a technology vendor is simply extracting the content of somebody’s speech to understand what they are saying to enable the solution to work, it isn’t necessarily processing any data about an individual’s voice. Although the content of the words could be personal data, especially where a patient is providing data verbally about their health condition, the technology would not gather data about an individual’s vocal characteristics or the way that they speak.
  • On the other hand, if a technology vendor is technically processing information about the vocal characteristics of an individual to learn something about the way they speak, or to allow a distinction between that speaker and another, this would be ‘biometric data’. Where that biometric data is then used for the purposes of authenticating that individual based on their voice patterns and characteristics, this would be considered special category data.
  • Providers of well-known voice-enabled 'smart devices' in the consumer market may collect biometric data from users to train their speech recognition models in order to reduce errors. Although this may be classed as biometric data because it is data about the way an individual speaks, it will often be pseudonymised; because it is not being used to uniquely identify the speaker, it is not special category data.
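The report's classification logic can be sketched as a simple decision procedure. The function and enum names below are hypothetical, for illustration only, and this is not legal advice: the point is that speech audio becomes 'biometric data' only when vocal characteristics are technically processed, and special category data only when that biometric data is used to uniquely identify the speaker.

```python
from enum import Enum, auto

class SpeechDataClass(Enum):
    NOT_BIOMETRIC = auto()     # only the content of the words is extracted
    BIOMETRIC = auto()         # vocal characteristics are technically processed
    SPECIAL_CATEGORY = auto()  # biometric data used to uniquely identify the speaker

def classify_speech_processing(extracts_vocal_characteristics: bool,
                               used_for_unique_identification: bool) -> SpeechDataClass:
    """Illustrative sketch of the report's reasoning (hypothetical API)."""
    if not extracts_vocal_characteristics:
        # Transcription-only: the content may still be personal data
        # (e.g. spoken health information), but no biometric data is gathered.
        return SpeechDataClass.NOT_BIOMETRIC
    if used_for_unique_identification:
        # E.g. voice authentication: special category data.
        return SpeechDataClass.SPECIAL_CATEGORY
    # E.g. pseudonymised model-training data about the way people speak.
    return SpeechDataClass.BIOMETRIC

assert classify_speech_processing(False, False) is SpeechDataClass.NOT_BIOMETRIC
assert classify_speech_processing(True, True) is SpeechDataClass.SPECIAL_CATEGORY
assert classify_speech_processing(True, False) is SpeechDataClass.BIOMETRIC
```

The two questions in the function signature are exactly the two questions the report asks: what is the technology 'doing' with the audio, and what is that output then used for.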

Considerations for using Voice Enabled Technology:

  • Enquire about any practice of human review of audio samples, and take it into account.
  • Enquire about the accuracy levels of models when carrying out due diligence, in terms of the technology’s ability to recognize different dialects, languages and genders. Flaws in the accuracy of the model may lead to inaccurate health data being collected by the solution, which in turn may lead to inappropriate clinical decision making, or a solution that causes frustration to its users and is therefore not viable in practice.
  • Consider how the solution will be activated. Devices activated by a user speaking a 'wake word' continuously listen for the acoustic pattern that matches that trigger word; these devices are often referred to as 'always on'. Some well-known voice vendors have provided assurances that speech or background noise is stored on the device (not sent to the cloud) and deleted within seconds. However, when the device is mistakenly awakened by incorrectly recognizing the trigger word, that audio will be sent to the cloud for additional processing. This risks breaching both the 'fairness and transparency' principle, Article 5(1)(a), and the 'data minimization' principle, Article 5(1)(c), as excessive personal data may be collected.

Automated Decision Making: If a clinician is making a decision based on available data, it can be considered meaningful human input.

  • For human intervention to be meaningful, it must come from a qualified clinician and not the patient themselves.
  • Automatically cancelling an appointment without this clinical input could be deemed a significant effect, so Article 22 would need to be considered.
  • As an alternative to fully automating decisions, where the Digital Solution identified from a pattern in the patient's responses that their condition had improved, it could alert the clinician to consider cancelling the patient's follow-up appointment.
  • As long as the clinician then examined the data for themselves and made their decision based on the available data rather than exclusively on the prompt from the Digital Solution, this would be considered meaningful human input.
  • Safeguards are needed to ensure that human input remains truly meaningful. Because the solution is intended for busy NHS clinicians, the risk of automation bias, where the clinician comes to rely solely on the Digital Solution's recommendations, may be exacerbated. One safeguard is appropriate training of clinicians when the solution is implemented into a Trust, to ensure they are aware of the risk of automation bias and understand that the solution should support, not replace, high-quality clinical decision-making.
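The Article 22 test in the bullets above reduces to two conditions. The sketch below uses hypothetical field and function names purely to illustrate the report's reasoning, not as a compliance tool: intervention must come from a qualified clinician (not the patient), and that clinician must examine the data rather than act exclusively on the Digital Solution's prompt.

```python
from dataclasses import dataclass

@dataclass
class ClinicianReview:
    is_qualified_clinician: bool      # intervention must come from a clinician,
                                      # not the patient themselves
    reviewed_underlying_data: bool    # clinician examined the data, rather than
                                      # acting solely on the Solution's prompt

def has_meaningful_human_input(review: ClinicianReview) -> bool:
    """Illustrative check only: a decision escapes Article 22's 'solely
    automated' scope only if both conditions hold."""
    return review.is_qualified_clinician and review.reviewed_underlying_data

# Clinician examines the patient's responses before cancelling: meaningful input.
assert has_meaningful_human_input(
    ClinicianReview(is_qualified_clinician=True, reviewed_underlying_data=True))
# Rubber-stamping the alert without reviewing the data: risks being treated
# as a solely automated decision with significant effect.
assert not has_meaningful_human_input(
    ClinicianReview(is_qualified_clinician=True, reviewed_underlying_data=False))
```

In practice the second flag is the one automation bias erodes, which is why the report pairs the alert-based design with clinician training rather than treating the alert alone as a safeguard.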

Privacy Disclosure:

  • It is a good idea for a vendor to have its own privacy notice to support the controller in meeting its disclosure obligations.
  • Where a technology provider or processor decides to draft privacy information on behalf of a controller for use within an application, some information about the controller and the purposes for processing should be included, and a link to the controller's wider privacy notice should be signposted at the top of the document. Engagement with the controller will be required to ensure this information is accurate. The provider/processor can then go into more detail about how personal data is used specifically in terms of the solution.



Robert Sheppard

Data Protection Specialist (knowledge of AI)

4y

"Once deployed, Novartis will have no access to patient data acquired and processed through the Digital Solution, and is therefore unlikely to be considered as a processor, and will not have a data protection role." So does that mean a vendor holding encrypted data isn't a processor if they don't have the key?

Roy Smith

CEO at PrivacyCheq

4y

Very interesting ruling re:speech recognition. I expect that to be challenged. If I use a fingerprint sensor to open a locked file cabinet wasn't that biometric even though the file cabinet didn't store my fingerprint data? Same would apply to voice recognition - for it to work, somewhere, my voice recording is being stored and processed.

Andreea Lisievici Nevin

Privacy & Tech Lawyer, Managing Partner @ ICTLC Sweden | Mentoring and training privacy professionals @ PrivacyCraft | Lecturer @ Maastricht Uni | Certified DPO (ECPC-B), CIPP/E, CIPM, FIP | ex-Volvo Cars, ex-Boeing

4y

Wow super interesting
