Privacy and AI weekly - Issue 15
This Friday on Privacy and AI weekly
Privacy
• Bavarian SA publishes a guide on risk analysis and DPIA
• Data Governance Act and its privacy implications
• Officine Dati webinar: Ricerca sì, ricerca no, ricerca boh?! La Terra dei dati (IT)
• American Data Privacy and Protection Act (ADPPA)
Artificial Intelligence
• Understanding Bias in AI for Marketing (IAB 2021)
• Responsible AI in practice (Nissewaard's AI audit)
PRIVACY
Bavarian SA publishes a guide on risk analysis and DPIA
The Bavarian data protection authority (BayLfD) published a guide on risk analysis and #DPIA addressed to Bavarian public bodies.
The paper "Risk analysis and data protection impact assessment" presents the method and building blocks of a data protection risk analysis, explains the development of technical and organizational measures, and provides practical tips for carrying out risk analyses.
It attaches particular importance to the idea of scaling: risk analyses do not always have to be complex; depending on the occasion, different "expansion stages" are possible. This is illustrated using several concrete use cases.
The supervisory authority also made tools and modules available to perform the risk analysis and DPIA (in German):
Module 1: Description of a processing activity
Module 2: DPIA necessity test
Module 3: DPIA report for a processing activity
Module 4: Resources "IT personnel management system"
Module 5: "Video conferencing system" equipment
Module 6: "Display workstation" resources
Link to the post (EN machine translation) and original documents (risk analysis and DPIA in DE)
Data Governance Act and its privacy implications
The #DGA promotes the availability of data and builds a trustworthy environment to facilitate their use for research and the creation of innovative new services and products.
It will enter into force on 23 June 2022 and apply in full from 24 September 2023.
Key features
1) Making public sector data available for re-use, in situations where such data is subject to the rights of others
The DGA creates a mechanism to enable the safe re-use of certain categories of public-sector data that are subject to the rights of others (e.g. personal data). Public-sector bodies allowing this type of re-use will need to be properly equipped, in technical terms, to ensure that privacy and confidentiality are fully preserved.
For Privacy Pros: the DGA does not derogate from the GDPR. The DGA establishes that, for example, personal data should be fully anonymised before transmission, or a secure processing environment should be used. Additionally, personal data should only be transmitted for re-use to a third party where a legal basis allows such transmission.
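As an illustration only (the DGA prescribes no particular technique, and the field names and key below are hypothetical assumptions), replacing direct identifiers with keyed-hash pseudonyms before a record leaves the secure environment might look like this minimal Python sketch. Note that keyed-hash pseudonymisation alone does not amount to full anonymisation under the GDPR, since the key holder can re-link the data.

```python
import hmac
import hashlib

# Illustrative secret key; in practice it would be stored in a key vault
# and kept inside the secure processing environment.
SECRET_KEY = b"replace-with-a-securely-stored-key"

def pseudonymise(value: str) -> str:
    """Return a stable, non-reversible (without the key) token for an identifier."""
    return hmac.new(SECRET_KEY, value.encode("utf-8"), hashlib.sha256).hexdigest()

def prepare_for_reuse(record: dict, identifier_fields: set) -> dict:
    """Replace direct identifiers with pseudonyms; leave other fields untouched."""
    return {
        field: pseudonymise(value) if field in identifier_fields else value
        for field, value in record.items()
    }

record = {"name": "Jane Doe", "city": "Munich", "year_of_birth": 1980}
safe = prepare_for_reuse(record, {"name"})
```

Because HMAC is deterministic for a given key, the same identifier always maps to the same token, which preserves linkability across records for research while hiding the identifier itself.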
2) A new business model for data intermediation.
The DGA creates a framework to foster a new business model – data intermediation services – that will provide a secure environment in which companies or individuals can share data.
Personal data may be used with the help of a 'personal data-sharing intermediary' designed to help individuals exercise their rights under the GDPR. Such providers focus exclusively on personal data and seek to enhance individual agency and individuals' control over the data pertaining to them, assisting them in exercising their GDPR rights, e.g. managing consent to processing, and access, portability or deletion requests.
For PrivPros: new opportunities to advise data intermediary providers
3) Allowing data use on altruistic grounds.
The DGA fosters the creation of data repositories. It allows companies that seek to support purposes of general interest by making available relevant data based on a data altruism model to register in a Data Altruism register.
Data subjects would consent to specific purposes of data processing, but could also consent to processing in certain areas of research or parts of research projects, since it is often not possible to fully identify the purpose of personal data processing for scientific research at the time of data collection.
For PrivPros: Data Altruism Organisations recognised in the Union are able to collect relevant data directly from individuals or to process data collected by others. Data altruism would rely on the consent of data subjects (art. 6(1)(a), 7 and 9(2)(a) #GDPR), and processing for this purpose should also comply with the requirements of art. 5(1)(d) and art. 89(1) GDPR concerning further processing (compatibility test) #research
Organisations should put in place effective technical and organisational measures (TOMs) to allow data subjects to modify their consent at any time.
American Data Privacy and Protection Act (ADPPA)
For those who haven't had the time to read about the new US federal proposal, there are two shortcuts to getting an overview of it.
Officine Dati webinar: Ricerca sì, ricerca no, ricerca boh?! La Terra dei dati (IT)
The protection of personal data in scientific research and clinical trials has always been a highly relevant topic: the balance between confidentiality and scientific and technological progress rests on a balancing of interests that is difficult to strike and even harder to keep current over time. The digitalisation of health data brings enormous potential and, at the same time, an inevitable increase in risks to the rights and freedoms of data subjects.
To what extent does current legislation allow us to seize the great opportunities that progress makes available? What are the main problems, and what are the potential answers for reaching a win-win solution? Purpose-compatibility assessments, secondary use, pseudonymisation and anonymisation, and the need for specific consent, transparency, and safeguards for rights.
We will discuss this and more together with:
• Guido Scorza | Member of the Italian Data Protection Authority (Garante)
• Arianna Greco | SVP Head Global Commercial Legal, Alnylam Pharmaceuticals
• Roberto Benedetto | Lawyer, DPO at Fondazione Santa Lucia
• Silvia Stefanelli | Lawyer, Studio Legale Stefanelli&Stefanelli
• Francesco Curtarelli | Senior Legal Consultant, Partners4Innovation
Moderators:
• Rosario Imperiali d’Afflitto | President, Officine Dati
• Anna Cataleta | Vice President, Officine Dati
No registration is required to attend. The event will be streamed live on the Officine Dati LinkedIn page and YouTube channel.
For more information, write to [email protected] or visit https://www.officinedati.org/
ARTIFICIAL INTELLIGENCE
Understanding Bias in AI for Marketing (IAB 2021)
This paper provides insight for business executives, technologists, legal and compliance officers, and platform users to understand their responsibilities in process development, deployment, and ongoing management of an AI-driven solution.
The guidance gives an overview of the most important aspects relevant stakeholders should consider when using #AI systems for marketing purposes to avoid unfair biases.
In addition to considering the conscious and unconscious assumptions, intentions, and the proposed applications of AI, companies should consider:
• Data volume
• Data quality
• Computing power
• Data #privacy and security risks
• Legal and regulatory risks
• Public and reputational risks
It also explains important concepts, and provides a checklist and a list of questions for each phase of an AI system’s lifecycle, to help identify and mitigate #bias:
• Phase: Awareness and Discovery
• Phase: Exploration, Solutions, and Design
• Phase: Development, Tuning, and Testing
• Phase: Activation, Optimization, and Remediation
Links in comments
Responsible AI in practice
Nissewaard is a Dutch municipality near Rotterdam. It planned to use an AI solution to support decisions concerning access to social benefits.
The Municipality of Nissewaard asked a company to audit the algorithm and the investigation was completed in June 2021.
Through this audit, the municipality wanted to provide additional transparency to residents and ensure that the system was functioning properly while respecting all legal and ethical values.
The investigation conducted by the private company showed that the algorithm had technical defects. After the first signals from the auditor about the malfunctioning, the municipality of Nissewaard decided to stop using this #algorithm.
The municipality communicated that no citizen had been harmed in any way.
About me
I'm a data protection consultant currently working for White Label Consultancy. I previously worked for TNP Consultants and Data Business Services. I have an LL.M. (University of Manchester), and I'm a PhD candidate (Bocconi University, Milano). As a PhD researcher, my research deals with the potential and challenges of the General Data Protection Regulation to protect data subjects against the adverse effects of Artificial Intelligence. I also serve as a teaching assistant in two courses at Bocconi University.
I'm the author of “Data Protection Law in Charts. A Visual Guide to the General Data Protection Regulation”, an e-book released in 2021. You can find the book here