When Privacy by Design is Forgotten
Luiza Jarovsky
Co-founder of the AI, Tech & Privacy Academy, LinkedIn Top Voice, Ph.D. Researcher, Polyglot, Latina, Mother of 3. Join our AI governance training (1,100+ participants) & my weekly newsletter (55,000+ subscribers)
Two recent episodes have made me question whether tech companies still take privacy by design seriously, and whether GDPR Article 25 means anything to them at all.
As a reminder, privacy by design, the framework developed by Dr. Ann Cavoukian, has seven main principles:
1. Proactive, not Reactive
2. Privacy as the Default Setting
3. Privacy Embedded into Design
4. Full Functionality — Positive-Sum, not Zero-Sum
5. End-to-End Security — Full Lifecycle Protection
6. Visibility and Transparency
7. Respect for User Privacy — Keep it User-Centric
Last month, Dr. Cavoukian was a guest at my 'Women Advancing Privacy' live event; you can listen to the recording of our conversation about privacy by design in the age of AI.
The GDPR has officially adopted a similar concept: data protection by design and by default. GDPR Article 25 requires controllers to implement appropriate technical and organisational measures designed to implement data protection principles effectively, and to ensure that, by default, only personal data necessary for each specific purpose of the processing are processed.
Having said that, recent events have made me question if privacy by design actually means anything in practice.
The first event was OpenAI's overall data protection strategy for ChatGPT.
In this newsletter, I have discussed extensively the possible privacy issues involved in AI-based chatbots. From the infringement of data protection principles to reputational harm, negligence with data subjects' rights, dark patterns in AI, and disregard for fairness, it is clear that the AI chatbots now broadly available to the public need a solid risk assessment, privacy assurances, and privacy by design.
Nevertheless, OpenAI has not been explicit or specific on how they will deal with issues that I and various other privacy professionals have been raising in the last few months. There have not been meaningful public announcements or clarifications, and recently there was a privacy incident where people's chat histories were exposed to other users. After this incident, a more obvious change I noticed was this warning before someone could access ChatGPT.
It is good that they took one (small) step towards more transparency. But only that, and only now? Do they have nothing else to tell people, or to embed into the design of their product, so that we can have better privacy assurances and transparency? As I questioned on social media: is the new privacy paradigm privacy by pressure? Privacy by reaction?
Perhaps as a result of this lack of proactivity, data protection authorities have recently started acting. The Italian Data Protection Authority ("Garante per la Protezione dei Dati Personali") imposed an immediate temporary limitation on the processing of Italian users' data by OpenAI. It looks like more authorities - at least in Europe - will follow suit.
Despite these expected enforcement-related advances, I am still in disbelief about why companies like OpenAI do not take privacy by design more seriously.
The second recent episode - indeed very recent, as I first heard about it yesterday - was Meta's new publicly available "opt-out" created to allow people, in theory, to opt out of data processing for behavioral advertising.
If you are a privacy professional, you are probably familiar with the recent episodes in the context of the Max Schrems/noyb vs Meta privacy litigation. It was recently decided that Meta could not use 'contract' as a lawful basis to collect and process user data, so they seem to be now relying on 'legitimate interest' (although noyb is legally questioning that).
To consolidate the legitimate interest strategy, they now offer an opt-out form so that those interested can request not to have their data processed in the context of behavioral advertising. The problem is: such an opt-out form is hidden and difficult to understand. According to noyb:
"Instead of providing a simple "switch" or button to opt-out, Meta requires users to fill out a hidden form. In this form users have to argue why they want to perform an opt-out. Users have to identify each purpose for which Meta argues a 'legitimate interest' and then explain why Meta's assessment - which is not public - was wrong in their individual case. It is highly unlikely that any normal user would be able to argue these points effectively."
Aren't transparency & fairness data protection principles? Shouldn't it be easy for users to opt out of targeted advertising? Does privacy by design matter at all?
Consumers say privacy is important and that they are concerned about how companies use data about them. Privacy by design, privacy UX, and a privacy-enhancing user experience are great tools to help implement a privacy compliance plan. It is unclear to me why some tech companies do not yet see it that way. Are they waiting for more fines?
Interested in diving deeper into Privacy by Design and Privacy UX? The course Privacy UX: The Fundamentals is coming soon; sign up for the waitlist and get 20% off when it's launched. Visit our site to check our professional privacy courses.
--
Upcoming events
In the next edition of 'Women Advancing Privacy', I will discuss with Prof. Nita Farahany her new book "The Battle for Your Brain: Defending the Right to Think Freely in the Age of Neurotechnology," as well as issues related to the protection of cognitive liberty and privacy in the context of current AI and neurotechnology challenges.
Prof. Farahany is a leader and pioneer in the field of the ethics of neuroscience. This will be a fascinating conversation that you cannot miss. I invite you to sign up for our LinkedIn live session and bring your questions.
To watch our previous events (the latest one was with Dr. Ann Cavoukian on Privacy by Design), check out my YouTube channel.
--
Podcast
In the latest episode of The Privacy Whisperer Podcast, I spoke with Romain Gauthier, the CEO of Didomi, about:
This was a fascinating conversation. If you work in the tech industry, are a privacy professional, or are an entrepreneur, you cannot miss it. Listen now.
--
Trending on social media
Which one is your favorite? Answer here.
--
Privacy & data protection job search
We have gathered various links from job search platforms and privacy-related organizations on our Privacy Careers page. We are constantly adding new links, so bookmark it and check it once a week for new openings. Wishing you the best of luck!
--
Before you go:
See you next week. All the best, Luiza Jarovsky