Legitimate disinterests

Lucid folks,

If you are reading this issue from the US, our October surprises may be coming from across the Pond. There are politics involved to be sure, but not those that should have much bearing on our November nail-biter.

Speaking of politics, Cambridge Analytica continues to be a wake-up call for tech giants and democracy alike. An explosive exposé from The Observer reveals how Facebook data was weaponized to sway elections and manipulate public opinion in the run-up to the 2016 election. With revelations that cut to the heart of privacy and ethics in the digital age, it is a must-read for anyone concerned about the future of democratic integrity in an increasingly connected world.

Moving on, in this issue:

  • EDPB clarifies conditions for using Legitimate Interests
  • EDPB issues first report on GDPR vs GenAI
  • UK and US team up to tackle kids’ online safety

…and more.

From our bullpen to your screens,

Colin O'Malley & Lucid Privacy Group Team

With Alex Krylov (Editor/Lead Writer), Ross Webster (Writer, EU & UK), Raashee Gupta Erry (Writer, US & World), McKenzie Thomsen, CIPP/US (Writer, Law & Policy)


If this is your first time seeing our Privacy Bulletin in your feed, give it a read and let us know what you think. For more unvarnished insights, visit our Blog.

Your comments and subscriptions are welcome!


EDPB to Ad Businesses: You Can Rely on ‘Legitimate Interests’, Maybe

The European Data Protection Board (EDPB), the entmoot of EU DPAs, has adopted a draft set of Guidelines on the use of Legitimate Interest (LI) as a legal basis. The draft is open for public comment through Nov 20, 2024.

Why it matters: The Guidelines reinforce the conditions for LI’s acceptable use, its relationship to the other GDPR legal bases (e.g., contractual necessity, consent), and its interplay with the only legal basis available under the specialist ePrivacy Directive (ePD): GDPR-grade consent.

  • Importantly, the EDPB clarifies that targeted advertising is indeed a form of ‘direct marketing’ and as such may rely on LI. At least in theory, and with Sartre-esque caveats.
  • Moreover, the EDPB echoes the CJEU’s C-621/22 (KNLTB) ruling, which confirms that “purely commercial interests”, such as to derive revenue from ads, cannot be automatically dismissed as illegitimate by DPAs.

What they said: In short, details and context matter when assessing whether LI is the best and justified legal basis for a specific activity, particularly for gray-area activities like direct marketing and fraud prevention.

  • Pseudo/mass = ‘direct’. To qualify as ‘direct marketing’, digital solicitations must (1) be a communication, (2) have a commercial purpose (i.e., to encourage a transaction), and (3) be addressed to a consumer or group of consumers. Meaning, ads don’t need to be microtargeted to be ‘direct’ enough for the purposes of the GDPR and the overlapping ePD.

“...the CJEU found that it is irrelevant whether the advertising at issue is addressed to a predetermined and individually identified recipient or is sent on a mass, random basis to multiple recipients.”

  • Not an “open door”. The ad industry should not overuse LI out of convenience or as a default, and should remember that LI is an inherently restrictive, pro-user basis.

“The open-ended nature of Article 6(1)(f) GDPR does not necessarily mean that… [LI should be] seen as a preferred option by controllers and its use should not be unduly extended to circumvent specific legal requirements or because it would be considered as less constraining than the other legal bases in Article 6(1) GDPR.”

  • Freedoms and interests. Modern marketing is data-rich and tech-sophisticated, and moving quickly to reap the promises of AI.

“The explicit reference to “interests or fundamental rights and freedoms” in Article 6(1)(f) GDPR has a direct impact on the balancing test to be carried out under that provision. It provides more protection for the data subject, as it requires the data subjects’ “interests” to be taken into account, not only their fundamental rights and freedoms…"

  • Specific needs. The EDPB stresses that even venerable purposes like fraud prevention can fail the three-pronged LI balancing test if the use case is nebulously defined. Is the activity to detect and prevent sophisticated ad fraud caused by bots and spiders? You need to spell it out to gauge if there may be some outsized impact on humans being checked for their… humanness. (Yes, when using hidden CAPTCHAs too.)

With all this in mind, the EDPB is right to recast Legitimate Interest Assessments (LIAs) as a DPIA threshold assessment. (A minimal sketch of that threshold logic follows the quote below.)

“...Moreover, if high risks are identified in the context of this assessment, the controller should consider performing a Data Protection Impact Assessment (DPIA) in accordance with Article 35 GDPR.”
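
To make that thresholding idea concrete, here is a minimal, hypothetical sketch of how a privacy team might encode the “LIA first, DPIA if high risk” workflow. The names (LIAOutcome, requires_dpia, the fraud-check example) are ours, not the EDPB’s, and real assessments remain legal judgments rather than boolean flags.

# Hypothetical sketch only: illustrative names, not EDPB-prescribed logic.
from dataclasses import dataclass
from enum import Enum


class Risk(Enum):
    LOW = "low"
    MEDIUM = "medium"
    HIGH = "high"


@dataclass
class LIAOutcome:
    purpose: str                        # e.g. a specific, spelled-out activity
    purpose_is_specific: bool           # nebulous purposes fail at the first hurdle
    necessity_shown: bool               # no less-intrusive means would work as well
    balancing_favors_controller: bool   # weighs data subjects' rights AND interests
    residual_risk: Risk


def legitimate_interest_viable(lia: LIAOutcome) -> bool:
    """All three prongs (purpose, necessity, balancing) must hold for LI to be available."""
    return lia.purpose_is_specific and lia.necessity_shown and lia.balancing_favors_controller


def requires_dpia(lia: LIAOutcome) -> bool:
    """Per the Guidelines' logic, a high-risk LIA outcome should trigger an Art. 35 DPIA."""
    return lia.residual_risk is Risk.HIGH


fraud_check = LIAOutcome(
    purpose="detect and prevent bot-driven ad fraud (hidden CAPTCHA)",
    purpose_is_specific=True,
    necessity_shown=True,
    balancing_favors_controller=True,
    residual_risk=Risk.HIGH,
)

if legitimate_interest_viable(fraud_check) and requires_dpia(fraud_check):
    print("LI may be available, but run an Article 35 DPIA before relying on it.")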

Unresolved issues: While the Guidelines provide helpful refreshers and illustrative examples that bring together decades of interpretive guidance, first under the 1995 Data Protection Directive and now the GDPR, the text continues to perpetuate one of the more vexing tendencies of the Board -- to open a door with one hand and slam it shut with the other.

  • Case in point, in Example 5 the EDPB once again asserts that social platform users “cannot reasonably expect that the operator of the social network will process that user’s personal data, without his or her consent, for the purposes of personalised advertising.”

Really? Not ever? Not even for product improvement? (See below)

Source: EDPB

  • As we discussed in our April 23rd issue, the EDPB believes that “large online platforms” -- which may or may not include scaled publishers -- can only rely on consent as their legal basis for targeted ads, first to obtain necessary data under the ePrivacy Directive, and then for the subsequent processing of said data.

This is because Meta (1) can’t condition ad personalization on service access, (2) can’t claim LI unless it can prove personalization is “objectively indispensable”, and then can’t satisfy that test because (3) personalization is never “indispensable” when less invasive options like contextual ads are available, even if they are less profitable.

Zooming out: On the one hand, the EDPB opens an important door for using Legitimate Interests, and perhaps for more than just security and anti-fraud. But LI for processing must still be “without prejudice” to Consent under the parallel ePrivacy Directive, which the GDPR defers to on requirements for digital direct marketing and related tracking. Here lies the gotcha. To the extent the EDPB wants to expand the role of ePD Consent to even more digital data flows, that LI door is a tight squeeze… In other words…

Blues Traveler. "Run-Around" (1994)

-AK


FPF: A Look at the EDPB AI Taskforce’s First Report

Brace yourself, folks—GenAI's GDPR moment is here. The EU is getting serious about AI regulation, and this report on the GDPR's first steps with ChatGPT could change the game. It’s a big moment for tech accountability, and if you're in the AI space, you better pay attention.

This piece by the Future of Privacy Forum’s fab Dr. Gabriela Zanfir-Fortuna breaks down what the EDPB’s ChatGPT Taskforce’s first report means for anyone building or using GenAI.

GZF’s highlights below.


Source: X

Other Happenings

  1. California’s New SB 976: On Parenting, Now With Algorithmic Overlords. California’s SB 976 has landed, adding to the growing pile of child-focused privacy laws. Starting January 1, 2025, social media platforms will need parental consent to push “addictive feeds” to minors. No sneaky notifications during school hours either—without mom or dad’s OK, that TikTok ping will have to wait. With familiar terms like “Verifiable Parental Consent” and “Actual Knowledge” thrown in, it feels like COPPA's more demanding cousin. Enforcement begins in 2027.
  2. US & UK Issue Joint Statement on Children's Online Safety. The UK and US governments are urging online platforms to enhance protections for children's safety and privacy. They plan to form a working group to tackle issues like sexual exploitation, cyberbullying, and harmful content. The collaboration aims to establish global standards prioritizing children’s wellbeing while maintaining a free and secure internet, emphasizing the need for platforms to take swift and stronger actions in safeguarding young users. This could also mean additional funding for NCMEC.
  3. New AdTech Industry Trade Group Launches in the UK. Enter the Coalition for Privacy Compliance in Advertising (CPCA). Founded in 2024 by Mattia Fosci, the legal and privacy expert turned entrepreneur behind Anonymised, the new trade group aims to make sense of and certify the practices of the ad technology ecosystem. The cert is in partnership with the UK ICO, a first for the industry. Proponents see hopes of regulatory clarity, but doubts linger. Can the cert appease EDPB members across the Channel? Will companies just tick boxes without real changes and tick off the ICO? With launch slated for 2025 and pending ICO approval, Lucid will monitor closely how this ad venture unfolds.
  4. Cloud Storage: Safe Haven or Liability Landmine? Is your data truly safe in the cloud? Think again. Even tech giants like AT&T learned the hard way that entrusting personal data to third-party vendors can lead to messy—and costly—consequences. When a vendor leaked over eight million customer records, AT&T faced a hefty $13 million fine from the FCC. The irony? AT&T believed they had all their bases covered with contracts, audits, and oversight. Yet, cloud misconfigurations and vendor errors were the prime culprits. The harsh reality? Even when the vendor slips up, your organization foots the bill. The bottom line: trust, but verify—and then verify again; see the sketch after this list. (Cue “Livin’ on the Edge”.)
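
On the “verify, and then verify again” point, here is a minimal sketch of what programmatic verification can look like, assuming the data sits in an AWS S3 bucket and boto3 credentials are already configured. The bucket name “customer-exports” is hypothetical, and a real vendor-oversight program would check far more than one setting.

# Minimal sketch, not a compliance tool: assumes AWS S3 via boto3 with valid
# credentials; "customer-exports" is a hypothetical bucket name.
import boto3
from botocore.exceptions import ClientError

REQUIRED_FLAGS = (
    "BlockPublicAcls",
    "IgnorePublicAcls",
    "BlockPublicPolicy",
    "RestrictPublicBuckets",
)


def bucket_is_locked_down(bucket: str) -> bool:
    """Verify rather than trust: confirm every S3 public-access block is enabled."""
    s3 = boto3.client("s3")
    try:
        config = s3.get_public_access_block(Bucket=bucket)["PublicAccessBlockConfiguration"]
    except ClientError:
        # No public-access-block configuration at all is itself a finding.
        return False
    return all(config.get(flag, False) for flag in REQUIRED_FLAGS)


if __name__ == "__main__":
    if not bucket_is_locked_down("customer-exports"):
        raise SystemExit("Bucket not fully locked down: escalate before storing customer data.")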

-RGE, RW, AK

