Setbacks and backpedals
"Backpedaling" by Alex Krylov via MS Creator


Lucid folks,

OpenAI is going through a shakeup as it pivots toward a more overtly profit-driven business model. Mira Murati’s exit is worth lamenting, but in the world of boardroom politics it was bound to happen. As CEO Sam Altman cements his power and appreciable equity ($150M) in the post-coup organization, we’ll borrow an aphorism Prof. Scott Galloway used recently: if you aim at the king, you had better not... hire him back.

Moving on, in this issue:

  • Firefox finds itself on the wrong side of the consent debate
  • Listen to Heidi, check your AI-powered martech SaaS?
  • Desperate model training times call for backpedaling

…and more.

From our bullpen to your screens,

Colin O'Malley & Lucid Privacy Group Team

With Alex Krylov (Editor/Lead Writer), Ross Webster (Writer, EU & UK), Raashee Gupta Erry (Writer, US & World), McKenzie Thomsen, CIPP/US (Writer, Law & Policy)


If this is the first time you’re seeing our Privacy Bulletin in your feed, give it a read and let us know what you think. For more unvarnished insights, visit our Blog.

Your comments and subscriptions are welcome!


Firefox Fumbles Ads Attribution Prototype, Faces Fire Over Choice

When needs must, Mozilla can’t turn away advertiser dollars either. The latest scandal involving Firefox’s Privacy Preserving Attribution once again pulls browsers -- their business models and place in the privacy debate -- into the spotlight.

What happened: Last week, the Greenpeace of privacy advocacy, NOYB, filed a complaint with the Austrian Data Protection Authority. The group objected to the manner in which Firefox released its Privacy Preserving Attribution (PPA) feature for advertisers.

  • NOYB objected that PPA was turned on by default, that no upfront notice was offered upon install, and that the PPA on/off checkbox was buried far ‘below the fold’ of Firefox’s Privacy & Security settings page.
  • According to Felix Mikolasch of NOYB: “Mozilla has just bought into the narrative that the advertising industry has a right to track users by turning Firefox into an ad measurement tool. While Mozilla may have had good intentions, it is very unlikely that 'privacy preserving attribution' will replace cookies and other tracking tools. It is just a new, additional means of tracking users.”


Source: Firefox settings page

How it works: Firefox’s PPA is similar in concept to Google Privacy Sandbox’s Attribution Reporting API. The aim? To correct the leakiness of cookies by exposing as little data as possible to programmatic advertisers while still offering them (hopefully) useful data to gauge ROI.

  • PPA allows advertisers to understand ad performance without tracking users across sites, avoiding the collection of detailed personal data.
  • As an agent of the user, literally and figuratively, FF would ‘remember’ that an ad was served, what it was about, and what website it pointed to.
  • This data is encrypted and aggregated with other data, and injected with noise a la differential privacy.
  • The fuzzed roll-up report is then shared with advertisers using FF’s Distributed Aggregation Protocol (DAP), which accounts for a number of other privacy and security attack vectors. A conceptual sketch of the aggregate-and-noise idea follows below this list.
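To make the aggregate-then-noise idea concrete, here is a minimal Python sketch of a differentially private attribution report. To be clear, this is not Mozilla’s PPA or DAP code; the function names, the Laplace mechanism, and the epsilon value are illustrative assumptions about how this style of measurement can work in general.

```python
# Illustrative sketch only -- NOT Mozilla's PPA/DAP implementation.
# Shows the general "aggregate conversions, then add noise" idea behind
# privacy-preserving attribution reporting. All names/parameters are hypothetical.
import random
from collections import Counter


def laplace_noise(scale: float) -> float:
    # The difference of two Exp(1) draws, scaled, is Laplace(0, scale) distributed.
    return scale * (random.expovariate(1.0) - random.expovariate(1.0))


def noisy_attribution_report(conversions: list[str], epsilon: float = 1.0) -> dict[str, float]:
    """Aggregate per-campaign conversion events and release only noisy totals.

    Each event adds at most 1 to a single campaign's count, so the query's
    sensitivity is 1 and the Laplace noise scale is 1/epsilon.
    """
    counts = Counter(conversions)
    scale = 1.0 / epsilon
    return {campaign: count + laplace_noise(scale) for campaign, count in counts.items()}


# Example: browsers would report which campaign's ad preceded a purchase;
# advertisers only ever see the fuzzed aggregate, never per-user events.
events = ["shoes_campaign"] * 40 + ["travel_campaign"] * 12
print(noisy_attribution_report(events, epsilon=0.5))
```

In the real protocol, encryption and split-server aggregation (DAP) keep any single party from seeing individual reports; the sketch above captures only the aggregate-and-noise step.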

Why it matters: On the surface, the issue is about transparency and choice. NOYB is right that Firefox buried the lede and presumed choice on behalf of its users. Looking deeper, the tiff is as much about clashing ideologies as it is about Mozilla potentially losing the roughly 83% of its revenue that comes from Google. Ideologically, Mozilla is anti-tracking, and that has long been its key differentiator from Chrome. The product team views PPA as a statement to advertisers that they can buy accountable ads without user tracking.

Where things went sideways: In a charged Mastodon thread, Mozilla's Performance Tech Lead, Bas Schouten, defended Firefox’s no-notice-and-choice decision.

  • “Opt-in is only meaningful if users can make an informed decision. I think explaining a system like PPA would be a difficult task. And most users complain a lot about these types of interruption. In my opinion an easily discoverable opt-out option + blog posts and such were the right decision.”
  • Mastodonians were quick to counter that, at roughly 3-4% of the browser market, the privacy-touting Firefox is favored by users who can make such decisions, and have “already skipped over [numero uno] Chrome for a reason.”
  • And, that despite Google’s own UX heartburn, “literal google chrome the first thing it shows you post install these days is a screen and a half of text informing the user about their new advertising thing with a link to the relevant setting it’s not good enough but miles better than what mozilla did.” [sic]

Zooming out: Schouten isn’t wrong on principle. There is an ongoing debate over the sustainability of notice-and-choice regimes, and over the quality of consent in scenarios where the impacts and tradeoffs of processing are too subjective or murky to offload onto users. But the commentariat is also right. Prototype or not, Firefox’s decision went a step too far towards unwanted paternalism, and a step backwards for the brand’s image. Users can handle the truth… including that ‘zilla’s red fox can eat crow too.

-AK


Reads and Listens

Another week, another set of picks from our personal queues:

  • ‘Masters of Privacy’ Pod: Heidi Saas Talks AI Compliance for Marketers. In the latest drop, Sergio sits down with Heidi to burst a few bubbles and steer martech towards greater awareness of the tools it is adopting at an accelerated pace. In framing her views, Heidi draws an important distinction often lost in the technobabble shuffle: machine learning (ML) is not the same as deep learning, which in the case of Generative AI relies on a neural network. We won’t spoil the pod, but at a bare minimum, the business user of the AI needs to be clear about who is responsible for what contractually, and how to communicate privacy information to downstream consumers.
  • CPPA Preso: California Privacy Agency Ideates Delete Act Mechanism. Put a pin in this one, folks. On Friday, Oct 4, the CPPA will be discussing its vision for DROP, the "Delete Request and Opt-Out Platform" the Agency is tasked with developing. Will we see probing and pushback from CCPA/CPRA author Alastair Mactaggart or other CPPA Board members? Alastair has been critical of proposed rules regarding privacy risk assessments, which he thought were overly burdensome on smaller businesses. Then again, he may not be so inclined when it's about data brokers…

Source: California Privacy Protection Agency

-AK


Other Happenings

  1. LinkedIn Backpedals on AI Model Training. LinkedIn just hit reverse after trying to sneak in an "opt-out" policy that automatically used your data to train AI models. The backlash from users and privacy groups, especially in the UK and Europe, was immediate. Cue the U-turn: LinkedIn has now paused the policy in those regions. Even big players like LinkedIn are making desperate moves to stay ahead in the AI race, but it turns out messing with user trust is a risky game. Lesson learned? Hopefully!
  2. Less Is More or Less: Europe’s AI Entrepreneur Paradox. Europe has an AI problem, and it’s not the one they’d like to have. On the one hand, Europe needs to reduce the regulatory burden on homegrown startups without looking overly protectionist. On the other, Europe needs to enforce its Digital Europe policies without driving out established US companies, or chilling investment in local startups and critical compute infrastructure. The Pilgrims left for a reason too.
  3. Microsoft Offers Details on ‘Recall’ PrivSec Improvements. The reveal touts a new security architecture with protections like a VBS (Virtualization-Based Security) Enclave ‘hypervisor’, Windows Hello-based biometric access controls, and a range of defenses to mitigate attacks against Recall’s kernel and processes. Snapshot privacy also gets a boost: sensitive content filtering is on by default, and users will still be able to control what’s captured and how much/how long content is retained. Nice rally by MS, pending further tests. But a key question still remains -- who exactly is this feature for? Our bet is on professional users.
  4. CNIL Offers Guide on Mobile App Privacy Protection. The French Data Protection Authority (CNIL) is urging app devs to improve their transparency and consent UX practices, with a tacit reminder to watch for forced consent and fibs in notices. CNIL, who leads the EDPB’s Cookie Taskforce, is sending a veiled message to Apple and Google too. Operating system permissions alone may not meet GDPR standards, and additional consent mechanisms (like Consent Management Platforms) may be needed. CNIL has promised to monitor compliance through audits and enforcement actions in 2025.

-RW, RGE

