UK GDPR Reforms Back on the Table, Noyb Targets Microsoft's Xandr, California Tightens CCPA Rules

Privacy Corner Newsletter: July 18, 2024

By Robert Bateman and Privado.ai

In this edition of the Privacy Corner Newsletter:

  • The new UK government will revive plans to reform the UK GDPR—but which of the previous government’s proposals survive?
  • Noyb’s latest complaint targets Xandr, a Microsoft-owned adtech firm that rejected nearly 2,000 data requests last year
  • Proposed CCPA Regulations would introduce cybersecurity audits, add new definitions, and change the law’s application threshold
  • What we’re reading: Recommended privacy content for the week.




The UK’s new government will resurrect parts of its predecessor’s controversial data protection reform bill

The newly elected UK government has set out its legislative agenda, including the introduction of a Digital Information and Smart Data Bill (DISDB) that will apparently include elements of the previous government’s Data Protection and Digital Information Bill (DPDIB).

  • The last government’s DPDIB would have reformed the UK’s data protection and privacy framework, but the bill failed to pass before the new government was elected on 4 July.
  • In Wednesday's “King’s Speech”, a ceremony in which King Charles read a speech prepared by the government to begin the new parliamentary session, the government set out its plans to introduce 39 new bills, including some relating to data protection and AI.
  • The DISDB would restructure the UK Information Commissioner’s Office (ICO), liberalize data protection rules on scientific research, and introduce a digital verification scheme, among other proposals.

So the DPDIB is back?

Not exactly. While we don’t yet have a copy of the government’s planned DISDB, it appears that the bill will incorporate some of the DPDIB’s provisions, including:

  • The restructuring of the ICO into a board with a CEO and Chair—but with “new, stronger powers” that might not have appeared in the DPDIB.
  • A provision enabling scientific researchers to request “broad consent” for a series of processing activities that might normally require separate consent.
  • A new definition of “scientific research” that explicitly includes commercial, privately-funded research.
  • The establishment of a government-certification process for Digital Verification Services.
  • An amendment to the UK’s Online Safety Act enabling coroners to access information from the social media accounts of recently deceased children.
  • A “Smart Data” scheme that appears to draw from Part 3 of the DPDIB (“Customer data and business data”), which relates to open banking and similar processes.

The speech also mentioned “targeted reforms to some data laws”, which could mean that other DPDIB provisions return under the new bill.

What about the AI bill?

Prior to the King’s Speech, the government had mentioned “AI legislation” impacting larger tech companies, leading many to predict the introduction of an AI bill akin to the EU AI Act. But no AI bill has emerged.

In February, Member of Parliament (MP) Peter Kyle (now Secretary of State for Science, Innovation and Technology) discussed a “statutory code” requiring AI companies to “share testing data” with the government.

A statutory code is not a bill, which could explain its absence from the list of 39 bills announced in the King’s Speech.

The government also says its proposed Product Safety and Metrology Bill will “enable the UK to keep pace with technological advances, such as AI”.

So it appears that some form of AI-specific regulation is planned in the UK—even if it isn’t quite as extensive as the EU AI Act (which, incidentally, was published in the EU’s Official Journal last week).




Noyb’s latest complaint targets Xandr, a Microsoft-owned adtech firm that rejected nearly 2,000 data requests last year

Privacy campaign group noyb has submitted a complaint to the Italian Data Protection Authority (DPA) against Xandr, an adtech company owned by Microsoft.

  • Xandr, which was acquired by Microsoft in 2021, operates a Real-Time Bidding (RTB) system and owns a “Demand-Side Platform” (DSP).
  • Xandr refused to facilitate the complainant’s access and erasure requests, claiming it could not identify him due to the pseudonymous nature of its data.
  • The complaint alleges that Xandr has “systematically” violated the GDPR’s data subject rights. Based on data provided by one of Xandr’s suppliers, the complainant also accuses the company of violating the “accuracy” principle.

What’s the background?

As noted above, Xandr is an adtech company owned by Microsoft. The company operates an ad-bidding system and a DSP that associates internet users with inferences about their preferences and characteristics (“market segments”) based on data collected via cookies.

The complainant in this case visited certain websites where cookies were placed on his device by:

  • Emetriq, a data broker that supplies data to Xandr
  • Xandr itself

Using each company’s respective cookie ID, the complainant made data subject rights requests to Emetriq and Xandr.
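
As a rough illustration, here is a minimal sketch, with invented field names and values rather than Xandr's or Emetriq's actual schema, of the kind of pseudonymous record at issue and the identifier a rights request against it would reference:

```python
# Illustrative only: invented field names and values, not Xandr's or Emetriq's schema.
# A DSP profile is keyed by a pseudonymous cookie ID rather than a name or email address.
profile = {
    "cookie_id": "4f2a9c81-example",
    "segments": ["age_30_39", "income_2000_3000_eur", "heavy_tv_viewer"],
}

# An access or erasure request against this data can only reference the cookie ID,
# because that identifier is the only link between the user and the profile.
access_request = {"cookie_id": profile["cookie_id"], "right": "access"}
print(access_request)
```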

How did the companies respond?

Emetriq came back with a list of market segments associated with the relevant cookie ID. The company revealed that it had inferred, reportedly within a period of two hours, that the complainant was:

  • A woman
  • A man
  • Aged 16-19, 20-29, 30-39, 40-49, 50-59, and over 60
  • In receipt of an income between €500 and €4,000 per month
  • A school pupil, student, job seeker, employed, and self-employed
  • Working for a company with 1-100, over 1000, and between 1001 and 5000 employees
  • A light, medium, and heavy TV viewer

Obviously, these inferences are contradictory. Noyb therefore alleges that Xandr, which receives data from Emetriq, violates the accuracy principle and misleads its customers regarding the quality of its data.
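
To make the accuracy point concrete, here is a minimal sketch, using paraphrased segment labels rather than Emetriq's actual taxonomy, showing that the reported profile assigns several values within groups that should be mutually exclusive:

```python
# Paraphrased segment labels; each group should contain at most one value per person.
reported_profile = {
    "gender": ["woman", "man"],
    "age_band": ["16-19", "20-29", "30-39", "40-49", "50-59", "60+"],
    "employment": ["school pupil", "student", "job seeker", "employed", "self-employed"],
    "tv_viewing": ["light", "medium", "heavy"],
}

# Any group with more than one value is internally contradictory.
contradictions = {group: values for group, values in reported_profile.items() if len(values) > 1}
print(contradictions)  # every group here is contradictory
```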

Xandr said it could not facilitate the complainant’s requests because its ad platform “contains consumers’ pseudonymous personal data and not personally identifiable information (such as name or plain text email address),” but later stated that it would delete the cookie ID provided by the complainant.

But the complainant provided pseudonymous data, not his name or email

Indeed, Xandr’s response does not appear to relate to the complainant’s request. On its website, Xandr publishes metrics relating to “consumer data requests”, revealing that, in 2022, Xandr:

  • Received 1,294 access requests and 660 deletion requests from consumers “globally”
  • Complied “in whole or in part” with zero access requests and zero deletion requests

Why is Xandr telling people this?

Xandr might be publishing these response rates as part of its obligations under the California Consumer Privacy Act (CCPA). Under the California Attorney General’s CCPA Regulations, certain larger businesses must publish metrics relating to compliance with the law’s consumer rights.

Xandr could also be relying on the CCPA’s definition of “deidentified” information to assert that it cannot associate noyb’s complainant with any data it holds. However, the company did agree to delete the identifier provided by the complainant.

Noyb has complained to the Data Protection Authority (DPA) in Italy, where the complainant lives, and argues that the Italian DPA should investigate Xandr directly rather than investigating its parent company, Microsoft.

While this argument is not unreasonable, a cynic might suggest it is noyb’s latest attempt to avoid the GDPR’s One-Stop Shop process, under which its complaint would be sent to Microsoft’s lead supervisory authority, the Irish DPA. The Italian DPA tends to be more decisive in its enforcement.




Proposed CCPA Regulations would introduce cybersecurity audits, add new definitions, and change the law’s application threshold

The California Privacy Protection Agency (CPPA) has published draft California Consumer Privacy Act (CCPA) Regulations with major implications for businesses subject to the law.

  • The CPPA’s proposals modify existing CCPA Regulations, build on earlier versions of automated decision-making technology (ADMT) rules, and set out the CCPA’s cybersecurity audit requirements.
  • Noteworthy provisions include a broad definition of “artificial intelligence”, corrections of regulations already in effect, and details of which businesses must conduct annual cybersecurity audits.
  • An Economic Assessment estimates that the draft regulations—if implemented as currently proposed—could cost California businesses over $4.2 billion.

What’s changed under these new proposed regulations?

The CPPA proposes many changes. Here are a few highlights, but bear in mind that we’re still at an early stage of the rulemaking process (despite being absurdly behind schedule).

  • There’s a new definition of “artificial intelligence”, derived from that of the Organisation for Economic Co-operation and Development (OECD), but with somewhat broader and more caveated language.
  • There’s a list of technologies that do not constitute ADMT—such as spreadsheets, spam filters, and antivirus software—unless they are used to circumvent the CPPA’s ADMT rules.
  • Various monetary amounts are being adjusted for inflation, effective retroactively from January 1, 2023, including statutory damages, civil penalties, and the revenue threshold that determines which companies constitute a “business”, which will rise from $25 million to $27.975 million per year (see the quick calculation after this list).
  • Some minor but important corrections, including to some regulations that are already in effect, most notably relating to privacy policy disclosures.
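
For context, the implied size of that inflation adjustment can be read straight off the two revenue figures above. This is just the ratio of the published numbers, not the CPI methodology the regulations actually prescribe:

```python
# Implied cumulative adjustment behind the $25M -> $27.975M threshold change.
old_threshold = 25_000_000
new_threshold = 27_975_000
adjustment = new_threshold / old_threshold - 1
print(f"{adjustment:.1%}")  # 11.9%
```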

What about the cybersecurity audits?

As currently proposed, the following types of businesses will need to conduct annual cybersecurity audits (the threshold logic is sketched in code after the list):

  • Businesses that derive 50% or more of their revenue from selling personal information, and
  • Businesses with annual revenues of at least $25 million (as noted, this amount will soon rise to nearly $28 million) that process personal information about at least 250,000 consumers or process sensitive personal information about at least 50,000 consumers.
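
Sketched below is that threshold logic as summarized in the two bullets above; the helper function and its parameter names are illustrative, not drawn from the draft regulations:

```python
# Illustrative sketch of the proposed audit threshold; names are ours, not the CPPA's.
# Either bullet on its own brings a business into scope, per the list above.

def must_audit(
    annual_revenue: float,
    share_of_revenue_from_selling_pi: float,
    consumers_processed: int,
    consumers_sensitive_pi: int,
    revenue_threshold: float = 25_000_000,  # rising to ~$27.975M under the same draft
) -> bool:
    """Return True if a business would fall under the annual cybersecurity audit requirement."""
    sells_pi_for_half_of_revenue = share_of_revenue_from_selling_pi >= 0.5
    large_processor = annual_revenue >= revenue_threshold and (
        consumers_processed >= 250_000 or consumers_sensitive_pi >= 50_000
    )
    return sells_pi_for_half_of_revenue or large_processor

# Example: a $30M business processing personal information about 300,000 consumers.
print(must_audit(30_000_000, 0.1, 300_000, 0))  # True
```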

The CPPA estimates that 25,352 businesses in California will meet this threshold (note that many businesses outside California will, too).

Covered businesses will have to conduct their first audit within two years of the regulations taking effect. Thereafter, businesses must submit a certification to the CPPA each year confirming that they have carried out the audit.

The proposed cybersecurity audit requirements are too extensive to list here in detail. Businesses can appoint a suitable person internally—subject to certain safeguards to ensure their independence—or hire an external auditor.

An economic assessment suggests that the cybersecurity audit proposals alone would cost California businesses over $2 billion, with the total economic impact of the proposed regulations coming in at over $4.2 billion. And that’s just for businesses based in California.

As such, we can expect some resistance to these proposals from groups like the California Chamber of Commerce—and support from California-based tech consultancies.

What We’re Reading
