CPPA Advises on Data Minimization, Google Deletes Incognito Records, UK ICO Focuses on Children's Privacy
By Robert Bateman and Privado.ai
In this week’s Privacy Corner Newsletter:
California regulator publishes advisory on data minimization and consumer rights
The California Privacy Protection Agency (CPPA) has published an “enforcement advisory” about how the California Consumer Privacy Act’s (CCPA) data minimization principle applies to verifiable consumer requests.
What’s an ‘enforcement advisory’?
According to the CPPA’s Enforcement Division, an enforcement advisory addresses selected provisions of the CCPA and the CCPA Regulations in response to compliance trends. It does not:
The CPPA’s Deputy Director of Enforcement says enforcement advisories are meant to encourage voluntary compliance, but “sometimes stronger medicine will be in order”.
So how does data minimization apply to consumer rights requests?
Data minimization underpins all CCPA-covered activities, requiring the “collection, use, retention, and sharing” of personal information to be “reasonably necessary and proportionate” to achieve a specific purpose. The CCPA Regulations expand on the principle and apply it in various contexts.
The CPPA’s advisory is a less formal look at the sorts of questions businesses should ask themselves when verifying a consumer rights request.
The central question is: What personal information, if any, should we collect to verify that the consumer is who they say they are?
What’s the answer?
The answer varies depending on the context.
For example, when processing a consumer’s request to opt out of the sale and sharing of their personal data, the advisory suggests that a business should ask itself the following questions:
Does the CPPA recommend a particular verification method?
The advisory says that businesses should “establish, document, and comply with” a “reasonable” verification method, but there are few hard-and-fast rules.
The CCPA Regulations are clearest regarding the processing of Opt-Out Preference Signals (OOPS): “A business shall not require a consumer to provide additional information beyond what is necessary to send the signal.”
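One widely cited example of such a signal is Global Privacy Control (GPC), which a browser sends automatically as a simple HTTP header. The sketch below is ours, not the Regulations’ or the advisory’s; it just illustrates that honoring the signal requires nothing extra from the consumer (the header name comes from the GPC specification, while the function and example values are assumed).

```typescript
// Hypothetical sketch: honoring an opt-out preference signal (here, Global
// Privacy Control) without asking the consumer for anything beyond the
// signal itself. Per the GPC specification, the browser sends the signal
// as the HTTP header `Sec-GPC: 1`.

type HeaderMap = Record<string, string | undefined>;

/** Returns true if the incoming request carries a GPC opt-out signal. */
function hasOptOutSignal(headers: HeaderMap): boolean {
  // HTTP header names are case-insensitive, so compare in lowercase.
  for (const [name, value] of Object.entries(headers)) {
    if (name.toLowerCase() === "sec-gpc" && value === "1") return true;
  }
  return false;
}

// Example: the signal alone is enough to apply the opt-out for this browser;
// no government ID, account creation, or extra form fields are needed.
const exampleHeaders: HeaderMap = { "Sec-GPC": "1" };
if (hasOptOutSignal(exampleHeaders)) {
  console.log("Treat this visitor as opted out of sale/sharing.");
}
```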
But nuance arises in more complex scenarios.
For example, suppose a consumer makes their request via an email address that the business already associates with the consumer. If the request relates to low-risk personal information, matching this email address might be enough to satisfy the business that the request is genuine.
But email addresses can be spoofed. A “more stringent verification process” might be warranted for a request involving sensitive personal information. Deleting sensitive information following a fraudulent request could cause serious harm. Perhaps collecting government ID would be appropriate.
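To make that risk-based logic concrete, here is a purely hypothetical sketch of how a business might map request type and data sensitivity to a verification tier. The tiers, names, and thresholds below are our own assumptions, not anything prescribed by the CPPA.

```typescript
// Hypothetical sketch of a risk-based verification policy. The request types,
// tier names, and mappings are illustrative assumptions, not CPPA rules.

type RequestType = "opt_out" | "access" | "delete";
type VerificationTier = "none" | "match_known_email" | "more_stringent";

interface ConsumerRequest {
  type: RequestType;
  involvesSensitiveInfo: boolean; // e.g. health data, precise geolocation
  emailMatchesRecords: boolean;   // address already associated with the consumer
}

function requiredVerification(request: ConsumerRequest): VerificationTier {
  // Opt-out requests should not demand more information than is needed
  // to process them.
  if (request.type === "opt_out") return "none";

  // Deleting or disclosing sensitive information in response to a fraudulent
  // request could cause serious harm, so a more stringent process may be
  // warranted (perhaps including government ID).
  if (request.involvesSensitiveInfo) return "more_stringent";

  // For lower-risk information, matching a known email address may suffice.
  return request.emailMatchesRecords ? "match_known_email" : "more_stringent";
}

// A deletion request touching sensitive data triggers the stricter tier.
console.log(requiredVerification({
  type: "delete",
  involvesSensitiveInfo: true,
  emailMatchesRecords: true,
})); // -> "more_stringent"
```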
The CPPA might not be saying anything particularly profound with this publication. But it’s called an “enforcement advisory”, so we may well see enforcement action in this area soon.
Google to delete ‘billions’ of private browsing records as part of ‘Incognito mode’ class action settlement
Google has agreed to delete billions of records reflecting the private browsing activity of around 136 million people to settle a class action lawsuit.
So Incognito mode isn’t private?
Until this complaint was filed in 2020, Incognito mode’s main privacy feature was that it didn’t save browsing history or form inputs.
In their initial complaint, the plaintiffs alleged that Google tracked users via tools such as Google Analytics and Ad Manager regardless of whether the users were browsing normally or “privately”. As such, the complaint argued that Google’s claims about Incognito mode were misleading.
To illustrate this point, the plaintiffs showed how Google’s tracking tools activated on the New York Times website even when the user visited the site via Incognito mode.
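The mechanics behind that demonstration are simple: private browsing changes what the browser stores locally, not what a page’s embedded scripts send over the network. The snippet below is an invented stand-in for an analytics tag (the endpoint and payload are hypothetical), and it fires identically in a normal window and an Incognito window.

```typescript
// Invented stand-in for an embedded analytics tag. The endpoint and payload
// are hypothetical; real tags (e.g. gtag.js) are far more complex. The point
// is that the request is sent whether or not the page was opened in a
// private/Incognito window, because private browsing only limits what the
// browser stores locally on the device.

function sendPageView(): void {
  const payload = {
    page: location.href,
    referrer: document.referrer,
    // Details like viewport size and language are available to scripts
    // in any browsing mode.
    viewport: `${window.innerWidth}x${window.innerHeight}`,
    language: navigator.language,
  };

  // `navigator.sendBeacon` (or `fetch`) behaves identically in Incognito.
  navigator.sendBeacon(
    "https://collector.example.com/pageview", // hypothetical endpoint
    JSON.stringify(payload),
  );
}

sendPageView();
```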
The plaintiffs claimed Google had violated several well-worn laws that frequently show up in privacy class actions:
Despite alleging that Google had misrepresented Incognito mode’s privacy features, the plaintiffs did not cite violations of consumer protection law.
How did Google respond to the allegations?
Google mounted a vigorous initial defense, denying the allegations and filing two (unsuccessful) motions to dismiss.
During a long and drawn-out discovery process, Google was ordered to pay nearly $1 million in fees for allegedly concealing important evidence: the “private browsing detection bits” used to “track a user’s decision to browse privately” and “label the data collected as private”.
The plaintiffs also secured emails showing how Google employees had long been concerned about the potentially misleading nature of Chrome’s Incognito mode “splash screen” (the text displayed when a user opens an Incognito window).
For example, in 2019, a Google employee suggested that the splash screen should inform users that Incognito mode does not prevent Google from tracking the user. Google has since implemented this change.
In 2020, while still fighting the case, Google also turned off third-party cookies in Incognito mode by default. This move means better (but not total) protection from Google’s and other companies’ tracking technologies.
What are the settlement terms?
The settlement acknowledges the changes Google has already made as a victory for the plaintiffs. Google also agreed to modify its explanation of Incognito mode even further.
Google will also:
Google is not required to pay damages to the 136 million people the plaintiffs represent.
However, the settlement says that “class members remain free to bring individual damages claims.” The Texas Attorney General is also pursuing a separate Incognito mode case. So even once this settlement is approved, Google’s Incognito issues will likely continue.
UK regulator announces focus on children’s online privacy
The UK Information Commissioner’s Office (ICO) has announced that it will prioritize children’s privacy throughout 2024-25, with a focus on social media and video-sharing platforms.
What’s the Children’s Code?
The Children’s Code is a UK GDPR code of practice developed by the ICO and adopted by Parliament. It sets out compliance requirements for online services “likely to be accessed by children”.
The Code covers issues such as age verification, age-appropriate design, and transparency for child users.
Has the ICO ever enforced under the Children’s Code?
The Children’s Code isn’t a law as such—it derives from the UK GDPR. However, the ICO says it has “audited” 11 companies and “assessed” 44 companies to determine whether they were meeting the Code’s standards.
The ICO also says the Code caused several big tech platforms to make voluntary changes, including:
How will the ICO focus on the Children’s Code in 2024-25?
The ICO says it will “take a closer look” at the following areas as they relate to the Children’s Code:
The ICO’s TikTok fine from last year is worth reading to understand how the ICO approaches enforcement in this area. However, that decision concerned a period between 2018 and 2020, before the Children’s Code was published.
The regulator has also announced preliminary enforcement action against Snap relating to children’s use of its AI features.
What We’re Reading