CPPA Advises on Data Minimization, Google Deletes Incognito Records, UK ICO Focuses on Children's Privacy

By Robert Bateman and Privado.ai

In this week’s Privacy Corner Newsletter:

  • California’s CPPA publishes an “enforcement advisory” about applying data minimization when processing consumer rights requests.
  • Google agrees to delete “billions of records” about people’s Incognito mode browsing activities.
  • The UK ICO announces a focus on children’s privacy over 2024-25.
  • What we’re reading: Recommended privacy content for the week.


California regulator publishes advisory on data minimization and consumer rights

The California Privacy Protection Agency (CPPA) has published an “enforcement advisory” about how the California Consumer Privacy Act’s (CCPA) data minimization principle applies to verifiable consumer requests.

  • The CPPA’s enforcement advisory is the first in a series intended to “share observations with the regulated community” and “encourage voluntary compliance” with the CCPA.
  • The agency says some businesses are requesting unnecessary and excessive personal information when verifying consumer rights requests.
  • The advisory sets out the relevant parts of the CCPA and the CCPA Regulations and applies them to hypothetical examples.

What’s an ‘enforcement advisory’?

According to the CPPA’s Enforcement Division, an enforcement advisory addresses selected provisions of the CCPA and the CCPA Regulations in response to compliance trends. It does not:

  • “Implement, interpret, or make specific” the law
  • “Establish substantive policy or rights, constitute legal advice, or reflect the views” of the CPPA’s board
  • Provide “any options for alternative relief or safe harbor from potential violations”

The CPPA’s Deputy Director of Enforcement says enforcement advisories are meant to encourage voluntary compliance, but “sometimes stronger medicine will be in order”.

So how does data minimization apply to consumer rights requests?

Data minimization underpins all CCPA-covered activities, requiring the “collection, use, retention, and sharing” of personal information to be “reasonably necessary and proportionate” to achieve a specific purpose. The CCPA Regulations expand on the principle and apply it in various contexts.

The CPPA’s advisory is a less formal look at the sorts of questions businesses should ask themselves when verifying a consumer rights request.

The central question is: What personal information, if any, should we collect to verify that the consumer is who they say they are?

What’s the answer?

The answer depends on the context.

For example, when processing a consumer’s request to opt out of the sale and sharing of their personal data, the advisory suggests that a business should ask itself the following questions:

  • What is the minimum amount of personal information necessary for our business to honor the opt-out request?
  • We already have certain personal information from this consumer. Do we need to ask for more personal information than we already have?
  • What are the possible negative impacts if we collect additional personal information?
  • Could we put in place additional safeguards to address the possible negative impacts?

Does the CPPA recommend a particular verification method?

The advisory says that businesses should “establish, document, and comply with” a “reasonable” verification method, but there are few hard-and-fast rules.

The CCPA Regulations are clearest regarding the processing of Opt-Out Preference Signals (OOPS): “A business shall not require a consumer to provide additional information beyond what is necessary to send the signal.”

But nuance arises in more complex scenarios.

For example, suppose a consumer makes their request via an email address that the business already associates with the consumer. If the request relates to low-risk personal information, matching this email address might be enough to satisfy the business that the request is genuine.

But email addresses can be spoofed. A “more stringent verification process” might be warranted for a request involving sensitive personal information. Deleting sensitive information following a fraudulent request could cause serious harm. Perhaps collecting government ID would be appropriate.

The CPPA might not be saying anything particularly profound with this publication. But it’s called an “enforcement advisory”, so we might see enforcement action in this area soon.


Google to delete ‘billions’ of private browsing records as part of ‘Incognito mode’ class action settlement

Google has agreed to delete billions of records reflecting the private browsing activity of around 136 million people to settle a class action lawsuit.

  • The case, Brown v Google, began with a court filing in 2020 alleging that Google mischaracterized Chrome’s Incognito mode and collected private browsing data via cookies.
  • Despite a vigorous initial defense of the case, Google has agreed to rewrite its disclosures to users and delete a broad range of supposedly private browsing data.
  • The settlement requires court approval before finalization and is slated for a Northern California District Court hearing in July.

So Incognito mode isn’t private?

Until this complaint was filed in 2020, Incognito mode’s main privacy feature was that it didn’t save browsing history or form inputs.

In their initial complaint, the plaintiffs alleged that Google tracked users via tools such as Google Analytics and Ad Manager regardless of whether the users were browsing normally or “privately”. As such, the complaint argued that Google’s claims about Incognito mode were misleading.

To illustrate this point, the plaintiffs showed how Google’s tracking tools activated on the New York Times website even when the user visited the site via Incognito mode.

The plaintiffs claimed Google had violated several well-worn laws that frequently show up in privacy class actions:

  • The federal Wiretap Act
  • The California Invasion of Privacy Act (CIPA) (another wiretapping law)
  • The torts of “invasion of privacy” and “intrusion upon seclusion”

Despite alleging that Google had misrepresented Incognito mode’s privacy features, the plaintiffs did not cite violations of consumer protection law.

How did Google respond to the allegations?

Google mounted an extremely vigorous initial defense, denying the allegations and filing two (unsuccessful) motions to dismiss.

During a long and drawn-out discovery process, Google was ordered to pay nearly $1 million in fees for allegedly concealing important evidence: the “private browsing detection bits” used to “track a user’s decision to browse privately” and “label the data collected as private”.

The plaintiffs also secured emails showing how Google employees had long been concerned about the potentially misleading nature of Chrome’s Incognito mode “splash screen” (the text displayed when a user opens an Incognito window).

For example, in 2019, a Google employee suggested that the splash screen should inform users that Incognito mode does not prevent Google from tracking the user. Google has since implemented this change.

In 2020, while still fighting the case, Google also turned off third-party cookies in Incognito mode by default. This change provides better (but not total) protection from Google’s and other companies’ tracking technologies.

What are the settlement terms?

The settlement acknowledges the changes Google has already made as a victory for the plaintiffs. Google also agreed to modify its explanation of Incognito mode even further.

Google will also:

  • Delete all private browsing data collected before December 2023 (when the splash screen text changed)
  • Stop tracking users’ Incognito mode status
  • Stop collecting browsing activity about people using Incognito mode

Google is not required to pay damages to the 136 million people the plaintiffs represent.

However, the settlement says that “class members remain free to bring individual damages claims.” A separate Incognito mode case is also ongoing with the Texas Attorney General. So even once this settlement is approved, Google’s Incognito issues will likely continue.


UK regulator announces focus on children’s online privacy

The UK Information Commissioner’s Office (ICO) has announced that it will prioritize children’s privacy throughout 2024-25, with a focus on social media and video-sharing platforms.

  • In 2021, the UK adopted the ICO’s Children’s Code, a GDPR code of practice setting compliance standards for online services and websites “likely to be accessed by children”.
  • Platforms such as Facebook, Instagram, and YouTube implemented child-focused changes in the run-up to the code’s publication.
  • The ICO says it will focus on Children’s Code compliance in areas such as default privacy and location settings, targeted advertising, and recommender systems.

What’s the Children’s Code?

The Children’s Code is a UK GDPR code of practice developed by the ICO and adopted by Parliament. It sets out compliance requirements for online services “likely to be accessed by children”.

The Code covers issues such as age verification, age-appropriate design, and transparency for child users.

Has the ICO ever enforced under the Children’s Code?

The Children’s Code isn’t a law as such—it derives from the UK GDPR. However, the ICO says it has “audited” 11 companies and “assessed” 44 companies to determine whether they were meeting the Code’s standards.

The ICO also says the Code caused several big tech platforms to make voluntary changes, including:

  • Facebook and Instagram started requiring a date of birth at signup and restricted how they target ads to users under 18.
  • Instagram launched some parental supervision tools.
  • Google turned off its “location history” feature and YouTube’s autoplay feature by default on children’s accounts.

How will the ICO focus on the Children’s Code in 2024-25?

The ICO says it will “take a closer look” at the following areas as they relate to the Children’s Code:

  • Default privacy and geolocation settings
  • Profiling children for targeted advertisements
  • Using children’s information in recommender systems
  • Using information about children under 13 years old

The ICO’s TikTok fine from last year is worth reading to understand how the ICO approaches enforcement in this area. However, that decision concerned a period between 2018 and 2020, before the Children’s Code was published.

The regulator has also announced preliminary enforcement action against Snap relating to children’s use of its AI features.


What We’re Reading
