California Enforces Privacy Rules, France Sets Priorities, FTC Warns on AI Training


By Robert Bateman and Privado.ai

In this week’s Privacy Corner Newsletter:

  • The CPRA Regulations are live as California regulator wins enforceability appeal.
  • The French DPA sets out its enforcement priorities for 2024.
  • The FTC warns companies that want to train AI models on their users’ data.
  • What we’re reading: Recommended privacy content for the week.


California privacy agency wins appeal: CPRA regulations are now enforceable

The California Privacy Protection Agency (CPPA) has successfully appealed against a decision by the Sacramento Superior Court to delay the enforcement of regulations promulgated under the California Privacy Rights Act (the “CPRA regulations”).

  • The CPPA defeated a claim by the California Chamber of Commerce (CalChamber) that sought to delay the enforcement of the CPRA regulations by 12 months.
  • Having won the case, the CPPA may now enforce the regulations that have already been finalized and approved across 12 areas of California Consumer Privacy Act (CCPA) compliance.
  • Draft regulations covering a further three areas will be enforceable immediately once finalized and approved.

What’s the background to this case?

The CPRA amended the CCPA in several important ways. For our purposes, the CPRA:

  • Created the first dedicated US privacy regulator, the CPPA, and
  • Required the CPPA to make regulations in 15 areas (the “CPRA regulations”).

The law said that the CPPA’s “timeline” for making these regulations was July 1, 2022, and that the regulations would be enforceable on July 1, 2023.

Did the CPPA make the regulations as required?

The CPPA has only finalized regulations covering 12 out of the 15 areas specified in the CPRA. And most importantly, they were approved nine months late—on March 29, 2023.

Once the regulations were approved, CalChamber sued the CPPA. The group pointed out that the CPRA set a deadline of July 2022 for regulations in all 15 areas and set an enforcement date exactly 12 months later.

CalChamber asked the court to delay the enforcement of the regulations until 12 months after regulations in all 15 areas had been approved. The group argued that businesses should not have less time to prepare just because the CPPA was so slow in making the rules.

? What did the trial court say?

At trial, in the decision that has now been overturned on appeal, the court granted a partial victory to CalChamber, holding that:

  • The CPPA could not enforce the regulations in the 12 areas that had already been finalized until March 29, 2024, 12 months after their approval.
  • The remaining regulations—covering three additional areas—would also have a 12-month grace period once finalized and approved.

If the July 2023 enforcement date stood, the trial court reasoned, the CPPA could pass regulations with instant or even retroactive effect, giving businesses no time to prepare.

What did the appeal court say?

The appeal court sided with the CPPA. As such:

  • The CPPA may now enforce the CPRA regulations that have been finalized and approved.
  • Once the remaining CPRA regulations have been approved, the CPPA will be able to enforce them right away, rather than having to wait 12 months.

The appeal court found that the trial judge had wrongly read the CPRA as guaranteeing a 12-month period between rulemaking and enforcement, rather than as setting two fixed dates that happened to fall 12 months apart. The appeal court also referenced the information provided to Californians when they voted on the CPRA.

So which regulations are now ‘live’?

The CPPA now has the power to enforce the finalized regulations, which cover areas such as privacy notices, the sale of personal information, consumer rights, and much, much more.

These regulations interpret the CCPA (as amended by the CPRA). So, if you’ve put time and effort into CCPA compliance, you should be pretty close to meeting the requirements of the CPPA’s regulations.

But remember—a violation of the regulations is a violation of the CCPA. And the regulations contain a lot more detail than the CCPA itself.

Three other sets of regulations—on risk assessments, cybersecurity audits, and automated decision-making technology—are still in draft but will be enforceable immediately once finalized and approved.


French regulator sets out 2024 GDPR investigation priorities

The French Data Protection Authority (DPA), known as the “CNIL”, has announced its “priority topics” for the coming year.

  • The CNIL is among the EU’s most active regulators and last month issued a €32 million fine against Amazon over the company’s employee monitoring practices.
  • The regulator’s investigative priorities in 2024 include children’s data, “electronic sales receipts and loyalty programs”, and the right of access.
  • The CNIL says it is also preparing to ensure high standards of data protection and security at the upcoming Olympic and Paralympic Games, due to be held in France this summer.

What do these priorities mean in practice?

The CNIL picks its priority topics each year and focuses on enforcement in the relevant areas over the following 12 months.

Last year, the CNIL conducted 340 investigations, down slightly from 345 in 2022. The regulator’s priority topics in 2023 were smart cameras, mobile apps, and bank and medical records.

The CNIL has issued some of the largest penalties under the GDPR and ePrivacy Directive.

The regulator is particularly active in the area of cookies, having issued large cookies-related fines against companies such as Google, Meta, Microsoft, and French adtech firm Criteo in recent years.

So what are this year’s priorities?

The CNIL’s priorities for 2024 will be:

  • Data collected from minors: “...the CNIL will be checking the applications and sites most popular with children and teenagers to see whether age control mechanisms have been implemented.” (A minimal sketch of such a check follows this list.)
  • Loyalty programs and electronic till receipts: “... the CNIL (is) taking an interest in the information shared with consumers and ensuring that consent is obtained before any data is re-used for advertising targeting purposes.”
  • The right of access: “...the CNIL and its counterparts (at the European Data Protection Board) will be carrying out checks on the conditions under which data controllers implement the right of access.”
  • The Olympic and Paralympic Games: “Checks will be carried out on the introduction of QR codes for restricted areas, access authorizations, and the use of augmented cameras.”
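
By way of illustration: an “age control mechanism” can be as simple as a declared date-of-birth gate. The sketch below (in Python) assumes a self-declared birth date and uses France’s “digital majority” of 15 for consenting to data processing; the CNIL’s announcement does not prescribe any particular implementation, and self-declared dates are easy to falsify, so treat this as a floor rather than a standard.

```python
from datetime import date

MINIMUM_AGE = 15  # France's "digital majority" for consenting to data processing

def is_old_enough(birth_date: date, today: date | None = None) -> bool:
    """Return True if a user with this (self-declared) birth date
    has reached MINIMUM_AGE as of `today`."""
    today = today or date.today()
    # Subtract a year if this year's birthday hasn't happened yet.
    had_birthday = (today.month, today.day) >= (birth_date.month, birth_date.day)
    age = today.year - birth_date.year - (0 if had_birthday else 1)
    return age >= MINIMUM_AGE

# Example: a user born March 1, 2010 is 13 on February 15, 2024.
assert not is_old_enough(date(2010, 3, 1), today=date(2024, 2, 15))
```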

The CNIL says that around 30% of its investigations will focus on these topics.



FTC: Want to train AI on user data? Changing your terms might be unlawful

The US Federal Trade Commission (FTC) has warned that companies seeking to train AI models on user data must not change their terms and conditions without providing proper notice or getting consent.

  • The FTC acknowledges that businesses seeking to develop machine-learning models may require large quantities of data.
  • The agency states that changing terms in a “retroactive, surreptitious” manner could be “unfair or deceptive” and violate the FTC Act.
  • The FTC cites cases from 2004 and 2023 in which it penalized companies that retroactively changed their terms and conditions or privacy policies in ways deemed unfair or deceptive.

There’s an AI training case from 2004?

The 2004 case referenced by the FTC involved Gateway Learning Corporation (GLC), which developed the “Hooked on Phonics” range of educational software.

This early example of FTC privacy enforcement did not involve AI training but a retroactive change to GLC’s privacy notice that purportedly allowed the company to share its users’ data with third parties.

So how is that illegal?

In GLC’s case, the violations were rather obvious:

  • GLC’s privacy notice explicitly said, “We do not sell, rent or loan any personally identifiable information regarding our consumers with any third party unless we receive customer’s explicit consent”.
  • The company then started “renting” information such as people’s “names, addresses, phone numbers… age ranges, and gender of their children” for marketing purposes without consent.
  • Two months later, GLC modified its privacy notice to say it would share personal information with “reputable companies” whose products or services consumers might find of interest… “from time to time”.

The FTC decided this was an unfair and deceptive practice under the FTC Act.

How does this relate to AI?

The same principle applies to AI training, the FTC says.

“Even though the technological landscape has changed… the facts remain the same: A business that collects user data based on one set of privacy commitments cannot then unilaterally renege on those commitments after collecting users’ data.”

In other words: Don’t say one thing and then do another. Significant changes to your terms of use or privacy notice might require consent.

What if we never told people we wouldn’t use their data to train our AI?

Good question.

Answer: “It depends” (sorry)—on which laws apply to you, what you said in your privacy notice, and what exactly “training your AI” means.

If you’re bound by the “purpose limitation” principle in laws like the GDPR or CCPA, you’ll likely need consent before using people’s information as training data.

If your privacy notice or terms and conditions provide an exhaustive list of the ways in which you use your customers’ data, you might need consent before expanding that list to AI training.

But “getting consent” doesn’t have to be prohibitively difficult. If some AI feature will truly benefit your users, perhaps many will want to opt in.

But tread carefully—and remember Zoom’s bad press last year.
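
To make the opt-in idea concrete, here is a minimal sketch (in Python) of consent-gated selection of training data. Everything in it is hypothetical, including the record fields and the terms-version check; it is one way to operationalize “don’t renege on the commitments under which the data was collected”, not a procedure the FTC prescribes.

```python
from dataclasses import dataclass

# Hypothetical: the first terms version that discloses AI training.
CURRENT_TERMS_VERSION = "2024-02"

@dataclass
class UserRecord:
    user_id: str
    text: str
    ai_training_opt_in: bool  # explicit opt-in, never a silent default
    terms_version: str        # the terms version this user actually agreed to

def eligible_for_training(record: UserRecord) -> bool:
    """Include a record only if its user opted in under terms that disclose AI training."""
    return record.ai_training_opt_in and record.terms_version == CURRENT_TERMS_VERSION

def build_training_corpus(records: list[UserRecord]) -> list[str]:
    # Data collected under earlier terms stays out rather than being swept
    # in retroactively, which is the practice the FTC warns against.
    return [r.text for r in records if eligible_for_training(r)]
```

The design point is that eligibility is tied to the terms version each user actually agreed to, so data collected under older terms never silently becomes training data.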


What We’re Reading
