Data privacy heating up! US states race for strongest laws, UK grapples with AI rights, and EU probes Meta's child safety.
Privacy Corner Newsletter: May 17, 2024
By Robert Bateman and Privado.ai
In this week’s Privacy Corner Newsletter:
Maryland enacted one of the strictest US privacy laws—then Vermont passed a bill that raises the bar even higher
In the past week, Maryland’s governor signed the Maryland Online Data Privacy Act (MODPA), and Vermont passed the comprehensive privacy bill H.121.
More state privacy laws? Are these two more interesting than the last fifteen?
Yes, these laws are a little different from what we’re used to.
Both Maryland and Vermont’s laws include tight rules on “data minimization”. They move away from the standard “Virginia-style” model—which generally allows controllers to process non-sensitive data by default—and towards something much more prescriptive.
Under Maryland’s law, a controller may not collect personal data except where “reasonably necessary and proportionate” to provide or maintain a product or service requested by the consumer. Processing personal data for further incompatible purposes requires consent.
Maryland applies the same rule to sensitive data, except that processing, collecting, or sharing sensitive data must be “strictly” (rather than “reasonably”) necessary to provide or maintain a product or service. And selling sensitive data is prohibited altogether.
Vermont expresses its data minimization rule somewhat differently, allowing controllers to process personal data:
Like most Virginia-style laws, Vermont only allows the processing of sensitive data with consent.
What else is interesting about these two laws?
Both laws include practically every obligation on the Virginia-style menu, including data protection assessments, the full range of consumer rights, and a requirement to honor Universal Opt-Out Mechanisms (UOOMs).
Vermont’s bill is particularly special, though, as it would create the only comprehensive privacy law outside California that empowers consumers to sue businesses that violate it.
Vermont’s new private right of action
H.121’s “private right of action” has changed substantially over a series of amendments.
This repeated redrafting reflects the serious consequences a poorly constructed private right of action can have. If it’s too broad, it encourages frivolous and disruptive litigation. Too narrow, and it becomes practically meaningless.
As it stands, the bill passes the buck to the Vermont Attorney General. By mid-January 2026, the AG must design “legislative language for implementing a private right of action” that focuses on the most serious harms and protects businesses acting in good faith.
Nonetheless, both Vermont’s and Maryland’s laws break the trend of copy/pasting Virginia’s privacy law, ensuring that 2024 is shaping up to be another very interesting year for state privacy law.
UK regulator opens consultation on data rights in generative AI (but dodges the hardest question)
The UK’s Information Commissioner’s Office (ICO) has published a “call for views” on “engineering individual rights into generative AI models”.
Does the ICO give its own views on generative AI and data subject rights?
The ICO provides some basic analysis of the challenges of meeting data subject rights obligations when developing and using generative AI. The following statements are among the more noteworthy:
The ICO also states that AI developers must be in a position to uphold the rights of access, deletion, and restriction of processing, and the right to object.
What about the right to rectification?
The right to rectification, i.e., the right to correct inaccurate personal data, is particularly challenging in this context.
Noyb’s complaint against OpenAI, covered in a previous edition of The Privacy Corner, suggests that this is one of the hardest aspects of the GDPR to reconcile with generative AI.
But this consultation “does not examine the right to rectification in detail,” the ICO says, as it “was examined in our previous consultation on accuracy.”
Refer back to that consultation, however, and there’s a similar message. “The analysis in this document does not cover the right to rectification,” the ICO said, promising to cover the topic in the next consultation.
So, it appears we might not get the regulator’s interpretation of this “hard GDPR question.” If you have views about it, you might wish to tell the ICO. This part of the consultation is open until June 10. The next part will examine controllership in the generative AI supply chain.
European Commission begins Digital Services Act investigation over Meta’s child safety and age verification practices
The European Commission has opened a formal investigation into Meta’s Facebook and Instagram platforms under the Digital Services Act (DSA).
Why does the Commission think Meta may have violated the DSA?
The investigation will explore three main issues:
These issues fall under Articles 28, 34, and 35 of the DSA.
Regarding the privacy-related matters, the Commission has not revealed details of the potential “privacy, safety, and security” and age-verification problems.
However, we do know how Meta verifies its users’ ages on Facebook and Instagram.
How does Meta verify its users’ ages on Facebook and Instagram?
Meta verifies users’ ages, in its own words, by “asking people for their birthdays.”
Providing a date of birth is one of several ways users can verify their ages on Meta’s platforms, along with providing a government-issued ID or a selfie. But presumably, most users choose to simply provide their date of birth.
Is this method good enough to satisfy the DSA’s requirement to protect children’s rights? The DSA does not specify how platforms should verify children’s ages, only that they should factor age verification in as part of a risk assessment.
Nor is it clear that more intrusive age verification solutions—such as collecting every user’s ID or using biometric age estimation—would comply with the GDPR’s “data minimization” principle.
While Meta is probably chiefly concerned with its user count, which would likely fall if every user had to show an ID, it’s fair to say that age verification is another “hard GDPR problem.”
What We’re Reading