State Privacy News - 7/26


Welcome to The Patchwork Dispatch, a fortnightly newsletter that brings you the top 5 recent developments in consumer privacy legislation, regulation, and enforcement from across the U.S. states. This is a big fortnight for our paper's Editor-in-Chief K. Lamont, who is taking leave; as such, the updates below do not reflect any state privacy developments that may have occurred after Wednesday afternoon, July 24.

1. No New California Privacy Rulemaking... Yet

On July 16, the California Privacy Protection Agency held a board meeting that covered a variety of topics. While the agenda teased "possible action" on proposed regulations regarding risk assessments, automated decisionmaking technology (ADMT) opt-outs, cybersecurity audits, and updates to existing rules, the board did not call a vote to advance the proposal to formal rulemaking. Nevertheless, Chair Urban and member Le both stated that it was time to receive public input on the proposed regulations and expressed hope that the necessary documentation will be in place to begin the formal rulemaking process at the board's upcoming meeting in September.

Debate during the meeting surfaced several lines of concern that are likely to feature prominently whenever formal rulemaking on the proposed regulations does commence. In particular, member Mactaggart questioned why the use of ADMT alone should trigger a risk assessment. He was also concerned about how ADMT opt-out rights would work at scale, arguing that the relevant definitions are so broad that they could capture essentially any use of software. Mactaggart keyed in on the proposed regulations' application to systems that implicate "access to" (rather than just the provision or denial of) important life opportunities. He also spoke favorably of the Colorado Privacy Act Regulations' 'tiers of human involvement' approach to regulating data profiling, in contrast to the proposed regulations, which provide an exception to opt-out rights only where a human appeal process is offered.

Prior to the meeting, the Agency also released preliminary estimates of the cost of the proposed regulations. In short, the assessment found that the proposed regulations would cover approximately 52,000 California businesses (25,000 for the cybersecurity audit requirements) and result in over $4 billion in direct costs during the first year of their adoption. Pointing to these anticipated costs, member Mactaggart raised the possibility of increasing the applicability thresholds for the ADMT and risk assessment requirements.

The public comment period also highlighted areas of concern that will likely feature during formal rulemaking on the proposed regulations. Multiple industry groups argued that prescriptive cybersecurity audit requirements would constitute an inappropriate "backdoor" to new substantive cybersecurity standards. With respect to the proposed rules on ADMT, a representative of the California Chamber of Commerce dialed in to note that both the governor and the state legislature are working on different aspects of artificial intelligence and encouraged the Agency to allow the democratic process to work through these issues from a variety of perspectives, not just a single lens [of privacy]. A member of the California Department of Insurance also discussed ongoing multi-state efforts toward a model insurance privacy law and expressed concern that proposed regulations on this topic could create confusion.

Perhaps the most notable participant in public comment was former California Senate Majority Leader Robert Hertzberg. He cautioned the board against moving so quickly on AI regulation, noting that the underlying law is about privacy and contains only a couple of lines about artificial intelligence, and expressing dismay that the CPPA now wants to become the 'AI Agency.' He further argued that moving forward on these regulations without direction from the governor or the legislature would risk undermining the Agency's institutional credibility.

2. California AADC Has a Tough Day in Court

On July 17, a Ninth Circuit panel heard oral arguments (video here) in NetChoice v. Bonta, a case regarding the constitutionality of the California Age-Appropriate Design Code (AADC), which a District Court previously enjoined. While the lower court found that essentially every affirmative obligation of the AADC is constitutionally suspect under the First Amendment, oral arguments before the Ninth Circuit focused on the AADC's data protection impact assessment (DPIA) requirements. In particular, the judges appeared skeptical of the State's argument that the AADC does not regulate content (and instead governs "data management practices"), particularly when it comes to the Act's requirement to create a timed plan to mitigate or eliminate the risk of exposing children to "harmful, or potentially harmful content" identified by a DPIA.

Furthermore, the judges asked multiple questions on severability, with one noting that much of the bill is "keyed" to the DPIA requirements. However, the judges generally seemed to think some parts of the California AADC do not raise First Amendment issues (such as the formation of a child data protection working group and restrictions on geolocation information) and explored how the Act would operate if only the DPIA provisions were struck down. It was also noted that severability is a question of state rather than federal law, and the idea of certifying the matter to the California Supreme Court was raised.

Finally, both NetChoice and the State received multiple questions about how the Supreme Court's recent decision in Moody v. NetChoice (issued after the District Court's injunction) should impact the case, given that Moody lays out considerations for courts analyzing facial challenges to laws (such as the challenge NetChoice has brought against the AADC).

3. My Health, My Data Act Near-Copycat in D.C.

Turning to the District of Columbia, on July 12 Attorney General Schwalb (via DC Council Chairman Mendelson) introduced the Consumer Health Information Privacy Protection Act of 2024 ("CHIPPA"). The proposal shares numerous features in common with the Washington State My Health, My Data Act (MHMDA) of 2023, including classic MHMDA elements such as a broad definition of "consumer health data"; a definition of "collect" that includes any processing; two different tiers of consent; and a 'backdoor' private right of action tied to the District's unfair and deceptive acts and practices law.

However, CHIPPA contains a significant divergence from MHMDA that arguably makes it a much more restrictive framework. While MHMDA provides that a business may collect health data if doing so is necessary to provide a requested product or service, CHIPPA provides that collection of health data is only permissible if an organization first obtains consent.

The DC Council is currently on recess and scheduled to return on September 16.

4. New York State Staffing Up for Child Privacy Rulemakings

The New York Attorney General's Office is hiring for two attorney roles (a special counsel and a regulatory project attorney) on 24-month terms; these hires will be charged with leading rulemakings under New York's two new child online safety laws, the New York Child Data Protection Act and the SAFE for Kids Act.

It is an encouraging sign to see New York hire staff for this work as the relevant laws provide for tremendously important and complex rulemakings:

  • The New York Child Data Protection Act provides for rulemaking to establish a new class of device signals, termed "user-provided age flags," that are intended to communicate on a default basis whether a user is a covered minor. Such signals may also be intended to communicate opt-in consent for processing on a default basis.
  • The SAFE for Kids Act charges the Attorney General with developing regulations on "commercially reasonable and technically feasible" methods of age verification as well as methods for obtaining "verifiable parental consent." Furthermore, the Act will not take effect until half a year after the regulations are finalized.

It has been said that personnel is policy. Applications are due August 16th.

5. Florida Initiates Social Media Rulemaking

Speaking of rulemaking for commercially reasonable age verification, on July 23 the Florida Department of Legal Affairs posted proposed rules to implement Florida's new law on social media use by minors (HB 3). The law (effective January 1, 2025) requires certain social media platforms to verify the age of all their account holders, to prohibit minors under age 14 from opening accounts, and to obtain parental consent for individuals aged 14 or 15 to hold social media accounts. The law also provides for the 'permanent deletion' of all personal data associated with existing accounts of minors.

The proposed rules define "commercially reasonable age verification," perhaps without adding a great deal of clarity to the term, as a "method of verifying age that is regularly used by the government or businesses for the purpose of age and identity verification." With respect to "reasonable parental verification," the draft is slightly more detailed, providing an example verification process in which a social media platform: (1) requests a parent's name, address, phone number, and e-mail address from a child; (2) contacts the parent identified by the child and requests documents or other information sufficient to evidence the relationship; and (3) utilizes any "commercially reasonable method" regularly used by the government or businesses to verify that parent's identity and age.

There are certain ambiguities in the proposed rules. As drafted, it is not clear whether both steps (2) and (3) are necessary to satisfy the parental verification requirement or whether either step alone would be sufficient. The relevance of verifying a parent's age as part of parental verification is also unclear and does not appear to be supported by the statute. Likewise, neither the Act nor the proposed regulations specify the timeframe within which permanent deletion of data associated with a terminated account must occur.

Stakeholders have 21 days from the date of publication to request a hearing on the proposed regulations.

As always, thanks for stopping by.


Keir Lamont is the Director for U.S. Legislation at the Future of Privacy Forum



Matthew R.

Director @ CIPL | Privacy, Data, and Technology Policy

4 months

Super helpful, as always. Thank you, Keir and team!


Washington's gift to the privacy world is writing laws that get traction elsewhere, even though it can't always pass them. Nice to see CHIPPA modeled after MHMD, even if only by another Washington.

Bailey Sanchez

Senior Counsel at Future of Privacy Forum | FPF US Legislation Team

4 months

"Speaking of rulemaking for commercially reasonable age verification" is an all-time great line written by the Dispatch staff

Ron De Jesus

The Industry’s 1st Field Chief Privacy Officer @ Transcend. Founder @ De Jesus Consulting. AIGP, FIP, CIPP/A/C/E/US, CIPM, CIPT, CDPSE, CISSP. Fusing compliance with creativity.

4 months

Love that you use "fortnightly" to describe the cadence of this U.S.-focused newsletter
