AI Regs on the Rise: CO Leads, EU Finalizes Law, & US APRA Shifts
Privacy Corner Newsletter: May 24, 2024
By Robert Bateman and Privado.ai
In this week’s Privacy Corner Newsletter:
An overview of Colorado's new AI law, one of the most significant outside the EU
Why the ICO has dropped its case against Snap but is 'making enquiries' about Microsoft's Recall
Some important changes in the latest draft of the American Privacy Rights Act (APRA)
Colorado’s General Assembly has passed SB 24-205, the Colorado Artificial Intelligence Act (CAIA), which regulates the use and development of “high-risk AI systems.”
? Why are you writing about Colorado’s new AI law? The Council of the EU just adopted the AI Act…
Each announcement about the EU AI Act’s progress arrives with more fanfare than the last. But The Privacy Corner Newsletter has covered a few, so now it’s Colorado’s turn.
Colorado’s new AI law is similar to the EU’s in some respects, but it is much shorter and simpler.
Like the EU AI Act, the CAIA focuses on “high-risk AI systems.” Colorado defines a high-risk AI system as one that makes, or makes a “substantial contribution” to, a “consequential decision” in the following areas:
The law requires developers and deployers (users) of high-risk AI systems to take “reasonable care” to avoid “algorithmic discrimination” based on age, ethnicity, language proficiency, and certain other characteristics.
Both developers and deployers are presumed to have taken reasonable care to avoid algorithmic discrimination if they comply with their obligations under the CAIA (this mechanism is known as a “rebuttable presumption”).
? What are developers’ obligations under the CAIA?
Developers of high-risk AI systems face many new obligations, mostly around providing technical documentation to the deployers using their products.
Here are some examples of the types of technical information developers must provide:
Developers also have to notify the Colorado Attorney General about known or reasonably foreseeable risks of algorithmic discrimination.
? What are deployers’ obligations?
Deployers of high-risk AI systems also have extensive obligations under the CAIA, including the following:
The CAIA also cross-references the Colorado Privacy Act, which includes relevant obligations around “profiling with legal or similarly significant effects.”
If signed by the state’s governor, the CAIA will take effect from February 2026.
The ICO has dropped its case against Snap but is ‘making enquiries’ about Microsoft’s Recall
The UK Information Commissioner’s Office (ICO) says it will be speaking with Microsoft about its new Recall feature after deciding not to proceed with enforcement against Snap’s “My AI” chatbot.
? What happened with Snap?
Snap rolled out My AI, a GPT-based chatbot with “additional safety features,” to all Snapchat users last April. The ICO began an investigation last June and issued a preliminary enforcement notice in October.
The ICO did not publish the preliminary enforcement notice, but it reportedly alleged that Snap had not properly considered data protection risks associated with generative AI, particularly for users aged between 13 and 17.
This week, the ICO announced that it would not pursue enforcement against Snap because the company had taken “significant steps to carry out a more thorough review of the risks” and “implemented appropriate mitigations.”
? What about Microsoft?
Microsoft announced Recall, a new AI feature for Copilot+ PCs (computers with special hardware capable of running resource-heavy AI applications).
Recall grabs an image of a user’s screen every few seconds and stores it in encrypted form on the device’s hard drive. The user can then scroll back “through time” and can search for things they did in the past.
Microsoft says it won’t use the images stored by Recall for any other purposes and will only make them available to the relevant logged-in Windows user. Users can pause recording or exclude certain apps. If the user happens to enjoy Edge browser, Recall won’t screenshot their private browsing sessions.
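Microsoft has not published Recall’s internals, but for readers who want a concrete feel for the pattern being described (periodic screen capture plus on-device encrypted storage), here is a minimal, purely illustrative Python sketch. The mss and cryptography packages, the storage location, the capture interval, and every function name are assumptions made for the example; none of this reflects Microsoft’s actual implementation, which also makes the snapshots searchable, something the sketch omits entirely.

```python
# Illustrative sketch only: a local "screenshot timeline" in the spirit of
# what Microsoft describes for Recall (periodic capture, encrypted storage
# on the user's own device). Library choices and names are assumptions.
import time
from datetime import datetime
from pathlib import Path

import mss                               # third-party: cross-platform screen capture
import mss.tools
from cryptography.fernet import Fernet   # third-party: symmetric encryption

STORE = Path.home() / ".screen_timeline"
STORE.mkdir(exist_ok=True)

# In a real product the key would live in a hardware-backed store; here it is
# simply kept alongside the images for the sake of a self-contained example.
key_file = STORE / "key.bin"
if not key_file.exists():
    key_file.write_bytes(Fernet.generate_key())
fernet = Fernet(key_file.read_bytes())


def capture_once() -> Path:
    """Grab the primary monitor, encrypt the PNG bytes, and write them to disk."""
    with mss.mss() as sct:
        shot = sct.grab(sct.monitors[1])             # primary monitor
        png = mss.tools.to_png(shot.rgb, shot.size)  # raw PNG bytes
    name = datetime.now().strftime("%Y%m%d-%H%M%S") + ".png.enc"
    path = STORE / name
    path.write_bytes(fernet.encrypt(png))            # encrypted at rest, on-device
    return path


def capture_loop(interval_seconds: int = 5) -> None:
    """Capture the screen every few seconds, as Recall is described as doing."""
    while True:
        capture_once()
        time.sleep(interval_seconds)


if __name__ == "__main__":
    capture_loop()
```

Even this toy version makes the privacy trade-off obvious: everything stays on the device, but anything that appears on screen, including passwords, messages, and health or financial information, ends up in the archive unless capture is paused or an app is excluded.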
? What’s the privacy issue?
An app that snaps an image of your screen every few seconds has potentially huge privacy implications. However, based on Microsoft’s reassurances, it appears that all the storing and processing of the images will occur on the user’s device.
Nonetheless, data protection experts have pointed out many possible risks the product poses (this article from Carey Lening provides an excellent run-down of the issues), including that Recall will be activated by default on certain PCs.
Following a BBC report describing Recall as a “privacy nightmare”, the ICO said it was “making enquiries with Microsoft to understand the safeguards in place to protect user privacy.”
Later, on X, the ICO linked its enquiries into Recall with its investigation into Snap’s AI product, describing the latter as a “warning shot for industry.” Note, however, that, as far as we know, Snap did not receive an actual warning, reprimand, or any other penalty.
Some important changes have been made to the American Privacy Rights Act draft
A new draft of the American Privacy Rights Act (APRA) was published ahead of a House Energy and Commerce Subcommittee hearing on Thursday.
? What’s changed about the APRA?
Here are a few of the many changes present in this new draft of the APRA.
First, there are a couple of clarifications on data minimization.
The original draft provided a broad prohibition on processing covered data with two main exceptions, one of which was about providing “reasonably anticipated” communications to the individual. The new version clarifies that such communications do not include ads.
The new draft also clears up an ambiguity about transferring sensitive covered data: even where a covered entity can rely on a permitted purpose to transfer such data, it must also get consent from the individual.
There’s a new (and long) “privacy by design” section that requires covered entities to implement “policies, procedures, and practices” for identifying and mitigating privacy risks.
The deadline for responding to individual privacy requests has been cut from 45 days to 30 days (for large data holders and data brokers, the timeline remains at 15 days).
On data brokers, the new draft includes a “Delete My Data” request mechanism alongside the original draft’s “Do Not Collect” process. This change addresses a key criticism from the California Privacy Protection Agency’s (CPPA) letter to the bill’s sponsors last month.
On pre-emption of other laws (the contentious issue that could end up killing this bill), there appears to be little change, apart from a slight broadening of how other federal laws take precedence over the APRA.
And for whatever reason, the Kids Online Safety Act (KOSA), a separate children’s online safety bill introduced last year, is now tacked onto the end of the APRA.
Because things weren’t complicated enough already.
What We’re Reading