AI Act Approved, Kentucky Gets Privacy Law, EU Commission Fined for Data Breach

By Robert Bateman and Privado.ai

In this week’s Privacy Corner Newsletter:

  • The European Parliament approves the EU AI Act: What’s next?
  • Kentucky passes a comprehensive privacy law.
  • The European Commission is found to have violated data protection law by using Microsoft 365.
  • What we’re reading: Recommended privacy content for the week.




European Parliament approves AI Act: What happens next?

The EU AI Act has cleared one of the final stages of the EU’s legislative process after Members of the European Parliament (MEPs) approved the text on Wednesday.

  • Since its proposal by the European Commission in 2021, the AI Act has been through several rounds of negotiations and amendments among the EU’s institutions.
  • The final text reflects a compromise package agreed between the European Parliament and Council late last year.
  • The text will now be translated into the EU’s official languages and must be formally approved by the Council. The law should appear in the Official Journal in May or June and will take effect 20 days later.

Didn’t you cover this before? Has the AI Act passed or not?

This is a highly significant law, so digital policy enthusiasts are watching closely. We can’t quite say that the law has “passed” yet.

The Parliament’s approval of the final text is arguably the most significant stop on the AI Act’s legislative journey, but it’s not the final one.

Here’s a timeline for what happens next.

  • In the coming weeks: The AI Act’s final text is approved by the Council and translated into the EU’s many official languages.
  • May or June 2024: The AI Act is published in the EU’s Official Journal.
  • Twenty days later: The AI Act enters into force.
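To make the timeline concrete, here is a small sketch of the phase-in arithmetic. The publication date is hypothetical (the actual Official Journal date was not fixed at the time of writing), and months are approximated as 30-day blocks purely for illustration — the Act itself counts calendar months.

```python
from datetime import date, timedelta

publication = date(2024, 6, 1)  # hypothetical OJ publication date
entry_into_force = publication + timedelta(days=20)  # "20 days later"

# Approximate phase-in points from entry into force
# (months treated as 30-day blocks for illustration only).
prohibitions_apply = entry_into_force + timedelta(days=6 * 30)    # 6 months
annex_iii_rules = entry_into_force + timedelta(days=24 * 30)      # 24 months
product_safety_rules = entry_into_force + timedelta(days=36 * 30) # 36 months
```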

What happens after the AI Act comes into force?

Within six months: Prohibited AI practices must end, including (among others):

  • The untargeted scraping of images for facial recognition databases
  • Using “subliminal techniques” to influence behavior in harmful ways
  • Exploiting people’s vulnerabilities

After 24 months: The rules on certain “high-risk AI systems” take effect.

  • The rules kick in first for the high-risk AI systems listed in Annex III of the AI Act.
  • Annex III lists AI systems used in contexts such as education, employment, and justice.
  • Providers of such high-risk AI systems must conduct a “conformity assessment” and develop detailed technical documentation, among many other obligations.

After 36 months: The rules on high-risk systems covered by existing EU “product safety” laws take effect.

  • Some high-risk AI systems are products—or form part of products—already regulated under certain EU product safety laws listed in Annex II of the AI Act.
  • These AI systems are used in products such as toys, aircraft, and medical devices, among many others.
  • “Product safety” high-risk AI systems will be subject to the same rigorous requirements as other high-risk AI systems.

Besides these four key dates, the AI Act includes many other deadlines relating to codes of conduct, guidelines, and market surveillance authorities.

Many of the rules won’t apply to AI systems already on the market or in use before the relevant deadlines unless there’s a substantial change to their design or use. Nonetheless, the next few years will be very busy for many companies using or developing AI systems.

Kentucky passes a comprehensive privacy law

Kentucky has passed a “Virginia-style” comprehensive privacy law, HB 15.

  • HB 15 awaits a signature from the state’s governor and should take effect in January 2026.
  • The law is very similar to the Virginia Consumer Data Protection Act (VCDPA) and several other state privacy laws.
  • Obligations under HB 15 include upholding consumer rights, maintaining contracts with service providers, and conducting data protection assessments.

How does HB 15 apply?

Kentucky’s new privacy law will apply to a company that conducts business in Kentucky or produces products or services targeted to Kentucky residents, and that during a calendar year either:

  • Controls or processes the personal data of 100,000 or more consumers in the state; or
  • Both controls or processes the personal data of 25,000 or more consumers and derives at least 50% of gross revenue from the sale of personal data.
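As a rough illustration (the function name and inputs are my own shorthand, not language from the statute), the applicability test above can be sketched as:

```python
def hb15_applies(does_business_in_ky: bool,
                 consumers_processed: int,
                 revenue_share_from_data_sales: float) -> bool:
    """Sketch of Kentucky HB 15's applicability thresholds.

    A business is covered if it conducts business in Kentucky (or targets
    products or services to Kentucky residents) and, during a calendar
    year, either:
      - controls or processes personal data of >= 100,000 consumers, or
      - controls or processes personal data of >= 25,000 consumers AND
        derives >= 50% of gross revenue from the sale of personal data.
    """
    if not does_business_in_ky:
        return False
    large_scale = consumers_processed >= 100_000
    data_seller = (consumers_processed >= 25_000
                   and revenue_share_from_data_sales >= 0.5)
    return large_scale or data_seller
```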

Does this sound familiar?

These are the exact same application thresholds as comprehensive privacy laws in Virginia, Iowa, and Indiana. Similar exemptions also apply for employment data, Health Insurance Portability and Accountability Act (HIPAA)-covered entities, and financial institutions.

What consumer rights does HB 15 provide?

Kentucky consumers will have the right to:

  • Verify and access their personal data
  • Correct any inaccuracies in their personal data
  • Delete their personal data
  • Obtain a portable copy of their personal data
  • Opt out of targeted advertising, the sale of personal data, and certain forms of profiling

There’s a 45-day deadline for responding to consumer rights requests, with one additional 45-day extension available, and the right to appeal to the State Attorney General.
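The response-window arithmetic works out as follows (the request date below is an arbitrary example, not from the law):

```python
from datetime import date, timedelta

RESPONSE_DAYS = 45   # initial response window for consumer rights requests
EXTENSION_DAYS = 45  # one additional extension available

request_received = date(2026, 2, 1)  # hypothetical request date
initial_deadline = request_received + timedelta(days=RESPONSE_DAYS)
extended_deadline = initial_deadline + timedelta(days=EXTENSION_DAYS)
```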

Guess what? This is also identical to Virginia.

Is anything about HB 15 not ‘identical to Virginia’?

Seemingly not. Post in the comments if you find anything.

Other Virginia-inspired provisions include:

  • Opt-in consent for processing sensitive data
  • Mandatory data processing agreements
  • Data protection assessments
  • No obligation to recognize Universal Opt-Out Mechanisms (UOOMs)
  • No “authorized agent” provisions
  • A 30-day cure period (with no sunset)
  • No private right of action
  • Principles of data minimization, purpose limitation, and reasonable security

If you’re compliant with Virginia’s VCDPA or another similar law, you shouldn’t have too much work to do to meet Kentucky’s requirements.

Nonetheless, Kentucky’s HB 15 is another straw on the back of the US privacy-professional camel. And don’t expect a federal privacy law to lighten the load any time soon.

European Commission sanctioned for using Microsoft 365

The European Data Protection Supervisor (EDPS) has found that the European Commission violated data protection law through its use of Microsoft 365.

  • The Commission’s data protection obligations arise under Regulation 2018/1725, which applies to all EU institutions, rather than the GDPR.
  • Most, but not all, of the Commission’s alleged Regulation 2018/1725 violations would also infringe the GDPR.
  • An investigation by the EDPS revealed issues with the Commission’s licence agreement with Microsoft, which allegedly failed to adequately restrict how Microsoft could use and transfer personal data.

Is using Microsoft 365… illegal?

Practically everyone uses Microsoft 365, and that’s not illegal per se. But the Commission appears to have failed on some pretty basic data protection compliance steps.

The investigation began in 2021, but the EDPS says some of the Commission’s violations are still ongoing. Other infringements stopped once Microsoft self-certified under the EU-US Data Privacy Framework (DPF).

According to a detailed press release, the EDPS found that the Commission violated various articles of Regulation 2018/1725 by failing to:

  • Properly define the types of personal data Microsoft processes and for what purposes.
  • Ensure that Microsoft processes data solely based on the Commission's documented instructions.
  • Assess whether any further processing of personal data is compatible with the purposes for which it was collected.
  • Assess the necessity and proportionality of transmitting data to Microsoft for a public interest purpose (this violation is not relevant to the GDPR).

What about data transfers?

Regulation 2018/1725 is slightly stricter than the GDPR on international data transfers. However, a lot of the Commission’s alleged failings would also have violated the GDPR.

The EDPS alleges that the Commission:

  • Failed to specify which types of personal data would be transferred, the recipients of the data, the relevant third countries, and the purposes for which data transfers could occur.
  • Implemented Standard Contractual Clauses (SCCs) to facilitate transfers to the US without conducting an adequate Transfer Impact Assessment (TIA) (this is no longer an issue since the DPF).
  • Failed to assess the risks associated with onward transfers from Microsoft Ireland to importers in other third countries.

As any data protection professional will know, negotiating a data processing agreement with a tech giant is hard work (if it’s possible at all). And conducting a TIA in the post-Schrems II, pre-DPF era was often a futile exercise.

But perhaps it’s not unreasonable to expect the Commission to uphold the laws it drafted.

What We’re Reading
