The EU clarifies AI Act rules, the UK demands encrypted iCloud access, and Amazon faces a lawsuit under Washington’s strict health data law

Privacy Corner Newsletter: February 13, 2025

By Robert Bateman and Privado.ai

In this edition of the Privacy Corner Newsletter:

  • The European Commission drops two new sets of AI Act guidelines
  • The UK government demands a secret backdoor into Apple’s encrypted iCloud
  • The first case is brought under Washington’s My Health My Data Act
  • What we’re reading: Recommended privacy content for the week




European Commission drops AI Act guidelines on AI systems and prohibited practices

The European Commission has published two sets of guidelines on the EU AI Act, covering the law’s definition of an “AI system” and its “prohibited practices”.

  • The “AI system” guidelines break the definition into seven key elements and highlight some types of software that are excluded from the AI Act’s scope.
  • The extensive “prohibited practices” guidelines explore the AI Act’s eight banned AI-related activities.
  • The guidelines do not have direct legal effect, but they provide some important insights into how the Commission interprets the AI Act.


AI system definition guidelines

The Commission's Guidelines on the definition of an artificial intelligence system are relatively modest in length and analyze the AI Act’s “AI system” via its seven key elements:

  • It’s a machine-based system
  • It operates with varying levels of autonomy
  • It may exhibit adaptiveness after deployment
  • It has explicit or implicit objectives
  • It infers from inputs how to generate outputs
  • It generates outputs such as predictions, content, recommendations, or decisions
  • It can influence physical or virtual environments

Perhaps the most interesting part of these guidelines provides the Commission’s view on what types of systems are not AI systems under this definition:

  • Mathematical optimization systems: Systems that improve optimization methods (e.g., linear or logistic regression) without exceeding basic data processing. Examples include machine learning models enhancing computational efficiency in physics-based simulations and satellite bandwidth allocation.
  • Basic data processing systems: Systems that follow explicit, predefined instructions without learning, reasoning, or modelling. Examples include database management, spreadsheets without AI features, and statistical analysis tools for summarising data without generating recommendations.
  • Classical heuristics-based systems: Systems that use rule-based problem-solving without adapting based on data. Examples include chess programs using minimax algorithms and heuristic evaluation functions.
  • Simple prediction systems: Systems relying on basic statistical methods to establish baseline predictions. Examples include stock price estimators using historical averages and static estimation systems predicting customer support response times or store demand.
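The “simple prediction” exclusion is easiest to see in code. The sketch below is a hypothetical illustration (the function name and figures are invented, not taken from the guidelines): it produces a forecast by applying one fixed statistical rule, a historical average, with no learning, adaptation, or inference, which is why systems like it would fall outside the “AI system” definition.

```python
# Hypothetical illustration of a "simple prediction system" of the kind the
# guidelines describe as excluded from the AI Act's "AI system" definition.
# It applies a fixed statistical rule and does not learn or adapt.

def baseline_forecast(history: list[float]) -> float:
    """Predict the next value as the mean of past observations."""
    if not history:
        raise ValueError("history must not be empty")
    return sum(history) / len(history)

# Example: estimating tomorrow's customer-support response time (minutes)
# from last week's observed times.
response_times = [12.0, 15.0, 11.0, 14.0, 13.0]
print(baseline_forecast(response_times))  # the mean of the observations
```

The key point is not the simplicity of the arithmetic but the absence of any mechanism that infers from inputs how to generate outputs.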


Prohibited AI practices guidelines

The Guidelines on prohibited artificial intelligence practices are less modest in length, coming in at 140 pages.

Here’s a summary to remind you of the prohibited practices under Article 5 of the AI Act:

  • Harmful manipulation and deception: Use of subliminal, manipulative, or deceptive techniques to distort behavior, causing or likely to cause significant harm.
  • Harmful exploitation of vulnerabilities: Exploiting vulnerabilities due to age, disability, or socio-economic conditions to distort behavior, causing or likely to cause significant harm.
  • Social scoring: Classifying individuals based on behavior or personal traits, leading to unjustified or disproportionate negative treatment.
  • Criminal offense risk prediction: Assessing or predicting crime risk based solely on profiling or personal characteristics, except when supporting human assessments with objective, verifiable facts.
  • Untargeted facial recognition data scraping: Collecting facial images from the internet or CCTV footage to build or expand recognition databases.
  • Emotion recognition: Detecting emotions in workplaces or educational institutions, except for medical or safety reasons.
  • Biometric categorisation: Inferring race, political opinions, trade union membership, religious beliefs, sex life, or sexual orientation from biometric data, except for lawful dataset labelling or filtering.
  • Real-time remote biometric identification: Performing biometric identification in public spaces for law enforcement, except for targeted victim searches, prevention of specific threats, or locating suspects of certain crimes.

Most of these prohibitions include caveats and limited exceptions.

Despite its length, much of the document re-states existing law and offers painstaking analyses of the meaning of particular words—with arguably limited impact on AI operators.

However, those involved in the advertising industry should read the Commission’s guidelines carefully, particularly where they discuss the scope of prohibited “emotional manipulation” techniques.


UK government demands access to encrypted Apple data

The UK has ordered Apple to provide a backdoor to encrypted iCloud data.

  • The UK government served Apple with a Technical Capability Notice under the Investigatory Powers Act 2016 in January, according to reports from the Washington Post and BBC.
  • The warrant would impact users of Apple’s Advanced Data Protection feature, which adds end-to-end encryption to iCloud accounts.
  • Warrants under the Investigatory Powers Act are confidential, and Apple has declined to officially comment on the government’s intervention.


Can the UK government actually do this?

The government reportedly issued a secret warrant to Apple in January under the UK’s Investigatory Powers Act 2016, a law sometimes disparagingly called the “Snooper’s Charter”.

The government ordered Apple to remove “electronic protection” to “allow access to data that is otherwise unavailable due to encryption” via a “Technical Capability Notice”.

Warrants under the act are confidential—neither the contents nor the mere existence of such a warrant may be disclosed by either the recipient or the government. The recipient may appeal to a tribunal but must not delay implementation of the order.


What happens if Apple complies?

The government reportedly demanded access to accounts secured via Apple’s “Advanced Data Protection” feature, which provides end-to-end encryption for iCloud data.

To comply with the order, Apple would need to implement a backdoor enabling the government to secretly view and copy people’s encrypted messages, photos, and notes. The order would impact all encrypted iCloud accounts globally.
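As a rough intuition for what is at stake, here is a toy sketch (not real cryptography, and not a description of how Apple’s Advanced Data Protection actually works): with end-to-end encryption, the decryption key exists only on the user’s devices, so the provider cannot read the stored data. A backdoor amounts to giving a third party a copy of that key, or some other way to recover the plaintext.

```python
# Toy illustration only: XOR with a random key, standing in for real
# end-to-end encryption. The point is that whoever lacks the key cannot
# recover the plaintext -- a "backdoor" means someone else gets the key.

import secrets

def encrypt(plaintext: bytes, key: bytes) -> bytes:
    """One-time-pad-style XOR; recoverable only with the exact key."""
    return bytes(p ^ k for p, k in zip(plaintext, key))

decrypt = encrypt  # XOR is its own inverse

user_key = secrets.token_bytes(16)     # held only on the user's devices
note = b"private cloud nb"             # 16-byte example plaintext
ciphertext = encrypt(note, user_key)

assert decrypt(ciphertext, user_key) == note                  # key holder reads it
assert decrypt(ciphertext, secrets.token_bytes(16)) != note   # anyone else cannot
```

Real systems use authenticated ciphers rather than XOR pads, but the asymmetry is the same: access depends entirely on who holds the key.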


What’s next?

The government is now consulting its Technical Advisory Board about whether to push ahead with the order, according to Computer Weekly.

Apple has previously refused to provide access to its users’ data, including in 2016 when the FBI attempted to force the company to assist it in unlocking iPhones.

Apple has (understandably) declined to comment about the UK government’s intervention.


It begins: First case brought under the Washington My Health My Data Act (MHMDA)

A class action against Amazon lodged in Washington is the first case to cite the state’s My Health My Data Act (MHMDA).

  • Maxwell v Amazon alleges violations of seven federal and Washington laws, including the state’s strict health privacy law, the Washington MHMDA.
  • The case alleges that Amazon failed to provide privacy disclosures or obtain consent before collecting precise geolocation data and Mobile Advertising IDs (MAIDs) via its Software Development Kit (SDK).
  • The plaintiffs argue that such data constitutes “consumer health data” under the Washington MHMDA.


What’s the background?

Washington’s MHMDA is among the strictest and most broadly applicable state privacy laws—despite ostensibly focusing on health privacy.

The allegation under Washington’s MHMDA is the fifth of seven causes of action against Amazon, alongside allegations involving wiretapping and consumer protection law.

The case concerns the Amazon Ads SDK and how it collects two types of information:

  • Time-stamped geolocation data
  • Mobile advertising IDs (MAIDs)
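To see why combining these two data types matters, here is a hypothetical sketch of the plaintiffs’ theory: a mobile advertising ID plus time-stamped geolocation can, together, suggest a visit to a health-related location. Every name, coordinate, and location in this sketch is invented for illustration and does not describe Amazon’s actual SDK.

```python
# Hypothetical sketch: a (MAID, timestamp, lat, lon) record checked against
# an invented list of sensitive locations. All values are made up.

import math

# Invented reference list of sensitive locations (lat, lon).
CLINICS = {"example-clinic": (47.6097, -122.3331)}

def distance_m(a: tuple, b: tuple) -> float:
    """Approximate distance in metres between two nearby lat/lon points."""
    lat = math.radians((a[0] + b[0]) / 2)
    dy = (a[0] - b[0]) * 111_320                   # metres per degree latitude
    dx = (a[1] - b[1]) * 111_320 * math.cos(lat)
    return math.hypot(dx, dy)

def looks_health_related(ping: dict, radius_m: float = 100.0) -> bool:
    """True if a location record falls near a known sensitive location."""
    return any(
        distance_m((ping["lat"], ping["lon"]), loc) <= radius_m
        for loc in CLINICS.values()
    )

ping = {"maid": "ab12-ffff", "ts": "2025-02-13T10:02:00Z",
        "lat": 47.6098, "lon": -122.3330}
print(looks_health_related(ping))  # the record falls within 100 m
```

The persistent MAID is what lets such inferences be tied to one device over time, which is why the plaintiffs treat the combination, rather than either data type alone, as potentially revealing health information.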


Is that… health data?

The plaintiffs argue that the Amazon SDK collects “information that could reasonably indicate a consumer’s attempt to acquire or receive health services or supplies,” which could meet the MHMDA’s definition of “consumer health data”.

The case alleges that Amazon did this without obtaining consent, as is required in many circumstances under the MHMDA.

The case also alleges that Amazon failed to make mandatory privacy notice disclosures under the MHMDA.


Will the case succeed?

While time-stamped geolocation data and MAIDs can meet the MHMDA’s “consumer health data” definition, the plaintiffs spend little time explaining how they do so when collected by Amazon. The MHMDA count also mentions “biometric information” without elaborating on how Amazon collects such data.

However, if it’s established that Amazon does collect consumer health data, the plaintiffs might be correct that the company should have obtained consent.

The main exception to the MHMDA’s consent rule applies when the entity provides a service explicitly requested by the user.

But as Felicity Slater from Hintze Law LLC points out—because Amazon’s SDK is integrated into third-party apps, the company might struggle to argue that its collection of the data was necessary to provide a service to an individual (the services instead being provided to app publishers and advertisers).

As noted, the MHMDA is broad and strictly drafted, so Amazon likely won’t be the last company to see litigation under the law’s private right of action.


What We’re Reading
