3 types of HRAIS, and "intended purpose".
This is number 4 of a series of bite-sized chunks on the AIA.
A previous edition of AI Legal explained that the AIA is essentially product safety legislation, and is best understood as part of the EU’s product safety regime.
That’s also true of High Risk AI Systems. The AIA provides for 3 different types of HRAIS.
The first is where the AI system is intended to be used as a safety component of a separate product which is itself covered by EU safety legislation (eg. lifts, medical devices, etc), and that separate product is sufficiently risky that it needs a third-party conformity assessment before it can be lawfully put on the market. (For lower-risk products, the manufacturer can self-certify the assessment. For higher-risk products, you need to get an independent assessment.)
The second is where the AI system is not just the safety component of a product covered by the EU safety legislation, but is itself one of those products (eg. lifts, medical devices, etc), and, like the first type, is also sufficiently risky that the EU safety regime requires a third-party conformity assessment before it can be lawfully put on the market.
The third type is the AI system that is most commonly talked about in connection with the AIA. It’s the type of AI that carries out the high-risk activities set out in Annex III of the AIA: processing of biometric data, critical infrastructure, and so on.
What’s different about the 3rd type is that, unlike most other product types envisaged, it requires the concept of “intended purpose”. For example, you don’t need a concept of intended purpose for a lift. A lift can only be used as a lift: you can’t do much else with it. The same is true for medical devices, cars, etc – ie. all the products which are presently covered by the EU safety regime.
But a software-enabled AI system is different: it’s much more versatile, and that’s why the Annex III definitions of high-risk (nearly) all make reference to intended purpose (for example: “AI systems intended to be used to evaluate learning outcomes…”), and all the compliance requirements in the AIA make reference to “intended purpose”.
“Intended purpose” even has its own definition: “the use for which an AI system is intended by the provider, including the specific context and conditions of use, as specified in the information supplied by the provider in the instructions for use, promotional or sales materials and statements, as well as in the technical documentation.”
In case you’re wondering, the phrase “intended purpose” occurs 50 times in the AIA!