The EU AI Act is here: ensuring AI system compliance with conformity assessment procedures
On 1 August 2024, the EU Artificial Intelligence (AI) Act (AI Act) entered into force. In previous issues of the AI Regulatory Update, we:
· Highlighted the AI Act’s risk-based classifications for AI systems and models.
· Provided an overview of relevant stakeholders and their potential obligations before and after placing an AI system or model on the EU market, and included an implementation timeline.
In this third issue introducing the AI Act, we take a closer look at the conformity assessment within the compliance stages of the AI lifecycle.
Ensuring AI system compliance via product safety measures
AI is not a new technology. Initial studies on AI stemmed from concepts that gained prominence between the late 1930s and the early 1950s, but the field has only recently gained practical application and traction with large language models.
Likewise, the EU’s approach to regulating AI systems and models has roots in an earlier regulatory framework. While the AI Act may be new, at its core it is the extension of a proven model, the New Legislative Framework (NLF).[1] Adopted in 2008, the NLF establishes harmonized and state-of-the-art regulation of product safety in the EU. With the AI Act, the EU extends this product safety regime to AI. Because the AI Act does not regulate AI itself but AI systems and general-purpose AI models (GPAIs), which are often part of another product, like an automobile or a health care device, different product regulation standards (i.e., Conformité Européenne (CE) standards) need to interact and be harmonized under the NLF.
The AI Act conformity assessment procedure for high-risk AI systems
Providers of high-risk AI systems must “ensure that the high-risk AI system undergoes the relevant conformity assessment procedure as referred to in Article 43, prior to it being placed on the market or put into service” (Article 16 (f) AI Act).
Such providers may be subject to one of two conformity assessments:
· Self-assessment: the conformity assessment procedure based on internal control (Annex VI)
· Third-party assessment: the conformity assessment procedure based on an assessment of the quality management system and an assessment of the technical documentation (Annex VII)
In both cases, the substantive requirements of the conformity assessment procedure are the provider’s quality management system (Article 17 AI Act) and technical documentation (Article 11 AI Act), meaning that the provider has to demonstrate compliance for the individual high-risk AI system, in particular with Chapter III, Section 2 (Articles 8 to 15 AI Act).
Depending on the classification of the high-risk AI system, Article 6 (2) and Article 43 AI Act determine which conformity assessment procedure applies.
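For illustration only, the routing logic of Article 43 (1) to (3) AI Act can be sketched as a small decision function. This is a minimal sketch in Python with hypothetical, simplified parameters (annex_iii_point, covered_by_annex_i, harmonised_standards_applied); the legal analysis of an individual high-risk AI system cannot be reduced to three flags:

```python
def conformity_procedure(annex_iii_point: int | None,
                         covered_by_annex_i: bool,
                         harmonised_standards_applied: bool) -> str:
    """Simplified, illustrative routing per Article 43 (1)-(3) AI Act.

    All parameters are hypothetical simplifications; the legal test
    requires a case-by-case assessment of the individual system.
    """
    if covered_by_annex_i:
        # Article 43 (3): high-risk systems covered by the Annex I
        # sectoral product legislation follow the conformity
        # assessment procedure of that legislation.
        return "sectoral procedure under Annex I legislation"
    if annex_iii_point is None:
        raise ValueError("not a high-risk AI system in this sketch")
    if annex_iii_point == 1:
        if not harmonised_standards_applied:
            # Article 43 (1): without (fully) applied harmonised
            # standards, the third-party assessment is mandatory.
            return "third-party assessment (Annex VII)"
        # Article 43 (1): otherwise the provider may choose.
        return "provider's choice: Annex VI or Annex VII"
    # Article 43 (2): Annex III points 2 to 8 -> internal control.
    return "self-assessment (Annex VI)"
```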
If the provider makes any substantial modification to the high-risk AI system after the initial conformity assessment, this system must undergo a new conformity assessment procedure (Article 43 (4) AI Act). This does not apply to continuously learning high-risk AI systems after being placed on the market or put into service, if “changes to the high-risk AI system and its performance have been pre-determined by the provider at the moment of the initial conformity assessment and are part of the information contained in the technical documentation referred to in point 2(f) of Annex IV.”
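In simplified form, the re-assessment trigger of Article 43 (4) AI Act reduces to a single conditional. The sketch below uses hypothetical flags and deliberately ignores the legal nuance of what counts as a “substantial” modification:

```python
def needs_new_conformity_assessment(substantially_modified: bool,
                                    change_predetermined_in_docs: bool) -> bool:
    """Illustrative reading of Article 43 (4) AI Act: a substantial
    modification triggers a new conformity assessment, unless the
    change (e.g., from continuous learning) was pre-determined at the
    initial assessment and documented per point 2(f) of Annex IV."""
    return substantially_modified and not change_predetermined_in_docs
```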
The conformity assessment procedure along the AI value chain
Other AI Act stakeholders, in particular the importer and the distributor, do not have to conduct the conformity assessment for the AI product they import or distribute. They are only prohibited from placing a high-risk AI system on the market or putting it into service if it has not undergone the provider’s conformity assessment procedure.
However, this can change should a distributor, importer, deployer or other third party become the provider. Article 25 AI Act stipulates that any distributor, importer, deployer or other third party is considered to be the provider of a high-risk AI system if it does one of the following:
· It puts its name or trademark on a high-risk AI system already placed on the market or put into service.
· It makes a substantial modification to a high-risk AI system.
· It modifies the intended purpose of an AI system, including a GPAI, that has not been classified as high-risk and has already been placed on the market or put into service in such a way that the AI system becomes a high-risk AI system in accordance with Article 6.
Whether the distributor, importer, deployer or other third party actually makes the AI system available does not matter for its provider status, which is presumed (Article 25 (1) AI Act). It is noteworthy that the distributor, importer, deployer or other third party does not become an additional provider alongside the original provider, but actually replaces the latter as the primary responsible party. Any distributor, importer, deployer or other third party must therefore be particularly careful when negotiating contractual relationships with the actual provider of an AI system.
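For illustration, the three triggers of Article 25 (1) AI Act can be expressed as a simple predicate. The flags below are hypothetical shorthand for the legal criteria listed above, not a substitute for them:

```python
def becomes_provider(rebrands_system: bool,
                     substantially_modifies_system: bool,
                     repurposes_into_high_risk: bool) -> bool:
    """Illustrative sketch of Article 25 (1) AI Act: a distributor,
    importer, deployer or other third party is deemed the provider of
    a high-risk AI system if any one trigger applies, replacing the
    original provider as the primary responsible party."""
    return (rebrands_system
            or substantially_modifies_system
            or repurposes_into_high_risk)
```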
Demonstrating conformity through the EU declaration, CE marking and registration
With the successful completion of the conformity assessment procedure, the provider has to draw up “a written machine readable, physical or electronically signed EU declaration of conformity for each high-risk AI system” (Articles 16 (g), 47 (1) AI Act) and affix the CE marking to the high-risk AI system or, where that is not possible, on its packaging or its accompanying documentation (Articles 16 (h), 48 (1) AI Act). For high-risk AI systems provided only digitally, a digital CE marking can be used under certain conditions according to Article 48 (2) AI Act. In this context, consider the following questions for the individual high-risk AI system:
· What are the required size and dimensions for the CE marking?
· Where should it be positioned on a website or in the system?
· Is it mandatory for the marking to be directly within the system, or can displaying it on the website near the download suffice?
· In terms of the system’s front end, would it be adequate for the CE marking to appear on a secondary page accessible to the user after a few clicks, or should it always be in plain sight, perhaps within a footer or banner?
Finally, before putting the high-risk AI system on the market or into service, the provider (or its authorized representative) and the system need to be registered in the EU database according to Articles 49 (1), 71 AI Act. Exempt from that registration requirement are AI systems intended to be used as safety components in the management and operation of critical digital infrastructure, road traffic, or the supply of water, gas, heating or electricity (Article 6 (2), Annex III No. 2 AI Act), as they have different registration requirements under the EU critical infrastructure regulations.
[1] “New legislative framework,” European Commission website, https://single-market-economy.ec.europa.eu/single-market/goods/new-legislative-framework_en
This publication contains information in summary form and is therefore intended for general guidance only. It is not intended to be a substitute for detailed research or the exercise of professional judgment. Member firms of the global EY organization cannot accept responsibility for loss to any person relying on this article.