AI literacy – knowledge is power
One of the first principles and obligations of the EU AI Act (AI Act) to become applicable on 2 February 2025 is the principle of artificial intelligence (AI) literacy, set out in Article 4 of the Act.
The AI literacy principle requires providers and deployers to take measures to ensure their staff and other persons dealing with the operation and use of AI systems have sufficient skills, knowledge and understanding to make informed use of AI systems. Providers, deployers and affected persons should have or gain awareness of the opportunities and risks of AI, including the possible harm it can cause.
How can people achieve AI literacy?
What qualifies as being AI literate (i.e., having the required level of AI literacy) will vary depending on the context. It can include understanding the correct application of technical elements during the AI system’s development phase, the measures to be applied during its use, the suitable ways in which to interpret the AI system’s output and, in the case of affected persons, the knowledge necessary to understand how decisions made with the assistance of AI may affect them.
Not every person or group involved requires the same level of AI literacy. The necessary level depends on technical knowledge, experience, education and training, the context in which the AI system will be used, and the people or groups who will use it. IT staff, for example, will require a higher level of AI literacy than sales staff. IT staff should have a thorough understanding of how an AI system works technically, whereas sales staff who market the AI system should understand how it works well enough to explain the risks and opportunities to interested persons who would be affected by its output. Similarly, for affected persons, the amount of information and clarification necessary to achieve AI literacy will vary.
AI literacy should evolve over time. In this regard, the European Artificial Intelligence Board will support the European Commission in promoting AI literacy tools, as well as public awareness and understanding of the benefits, risks, safeguards, rights and obligations relating to the use of AI systems. In addition, relevant stakeholders, including the Commission and the Member States, should facilitate the drafting of voluntary codes of conduct to advance AI literacy.
Are there sanctions?
There are no specific fines for failure to achieve AI literacy. However, because it is a general principle, compliance or noncompliance with it will affect compliance with the other obligations of the AI Act. For example, providers and deployers need to understand the line between prohibited AI practices and high-risk AI systems. Likewise, AI literacy is essential to providing effective human oversight of high-risk AI systems. An insufficient level of AI literacy within the relevant teams could therefore lead to violations of other AI Act obligations and, subsequently, to fines.
What now?
The obligation to ensure AI literacy becomes applicable on 2 February 2025. This does not mean that providers and deployers should wait until then to start pursuing AI literacy. On the contrary, companies currently working on their AI strategy and AI Act compliance, and in particular companies that have not yet started, should begin establishing proper AI literacy now. Considering that the prohibition of certain AI practices also becomes applicable on 2 February 2025, companies should promote an understanding of how these practices relate to the AI systems used within their organizations, raising the level of AI literacy alongside conversations around compliance.
First, companies will have to determine the current level of AI literacy among their people. Companies should then establish training and education tailored to their specific context. Next, relevant procedures and policies should be drafted to support a lasting framework for ongoing compliance with AI literacy. Finally, companies should establish monitoring to help ensure that staff and others dealing with AI systems maintain the required level of AI literacy for their roles (e.g., through regular training) and should maintain documentation to demonstrate compliance with this principle.
Many thanks to Kelly Matthyssens, Senior Manager, EY Law Belgium, for her insights in this article.
The views reflected in this article are the views of the author and do not necessarily reflect the views of the global EY organization or its member firms.