AI Act: Why Responsible Use of Generative AI Is Important – and How You Ensure It

In this month’s newsletter, we focus on the importance of responsible use of AI – and how you ensure it.


So, read along as we cover:


  • Stats showing why irresponsible use of AI can affect your business and bottom line
  • The deadlines for implementation and compliance you should have in mind when using generative AI
  • How the AI Act regulates generative AI
  • Information you and your employees shouldn’t share with AI
  • Words of wisdom from our AI expert Frederik Them Pedersen on how to ensure awareness among your employees
  • How you can document your responsible use of generative AI in a structured, automated manner with our AI compliance solution


Happy reading!


Did you know that...

  • 76% of consumers have concerns about misinformation from artificial intelligence such as Google Bard, ChatGPT, and Bing Chat?
  • 70% of consumers are either very concerned or somewhat concerned about businesses’ use of AI tools?

And it looks like many companies need to address these concerns. At a recent webinar in our AI Act series, 52% of the audience said they use generative AI.

Responsible use of generative AI is not ‘only’ about how you comply, but also when. So, let’s get the AI Act’s deadlines for compliance and documentation in place.


Generative AI compliance: Deadlines you should have in mind


Here are the deadlines for compliance and documentation on generative AI:

August 2, 2025, for General-Purpose AI (GPAI)


This deadline affects the use of generative AI, as GPAI models are the engine behind generative AI tools such as ChatGPT and Microsoft Copilot.

August 2, 2026, for high-risk AI systems (under Annex III) and the rest of the AI Act.

This deadline includes transparency requirements in Article 50. Speaking of requirements:


Let’s take you through the requirements you must comply with as a provider of generative AI.


How the AI Act regulates generative AI

When you provide or use generative AI, you should be aware of three requirements in the AI Act:

Transparency, documentation, and notification.

1. Transparency

Article 50 contains obligations requiring you to tell people that they’re interacting with AI. This is especially relevant if, for example, you use AI for a customer chatbot on your website.

You should also mark the output as artificially generated, especially if you use AI to create content such as text, images, or video.

Finally, you must inform people about an AI system used for emotion recognition. This would likely make the AI system high-risk, triggering the AI Act's high-risk obligations. You should also consider Article 22 of the GDPR in this case.
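As a minimal sketch of what these transparency measures can look like in practice (the function names, disclosure text, and label wording below are our own assumptions, not prescribed by the AI Act):

```python
# Illustrative Article 50-style transparency helpers for a customer chatbot.
# The exact wording of disclosures and labels is your choice; these are examples.

AI_DISCLOSURE = "You are chatting with an AI assistant, not a human."
AI_CONTENT_LABEL = "[AI-generated content]"

def start_chat_session() -> str:
    """Open every chat with a message so users know they're interacting with AI."""
    return AI_DISCLOSURE

def label_generated_content(text: str) -> str:
    """Mark model output as artificially generated before publishing it."""
    return f"{AI_CONTENT_LABEL} {text}"
```

A real deployment would surface the disclosure in the chat UI itself and attach the label as metadata (or a visible watermark) on published text, images, or video.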

2. Documentation

The AI Act distinguishes between providers of general-purpose AI models and providers of general-purpose AI models with systemic risk.

If you provide generative AI without systemic risk, you must prepare technical documentation on modality, computational power, security, etc.

You must also implement a policy to ensure compliance with copyright and related rights.

Finally, you must publish a summary of the content you use to train the AI model.

3. Notification

As a provider of an AI model with systemic risk, you are also subject to notification requirements. You must:

  • Notify the European Commission about your AI model.
  • Provide documentation of your evaluation strategies, etc.


What you should (and shouldn’t) share with generative AI

There are two reasons why you should be careful with the information you share with generative AI:

  1. The data you put into generative AI is shared with the party hosting the system.
  2. Generative AI will typically use your input to train the underlying model.


Thus, you shouldn't share:

  • Customer information
  • Trade secrets
  • Intellectual property-protected material

However, you can, in some cases, configure generative AI so that it doesn't use your input for training.
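One practical safeguard is to pre-filter prompts before they leave your systems. The sketch below strips obvious personal data (email addresses and phone-like numbers) from a prompt; the regex patterns and function name are illustrative assumptions, and a real deployment would rely on far more robust detection, such as a dedicated DLP tool:

```python
import re

# Illustrative pre-filter: redact obvious personal data from a prompt before
# sending it to a third-party generative AI service. These simple patterns
# will miss many cases; treat this as a sketch, not a complete solution.

EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")
PHONE = re.compile(r"\+?\d[\d \-]{7,}\d")

def redact(prompt: str) -> str:
    """Replace email addresses and phone-like numbers with placeholders."""
    prompt = EMAIL.sub("[REDACTED EMAIL]", prompt)
    return PHONE.sub("[REDACTED PHONE]", prompt)
```

For example, `redact("Mail jane@corp.com or call +45 12 34 56 78")` returns the prompt with both the email address and the number replaced by placeholders.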

Of course, it’s important that you also ensure awareness among your employees on how they use and share information with AI.



Structure and automate your AI compliance and documentation with our solution

Due to customer demand, we've now expanded our AI Compliance Solution with more features and documents, meaning that you can:


  • Easily identify and classify your AI systems to ensure compliance with the EU's AI Act.
  • Assess and mitigate AI-specific risks to protect your organization and ensure compliance with both the AI Act and GDPR, guided by our legal experts.
  • Automatically generate all necessary AI compliance documents and documentation, including risk assessments and policies, with ComplyCloud’s document generator.
  • Gather everything in a trust center to show on your website how your company addresses the AI Act requirements.

Get off to a good start with your compliance work and enjoy a free 14-day trial of our AI Compliance solution. Start your free trial now.
