Navigating the Urgent and Significant New Regulatory Landscape of Digital Operational Resilience and AI Governance
Introduction
The financial services industry has been transformed by new technologies, and this digital evolution has deepened reliance on IT systems while introducing new risks. In response, the European Union has introduced the Digital Operational Resilience Act (DORA) and the AI Act, regulations designed to strengthen the resilience and integrity of digital and AI systems within financial services.
DORA and the AI Act: An Overview
DORA and the AI Act establish new requirements for digital operational resilience and the ethical use of AI in the financial sector. Both demand robust governance structures and comprehensive risk management frameworks addressing the principal ICT and AI-related risks financial institutions face. Institutions must comply with new obligations for managing ICT risks, incidents, and third-party AI providers, and both regulations require prompt incident reporting to ensure transparency and timely mitigation.
Critical Regulatory Articles, Lex Specialis, and NSA Designations
The AI Act contains several critical provisions that impact DORA’s implementation and highlight the interplay between the two frameworks. These provisions, together with the concept of Lex Specialis and the designation of national supervisory authorities (NSAs), form the cornerstone of the new regulatory landscape.
The AI Act will serve as Lex Specialis, taking precedence over DORA in scenarios where its AI-specific governance and risk management requirements are more detailed. In particular, the AI Act’s provisions for high-risk AI systems and AI-specific incident reporting will prevail over DORA’s general rules, ensuring that the most pertinent and rigorous standards apply to the governance and management of high-risk AI systems.
These provisions collectively ensure that the AI Act’s stringent standards are integrated into the broader ICT governance frameworks established by DORA. This integration is vital for maintaining a cohesive and resilient digital operational landscape within the financial sector.
Overlaps and Practical Considerations
Conclusion
Financial institutions stand at a pivotal juncture: aligning with DORA and the AI Act is not only a regulatory requirement but a strategic imperative for long-term resilience and competitiveness. By conducting comprehensive risk assessments that cover both ICT and AI-related threats, institutions can pre-empt potential disruptions and safeguard their operations. Robust incident response plans tailored to both ICT and AI incidents enable swift, effective responses to breaches, minimising their impact. Integrating governance frameworks that combine ICT resilience with AI ethical standards will create a solid foundation for navigating the complexities of digital transformation. This alignment will protect against diverse risks, enhance stakeholder trust, and pave the way for sustainable growth in the digital age.
Article by Dr Ian Gauci
Dr Gauci is the Managing Partner of GTG, a technology-focused corporate and commercial law firm that has been at the forefront of major developments in fintech, cybersecurity, telecommunications, and technology-related legislation.
Disclaimer: This article is not intended to impart legal advice and readers are asked to seek verification of statements made before acting on them.