What Boards Need to Know About AI Regulation in Luxembourg

Denis de Montigny PhD, CFA


This guide summarizes the EU AI Act, DORA, MiCA, and the CSSF/BCL AI Thematic Review — the core regulatory and supervisory developments relevant to AI, digital resilience, and crypto-asset governance in Luxembourg. It highlights what boards need to address to strengthen oversight and meet supervisory expectations.


Core Table: Key Elements at a Glance

Framework | Primary Focus | Board-Relevant Obligation | Timeline
EU AI Act | Risk-based regulation of AI systems | Ensure proper AI risk classification, human oversight, and compliance with high-risk AI controls | Full application August 2026 (phased obligations from February 2025)
DORA | Digital operational resilience in financial services | Oversee ICT risk governance, incident response, third-party risk, resilience testing | January 2025
MiCA | Regulation of crypto-assets and service providers | Supervise compliance for crypto services, governance standards, reserve requirements (if applicable) | June 2024 (stablecoin titles); December 2024 (full application)
CSSF/BCL AI Thematic Review | Survey of AI use and governance maturity | Respond to gaps in AI strategy, oversight, classification, and monitoring highlighted by supervisors | Report published May 2025

Summary

EU AI Act
  • Introduces a tiered risk-based framework: prohibited AI practices (Article 5), high-risk AI systems (Article 6 and Annex III), and AI systems subject to specific transparency obligations (Article 50). Systems falling outside these categories are not subject to specific obligations under the Act.
  • High-risk AI systems require conformity assessment (Article 43), technical documentation (Article 11), human oversight (Article 14), and robustness controls (Article 15).
  • Transparency obligations apply to certain AI systems that interact with humans, detect emotions, or generate content (Article 50).
  • General-purpose AI models (including large language models) are subject to transparency obligations and, for models posing systemic risk, additional risk management obligations under Chapter V (Articles 51–56).
  • Boards must oversee that AI systems, especially high-risk ones (e.g. credit scoring, AML tools), are correctly classified, documented, and governed in line with the Act’s requirements, including ensuring appropriate human oversight and transparency where mandated.
DORA
  • Mandatory ICT risk management framework (Articles 5–10), including governance, identification, and protection measures
  • Harmonised incident reporting to regulators (Articles 17–20)
  • Strict requirements on ICT third-party risk management (Articles 28–30), including contractual controls and oversight
  • Obligations for digital operational resilience testing (Articles 24–26). Threat-led penetration testing is required only for entities deemed significant, critical, or otherwise designated (Article 26).
  • Boards must ensure that ICT risk governance, incident reporting, third-party oversight, and resilience testing are fully embedded at the entity level, not left to group functions.
MiCA
  • Aims to regulate tokens not already covered by other EU law (Recital 11)
  • Authorisation regimes for issuers of asset-referenced tokens and e-money tokens (Titles III and IV)
  • Obligations for crypto-asset service providers on governance, conduct, and capital requirements (Title V)
  • Reserve requirements for stablecoins (asset-referenced tokens: Article 35; e-money tokens: Article 49)
  • Consumer protection, disclosure, and marketing standards (Articles 22–24, 46–48)
  • Where relevant, boards must oversee compliance with MiCA governance, conduct, and reserve requirements, ensuring crypto risks are integrated into risk management frameworks.
CSSF/BCL AI Thematic Review
  • AI adoption is increasing, but formal board-approved AI strategies are rare
  • Generative AI use has outpaced governance frameworks
  • Many institutions misclassify AI risk levels under the EU AI Act
  • AI governance maturity is uneven: data quality, bias controls, security, and auditability are often underdeveloped
  • The review highlights areas where supervisors will expect boards to strengthen oversight.

What This Means for Boards

The EU AI Act, DORA, MiCA, and the CSSF/BCL AI Thematic Review require boards to provide structured, documented oversight of AI, digital resilience, and crypto governance. This oversight must align with specific regulatory expectations, including EU AI Act Articles 9 and 14 (risk management and human oversight), DORA Articles 5–10 (ICT governance), and MiCA Titles III and V (issuer and service provider governance).

  • Establish an AI and digital risk committee, or formally extend the mandate of existing committees to cover these areas, with clear reporting lines to the board.
  • Implement regular board training on AI risk, digital resilience, and crypto governance, covering relevant articles and obligations.
  • Ensure that regulatory mapping is performed so the board understands where the institution is exposed to high-risk requirements under each framework.
  • Require internal audit and compliance assurance on model classification, third-party risk management, and crypto compliance — with explicit reference to regulatory provisions.
  • Oversee third-party and vendor risk with attention to contractual controls, exit strategies, and testing of contingency plans.
  • Integrate these topics into standing board and committee agendas, supported by structured reporting (model inventories, risk dashboards, compliance registers).
  • Smaller institutions should focus board attention where risk exposure is greatest, using group support where appropriate but ensuring local accountability.
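To make the "structured reporting" point above concrete, the sketch below shows one possible entry in an AI model inventory that a board dashboard could draw on. This is a minimal illustration under stated assumptions: none of the frameworks prescribes a schema, the field names and tier labels are hypothetical, and the tiers merely echo the AI Act's risk categories.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class AIActTier(Enum):
    """Illustrative labels mirroring the AI Act's risk categories."""
    PROHIBITED = "prohibited"      # Article 5 practices
    HIGH_RISK = "high_risk"        # Article 6 / Annex III systems
    TRANSPARENCY = "transparency"  # Article 50 obligations
    MINIMAL = "minimal"            # no specific obligations

@dataclass
class ModelInventoryEntry:
    """One row of a hypothetical board-level AI model inventory."""
    name: str
    business_use: str
    tier: AIActTier
    human_oversight: bool                    # Article 14 control documented?
    third_party_vendor: Optional[str] = None  # feeds the third-party register

    def flags(self) -> list:
        """Return review flags a board report might surface for this model."""
        out = []
        if self.tier == AIActTier.HIGH_RISK and not self.human_oversight:
            out.append("high-risk model lacks documented human oversight")
        if self.third_party_vendor:
            out.append("vendor dependency: " + self.third_party_vendor)
        return out

# Hypothetical example entry: a high-risk credit-scoring model
# sourced from an external vendor, with no oversight control recorded.
entry = ModelInventoryEntry(
    name="retail-credit-scoring-v3",
    business_use="consumer credit decisioning",
    tier=AIActTier.HIGH_RISK,
    human_oversight=False,
    third_party_vendor="CloudScore Ltd",
)
print(entry.flags())
```

A register built from entries like this gives the board the evidence trail supervisors look for: which models exist, how each is classified, and where oversight or vendor gaps remain open.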

Independent directors play a key role in ensuring these frameworks are implemented through meaningful governance: clear structures, active challenge, and evidence of oversight in board minutes, reporting, and decision-making. This is no longer an optional best practice — it is a regulatory expectation.

At Fund Guardian, my colleagues Dr. Angelina Pramova, CESGA®, Guillem Liarte, and I support firms in executing their AI and oversight strategies — offering tools, analytics, and expertise to accelerate implementation, reduce risk, and build long-term governance capability. Contact us here.


This document is provided for general informational and governance purposes only. It does not constitute legal, regulatory, or compliance advice. Directors and boards should seek appropriate professional advice before making decisions based on these frameworks.