Training  /  October 08, 2024  -  October 09, 2024

EU AI Act and High-Risk AI Systems

What is the EU AI Act?

Proposed by the European Commission, the Artificial Intelligence Act (AI Act) creates a unified regulatory and legal framework for artificial intelligence across the EU. The AI Act adopts a risk-based approach, imposing escalating legal and technical requirements based on the risk an AI system may pose to the public.
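The risk-based approach can be pictured as a small classification: the Act distinguishes (roughly) unacceptable, high, limited, and minimal risk, with stricter obligations at each step up. The sketch below is purely illustrative; the tier names are from the Act, but the one-line obligation summaries are simplified paraphrases, not legal text.

```python
from enum import Enum

class RiskTier(Enum):
    """Simplified view of the AI Act's risk tiers (illustrative only)."""
    UNACCEPTABLE = "prohibited practices, e.g. social scoring by public authorities"
    HIGH = "strict requirements: risk management, data governance, documentation, human oversight"
    LIMITED = "transparency obligations, e.g. disclosing that a user interacts with an AI system"
    MINIMAL = "no additional obligations beyond existing law"

def obligations(tier: RiskTier) -> str:
    # Escalating requirements: the stricter the tier, the heavier the obligations.
    return tier.value

print(obligations(RiskTier.HIGH))
```

High-risk systems, the focus of this training, sit in the second tier and carry the bulk of the Act's technical requirements.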

EU AI Act training

Our EU AI Act training brings together best practices in assurance argumentation for safety-critical systems to address the requirements for high-risk AI systems in the EU AI Act. We cover:

  • Key principles, objectives, and definitions from the EU AI Act
  • Relevant stakeholders and implications across the AI value chain
  • Overview and analysis of the requirements for high-risk AI systems
  • A verification methodology for industry-specific use cases, which supports you in:
    • Ensuring high-quality training, validation, and testing data, and minimizing bias in data sets
    • Selecting methods for the design, training, and evaluation of ML models
    • Providing complete technical documentation in an automated manner
    • Establishing post-market monitoring and verification of safety and performance properties throughout the lifecycle
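As a flavour of the first bullet above, one elementary data-quality check is flagging class imbalance in a labelled training set. The function below is a hypothetical illustration, not part of the training's actual methodology; a real data-governance check under the AI Act would also cover representativeness, errors, and completeness.

```python
from collections import Counter

def check_class_balance(labels, max_ratio=3.0):
    """Return (ok, ratio): ok is False when the most frequent class
    outnumbers the rarest by more than max_ratio -- one simple proxy
    for bias in a labelled data set. (Illustrative sketch only.)"""
    counts = Counter(labels)
    ratio = max(counts.values()) / min(counts.values())
    return ratio <= max_ratio, ratio

# A 90/10 split exceeds the 3.0 threshold and is flagged as imbalanced.
ok, ratio = check_class_balance(["cat"] * 90 + ["dog"] * 10)
print(ok, ratio)  # -> False 9.0
```

Automated checks like this are one way such data-quality criteria can be verified repeatably across the lifecycle rather than audited by hand.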

Target audience

If you are looking to bridge the gap between the high-level AI regulations within the EU AI Act and detailed requirements for AI, this training is for you. We welcome anyone looking to expand their knowledge on the EU AI Act and understand how it will impact their work.

If three or more participants attend from one company, you can also bring along a company-specific use case, which the Fraunhofer IKS experts will examine together with you.