
AI Act Readiness: step by step

This page explains the three core steps of our AI Act Readiness Assessment. Each step is designed to give you clarity on how the EU AI Act applies to your use of AI, where potential risks may exist, and what actions are needed to move forward. The focus is practical and business-oriented.


AI Act Readiness, explained clearly

Clear, practical explanations of how the EU AI Act applies to your use of AI.

1. AI usage assessment

This step focuses on understanding how AI is actually used in your business today. Many organizations use AI indirectly through third-party software without realizing it. The purpose is to create a complete and realistic overview, not a technical audit.

2. AI Act risk and gap analysis

Once AI use cases are identified, the next step is to assess how they are treated under the EU AI Act. This analysis focuses on classification, obligations, and gaps between current practices and regulatory requirements.

3. Prioritized action plan

The final step turns analysis into action. Based on the identified risks and gaps, you receive a prioritized set of recommended next steps tailored to your organization.

1. AI usage assessment

Typical examples include:

  • AI used in HR systems for CV screening or candidate ranking
  • Customer support tools with automated chat or ticket prioritization
  • Marketing platforms using AI for targeting, personalization, or lead scoring
  • Internal tools for forecasting, analytics, or process automation

What matters is not how advanced a system is, but where AI influences decisions, processes, or people. This overview becomes the foundation for all further analysis under the EU AI Act: if an AI use case is not identified here, it cannot be properly assessed later.

    Why this step matters:

    Without a clear inventory of AI use, companies often underestimate their regulatory exposure or focus on the wrong systems.

    2. AI Act risk and gap analysis

    In practice, this includes:  

  • Determining whether a use case falls under prohibited, high-risk, or limited-risk categories
  • Identifying whether specific obligations may apply, such as documentation, transparency, or human oversight
  • Reviewing governance, data handling, and accountability structures at a high level
  • Highlighting where existing processes may not meet expected requirements

    The goal is not legal interpretation in isolation, but a practical understanding of exposure and priorities. Some AI uses may require immediate attention, while others may pose minimal risk.

    Why this step matters:

    The EU AI Act applies differently depending on risk level. Treating all AI systems the same often leads to unnecessary work or missed compliance issues.

    3. Prioritized action plan

    This typically includes:

    • Clear actions mapped to specific AI use cases
    • Prioritization based on regulatory risk and business impact
    • Guidance on what needs to be addressed now versus later
    • Suggested ownership and next decisions for management or compliance teams

    The action plan is designed to support decision-making, not replace it. It provides structure and clarity so teams can move forward confidently, whether that means adjusting processes, engaging vendors, or planning further assessments.


    Why this step matters:

    Compliance is not achieved by analysis alone. A clear and realistic action plan ensures effort is spent where it matters most and avoids over- or under-reacting to the regulation.

    “Clariom AI gave us clear, actionable steps to prepare for the EU AI Act. Their assessment was practical, focused, and easy to implement.”

    Taylor Schmidt
    Head of Operations  