EU AI Act: Which rules apply to you?
  • EU AI Act: which rules apply to you?

    Based on the final text of the AI Act.
  • This form will help you determine whether the AI Act applies to you. If you encounter terms you are unsure about, visit our website bg.legal/datalawhub to find out more. We have tried to keep this form as compact as possible.

  • AI systems and organisation roles

  • An AI system is:

    • a machine-based system designed to operate with varying levels of autonomy and
    • that may exhibit adaptiveness after deployment and
    • that, for explicit or implicit objectives, infers, from the input it receives, how to generate outputs such as predictions, content, recommendations, or decisions that can influence physical or virtual environments.

    Article 3(1) AI Act

     

    A general-purpose AI model is:

    • an AI model, including when trained with a large amount of data using self-supervision at scale,
    • that displays significant generality and is capable of competently performing a wide range of distinct tasks, regardless of the way the model is placed on the market, and
    • that can be integrated into a variety of downstream systems or applications.

    This does not cover AI models used for research, development and prototyping activities before they are placed on the market.

    Article 3(63) AI Act

     

    A general-purpose AI system is:

    • an AI system which is based on a general-purpose AI model,
    • that has the capability to serve a variety of purposes, both for direct use and for integration in other AI systems.

    Article 3(66) AI Act

  • Within the AI Act, there are a few roles which an organisation can have:

    • Provider: develops an AI system or has one developed, and places it on the market or puts it into service under its own name or trademark.
    • Deployer: uses AI systems under its authority.
    • Authorised representative: represents providers of AI systems for the purposes of the AI Act.
    • Importer: imports AI systems from outside the EU.
    • Distributor: distributes AI systems and makes them available on the market.
    • Product manufacturer: makes products containing AI systems.

    These are all also called operators.

    Article 3(3)-(8) AI Act

  • Scope of the AI Act


    The AI Act applies to:

    • Providers who place AI systems on the market, put them into service or place general-purpose AI models on the market in the EU.
    • Deployers of AI systems which are located within the EU.
    • Providers and deployers of AI systems which are located in third countries, but whose AI systems' output is used in the EU.
    • Importers and distributors of AI systems.
    • Product manufacturers which place AI systems on the market or put them into service together with their product and under their own name.
    • Authorised representatives of providers who are not located in the EU.
    • Affected persons located in the EU.

    Article 2(1) AI Act

  • Limitations of scope

  • The AI Act does not apply in the following circumstances:

    • Areas outside the scope of EU law and in the area of national security.
    • To AI systems placed on the market or put into service exclusively for military, defence or national security purposes.
    • To AI systems whose output is used in the Union exclusively for military, defence or national security purposes.
    • To public authorities of third countries using AI systems within international legal cooperation frameworks for law enforcement and judicial purposes.
    • To AI systems which were developed and put into service purely for scientific research and development.
    • To research, testing and development of AI systems or models prior to being placed on the market or into service.
    • To deployers who are natural persons using AI systems for purely personal non-professional activities.
    • To AI systems released under free and open-source licences, unless they are placed on the market or put into service as high-risk AI systems or as AI systems falling under the prohibited practices or transparency obligations.

    Article 2

  • Prohibited practices

  • Some applications of AI systems are considered prohibited:

    • Subliminal or manipulative techniques which aim to materially distort behaviour and limit the ability to make informed decisions.
    • Exploitation of vulnerabilities of groups due to their age, disability or specific socio-economic situation, with the goal or effect of materially distorting their behaviour.
    • Biometric categorisation systems which deduce or infer race, political opinions, trade union membership, religious or philosophical beliefs, sex life or sexual orientation.
    • Social scoring, if it leads to unfavourable treatment in social contexts unrelated to those in which the data was generated, or treatment that is disproportionate to the behaviour.
    • Real-time remote biometric identification in public spaces for the purpose of law enforcement, unless strictly necessary for:
      • A targeted search for victims of abduction, human trafficking or sexual exploitation as well as missing persons.
      • the prevention of a specific, substantial and imminent threat to the physical safety of people or a genuine and present or foreseeable threat of a terrorist attack.
      • Localisation or identification of a suspect of a serious criminal offence.
    • Assessing the risk of a person committing a criminal offence based solely on profiling or on their personality traits and characteristics.
    • Creating or expanding facial recognition databases through scraping of images from the internet or CCTV footage.
    • Inferring emotions of people in the workplace and in education institutions, except when the AI system is intended for medical or safety purposes.

    Article 5 AI Act

  • High-risk AI systems

  • A safety component is defined as: A part that has a critical safety function. If this component fails or malfunctions, it can endanger the health and safety of users.

    Article 3(14) AI Act

  • The AI Act contains a list of critical areas and use cases which are considered high-risk:

    • Biometrics, insofar as their use is permitted under the relevant law, including:
      • Remote biometric identification systems (this does not include biometric verification meant to confirm someone's claimed identity).
      • Biometric categorisation systems.
      • Emotion recognition systems.
    • Management and operation of critical infrastructure (digital, road traffic and supply of water, gas, heating and electricity).
    • Education and vocational training:
      • Determining access or admission to educational institutions in the broadest sense.
      • Evaluation of learning outcomes.
      • Assessing the appropriate level of education an individual will receive or be able to access.
      • Monitoring or detecting prohibited behaviour during tests.
    • Employment, workers management and access to self-employment:
      • Recruitment or selection of people, including placing vacancies, filtering applications and evaluating candidates.
      • Making decisions on work-related relationships, promotion and termination, allocation of tasks based on behaviour and evaluation of performance.
    • Access to and enjoyment of essential private services and benefits:
      • Evaluation of eligibility of people for essential public services.
      • Evaluation of creditworthiness of people, except for the purpose of detecting financial fraud.
      • Evaluating and classifying emergency calls or dispatching emergency first response services.
      • Assessing risks and pricing for life and health insurance.
    • Law enforcement:
      • Assessing the risk of a person to become a victim of criminal offences.
      • Use by law enforcement of polygraphs and similar tools.
      • Evaluating reliability of evidence in an investigation or prosecution of criminal offences.
      • Assessing the risk of offending or re-offending of a person for the purpose of law enforcement outside the scope of the prohibited practice.
      • Profiling people in support of law enforcement for the detection, investigation or prosecution of criminal offences.
    • Migration, asylum and border control management:
      • Polygraphs and similar tools for competent authorities.
      • Assessing risks posed by a person who wants to enter a member state for competent authorities.
      • Assisting competent authorities when assessing applications for asylum, visa and residence permits and associated complaints.
      • Detecting, recognising or identifying people in the context of migration, asylum and border control management, except for the verification of travel documents.
    • Administration of justice and democratic processes:
      • Assisting a judicial authority in researching and interpreting facts and the law and applying the law to a concrete set of facts.
      • Influencing the outcome of an election or referendum or the voting behaviour of people. This does not include tools to organise, optimise and structure political campaigns for administrative and logistical purposes.

    Annex III AI Act

  • A significant risk is defined as:

    A risk of harm to the health, safety or fundamental rights of natural persons or to the environment. A significant risk is determined by the combination of its severity, intensity, probability of occurrence and duration of effects, and by its potential to impact individuals, groups or specific categories of people.

    An AI system does not pose a significant risk if one of the following criteria is fulfilled:

    1. The AI system is intended to perform a narrow procedural task;
    2. The AI system is intended to improve the result of a previously completed human activity;
    3. The AI system is intended to detect decision-making patterns or deviations from prior decision-making patterns and is not meant to replace or influence the previously completed human assessment, without proper human review; or
    4. The AI system is intended to perform a preparatory task to an assessment relevant for the purpose of the use cases listed in Annex III.

    An AI system shall always be considered high-risk if the AI system performs profiling of natural persons.

    Article 6 AI Act
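The Article 6 test above can be read as a small decision rule: an Annex III system escapes the high-risk classification only if it meets at least one of the four criteria and does not perform profiling of natural persons. The sketch below illustrates that logic with hypothetical field names of our own choosing; it is an illustration of the form's questions, not legal advice and not part of the Act.

```python
# Illustrative sketch of the Article 6 derogation logic (not legal advice).
# The field names below are our own shorthand for the four criteria listed
# above, not terminology from the AI Act itself.

from dataclasses import dataclass

@dataclass
class AnnexIIISystem:
    narrow_procedural_task: bool = False       # criterion 1
    improves_prior_human_activity: bool = False  # criterion 2
    detects_decision_patterns_only: bool = False  # criterion 3
    preparatory_task_only: bool = False        # criterion 4
    performs_profiling: bool = False           # overrides any criterion

def is_high_risk(system: AnnexIIISystem) -> bool:
    """True if an Annex III system remains high-risk under Article 6."""
    if system.performs_profiling:
        # Profiling of natural persons always keeps the system high-risk.
        return True
    meets_a_criterion = (
        system.narrow_procedural_task
        or system.improves_prior_human_activity
        or system.detects_decision_patterns_only
        or system.preparatory_task_only
    )
    # No criterion met -> no derogation -> the system stays high-risk.
    return not meets_a_criterion
```

For example, a system that only performs a narrow procedural task would not be high-risk, but the same system would become high-risk again as soon as it also profiles natural persons.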

  • Click send to see your result

  • If you want more information about your results, or want to know what they mean for your organisation, enter your email address here or email wijst@bg.legal!
