
INTRODUCTION
Artificial intelligence has grown tremendously over the years, to the point where people fear a world that may soon be controlled by AI. In response to the unmet ethical and safety challenges surrounding AI, the European Union has taken a major step by formulating the Artificial Intelligence Act. Proposed by the European Commission in April 2021 and agreed by the European Parliament and the Council in December 2023, the AI Act addresses potential risks to citizens’ health, safety, and fundamental rights. As the first comprehensive legislative framework on artificial intelligence, the EU AI Act aims to balance AI adoption with its ethical and responsible use. The Act is part of the European Union’s broader digital transformation strategy. It adopts a risk-based approach, under which AI systems are regulated by classifying them into different categories based on their potential risks. The AI Act will be applied in stages, with certain obligations taking effect earlier than others.
DEFINITIONS
Before moving to the provisions of the EU AI Act and its applicability, it is important to understand the prominent definitions under the Act. The important definitions are the following:
- ‘AI system’ means a machine-based system that is designed to operate with varying levels of autonomy and that may exhibit adaptiveness after deployment, and that, for explicit or implicit objectives, infers, from the input it receives, how to generate outputs such as predictions, content, recommendations, or decisions that can influence physical or virtual environments1.
- ‘Risk’ means the combination of the probability of an occurrence of harm and the severity of that harm2.
- ‘Provider’ means a natural or legal person, public authority, agency or other body that develops an AI system or a general-purpose AI model, or that has an AI system or a general-purpose AI model developed, and places it on the market or puts the AI system into service under its own name or trademark, whether for payment or free of charge3.
- ‘Deployer’ means a natural or legal person, public authority, agency or other body using an AI system under its authority, except where the AI system is used in the course of a personal non-professional activity4.
- ‘Authorised representative’ means a natural or legal person located or established in the Union who has received and accepted a written mandate from a provider of an AI system or a general-purpose AI model to, respectively, perform and carry out on its behalf the obligations and procedures established by this Regulation5.
- ‘Importer’ means a natural or legal person located or established in the Union that places on the market an AI system that bears the name or trademark of a natural or legal person established in a third country6.
- ‘Distributor’ means a natural or legal person in the supply chain, other than the provider or the importer, that makes an AI system available on the Union market7.
APPLICATION
The Act applies to AI systems that affect people in the EU and covers the following actors:
- Providers placing AI systems on the EU market, whether established inside or outside the EU
- Deployers of AI systems located or established within the EU
- Importers and distributors making AI systems available in the EU
- Product manufacturers placing AI-integrated products on the EU market under their own name or trademark

SALIENT FEATURES
The AI Act applies a risk-based approach, dividing AI systems into different risk levels: unacceptable, high, limited and minimal risk.8 AI systems that involve unacceptable risks are prohibited. Such AI systems have a negative impact on fundamental values and on people's safety; they enable manipulation, social control and exploitation. Behavioural manipulation, social scoring, biometric categorisation based on sensitive characteristics, and real-time remote biometric identification systems are the major examples of unacceptable risk. This kind of AI system tends to encroach upon vulnerable and marginalised sections of society and ultimately poses a threat to humanity itself. Hence, AI systems posing unacceptable risks are prohibited under the EU AI Act.
The next level of the risk-based approach is the high-risk level. AI systems that pose a high risk to fundamental values and safety are considered high-risk. Such AI systems must be kept under constant check. An AI system is considered high-risk when it meets the criteria set out in Article 6(1) of the AI Act: the AI system is intended to be used as a safety component of a product (or is itself a product) covered by the specific EU legislation listed in Annex I (e.g., toys, vehicles, medical devices), and that legislation requires a third-party conformity assessment before the product or AI system is placed on the EU market.
Alternatively, under Article 6(2) of the Act, Annex III contains a list of AI systems that must be considered high-risk. Examples include AI systems or products deployed (subject to specific conditions and exemptions) for biometrics, critical infrastructure, education and vocational training, employment, worker management and access to self-employment, etc.
The next level of risk is limited risk. AI systems involving limited risk are considered to pose some risk and are subject to less stringent requirements, chiefly transparency obligations: users must be made aware that they are interacting with AI or that content has been artificially generated or manipulated. AI systems such as chatbots, “deepfakes” (‘deep fake’ means AI-generated or manipulated image, audio or video content that resembles existing persons, objects, places, entities or events and would falsely appear to a person to be authentic or truthful)9, artificially generated content (pictures, voice, video), artificially edited or altered content, and emotion recognition systems fall under the category of limited risk.
The next category of the risk-based approach is minimal risk. AI systems that pose little or no risk come under this category. Spam filters, ad blockers and AI-enabled video games are examples of AI systems involving minimal risk.
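The tiered logic described above can be pictured as a simple decision sequence. The following Python sketch is purely illustrative and not a legal test: the tier names mirror the Act, but the attributes, the decision order and the example system are simplified assumptions rather than the actual criteria of Articles 5 and 6 and Annexes I and III.

```python
# Illustrative sketch only -- not legal advice and not the Act's actual test.
# Tier names mirror the Act; the attributes and decision order are simplified
# assumptions for illustration.
from dataclasses import dataclass
from enum import Enum


class RiskTier(Enum):
    UNACCEPTABLE = "prohibited (Article 5)"
    HIGH = "high-risk (Article 6, Annexes I and III)"
    LIMITED = "limited risk (transparency obligations)"
    MINIMAL = "minimal risk (no mandatory obligations)"


@dataclass
class AISystem:
    uses_prohibited_practice: bool          # e.g. social scoring, manipulation
    annex_i_safety_component: bool          # safety component of an Annex I product
    needs_third_party_assessment: bool      # Annex I legislation requires it
    listed_in_annex_iii: bool               # e.g. biometrics, education, employment
    generates_or_manipulates_content: bool  # chatbots, deepfakes, etc.


def classify(system: AISystem) -> RiskTier:
    """Map an AI system to a risk tier following the order described above."""
    if system.uses_prohibited_practice:
        return RiskTier.UNACCEPTABLE
    if (system.annex_i_safety_component and system.needs_third_party_assessment) \
            or system.listed_in_annex_iii:
        return RiskTier.HIGH
    if system.generates_or_manipulates_content:
        return RiskTier.LIMITED
    return RiskTier.MINIMAL


# Example: a CV-screening tool listed in Annex III is classified as high-risk.
cv_screener = AISystem(False, False, False, True, False)
print(classify(cv_screener))  # RiskTier.HIGH
```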
TIMELINE
The EU AI Act is being enforced in a staggered manner. The provisions of the Act will be fully applicable by 2026.
- On 2 February 2025, the ban on AI systems posing unacceptable risks started to apply.
- Codes of practice will apply nine months after entry into force.
- Rules on general-purpose AI systems (‘general-purpose AI model’ means an AI model, including where such an AI model is trained with a large amount of data using self-supervision at scale, that displays significant generality and is capable of competently performing a wide range of distinct tasks regardless of the way the model is placed on the market and that can be integrated into a variety of downstream systems or applications, except AI models that are used for research, development or prototyping activities before they are placed on the market)10 that need to comply with transparency requirements will apply 12 months after entry into force11.
- All obligations concerning high-risk AI systems become applicable 36 months after the Act's entry into force; such AI systems therefore have time to remove the risk elements identified under the Act.
COMPLIANCE
Certain restrictions apply to AI systems falling under the different risk levels, and these systems must comply with specific rules before they can operate. An illustrative self-assessment sketch follows the role-specific obligation lists below.
OBLIGATIONS FOR HIGH-RISK SYSTEMS
All providers of high-risk AI systems must comply with the following:
- Registration with the EU database
- Implementation of adequate risk management
- Appropriate data governance measures to detect, prevent and mitigate possible biases.
- Technical documentation
- Human oversight
- Event logging for traceability (logging should include usage, data and personnel identification (if any))
- Robust Cybersecurity
- Report serious incidents to the market surveillance authorities of the relevant EU Member State and take appropriate measures to mitigate such risks.
All deployers of high-risk AI systems must comply with the following:
- Appropriate technical and organizational measures as per instructions for use
- Human oversight
- If the deployer exercises control over the input data, it must ensure the control and quality of that data
- Retention of generated logs for at least six months
- Incident reporting
- Risk management
All importers and distributors of high-risk AI systems must comply with the following:
- Verify whether the conformity assessment (‘conformity assessment’ means the process of demonstrating whether the requirements set out in Chapter III, Section 2 relating to a high-risk AI system have been fulfilled)12 has been completed by the provider of the AI system
- Ensure proper technical documentation has been drawn up
- Ensure the AI system bears the CE (European Conformity) marking
- Importers must verify that the provider has appointed an authorised representative
- Notify the provider and the authorities if a risk to safety or fundamental rights has been detected from the use of the AI system.
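The role-specific obligations listed above can also be organised as a simple self-assessment checklist. The sketch below is only an illustrative aid, assuming hypothetical data-structure and function names; the obligation texts merely paraphrase the bullet points above and carry no legal weight.

```python
# Hypothetical self-assessment helper; obligation texts paraphrase the lists above.
HIGH_RISK_OBLIGATIONS = {
    "provider": [
        "Register the system in the EU database",
        "Implement risk management",
        "Apply data governance measures against bias",
        "Maintain technical documentation",
        "Ensure human oversight",
        "Enable event logging for traceability",
        "Ensure robust cybersecurity",
        "Report serious incidents to market surveillance authorities",
    ],
    "deployer": [
        "Follow the instructions for use",
        "Ensure human oversight",
        "Control the quality of input data under its control",
        "Retain generated logs for at least six months",
        "Report incidents",
        "Manage risks",
    ],
    "importer_or_distributor": [
        "Verify the provider's conformity assessment",
        "Check technical documentation",
        "Check the CE marking",
        "Verify the authorised representative (importers)",
        "Notify the provider and authorities of detected risks",
    ],
}


def outstanding_obligations(role: str, completed: set[str]) -> list[str]:
    """Return the obligations for a role that are not yet marked as completed."""
    return [o for o in HIGH_RISK_OBLIGATIONS[role] if o not in completed]


# Example: a deployer that has only set up human oversight so far.
print(outstanding_obligations("deployer", {"Ensure human oversight"}))
```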

OBLIGATIONS FOR LIMITED AND MINIMAL RISK SYSTEMS
AI systems posing limited or minimal risk are not subject to stringent restrictions, as they pose little or no harm to people. Limited-risk systems must meet the transparency requirements described above, while minimal-risk systems are encouraged to follow general codes of ethical conduct; this is widely promoted to foster the healthy use of AI systems13.
PENALTIES AND PUNISHMENTS
- For non-compliance with the prohibitions under Article 5: fines of up to €35 million or, if the offender is a company, up to 7% of its total worldwide annual turnover for the preceding financial year, whichever is higher (see the arithmetic sketch after this list).
- For violations of the obligations under Articles 16, 22, 23, 24, 26, 31, 33, 34 and 50 (which include the transparency obligations): fines of up to €15 million or, if the offender is a company, up to 3% of its total worldwide annual turnover for the preceding financial year, whichever is higher.
- Supplying misleading or incorrect information to notified bodies or national competent authorities will lead to fines of up to €7.5 million or 1% of total worldwide annual turnover, whichever is higher.
- For Union institutions, bodies and agencies: fines of up to €1.5 million for non-compliance with the prohibitions under Article 5 and up to €750,000 for non-compliance with other requirements applicable to AI systems.
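The fine ceilings above combine a fixed cap with a percentage of worldwide turnover, whichever is higher. The short sketch below illustrates that arithmetic for the Article 5 cap; the function name and the example turnover figure are hypothetical.

```python
# Hypothetical illustration of the "fixed cap or % of worldwide turnover,
# whichever is higher" rule for the maximum possible fine.
def max_fine(fixed_cap_eur: float, turnover_share: float, annual_turnover_eur: float) -> float:
    """Return the upper limit of the fine for a company."""
    return max(fixed_cap_eur, turnover_share * annual_turnover_eur)


# Example: Article 5 violation by a company with EUR 1 billion worldwide turnover.
# max(EUR 35 million, 7% of EUR 1 billion = EUR 70 million) -> EUR 70 million cap.
print(max_fine(35_000_000, 0.07, 1_000_000_000))  # 70000000.0
```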
When determining whether to impose an administrative fine and deciding its amount in each individual case, all relevant circumstances of the specific situation must be taken into account and, where appropriate, the following factors considered:
- The nature, gravity, and duration of the infringement.
- The intentional or negligent character of the infringement.
- The size, annual turnover, and market share of the offender.
- Any financial benefits gained or losses avoided due to the infringement.
- The degree of cooperation with authorities to remedy the infringement.
- The degree of responsibility of the operator, considering implemented technical and organizational measures.
- Actions taken to mitigate harm suffered by affected persons14.
EXCEPTIONS
The following are the exceptions given under the EU AI Act:
- AI developed exclusively for military, defence or national security purposes
- AI used by public authorities in third countries under international cooperation agreements for law enforcement and judicial cooperation
- AI developed and used solely for scientific research and development
- AI systems in the research, testing or development phase, before being placed on the market
- AI used for purely personal, non-professional activities
Exceptions to high-risk AI systems:
The Act also excludes from the high-risk category certain AI systems that:
- Perform narrow procedural tasks
- Improve the results of previous human activities
- Do not influence human decisions
- Perform purely preparatory tasks
CONCLUSION
The EU AI Act is a comprehensive legislative framework that affects not only EU companies and businesses but also global tech companies. The major aim of the Act is to set a global standard for the use of AI by striking a safe and ethical balance. Even though it primarily ensures the protection of EU citizens from unauthorised and dangerous uses of AI, it has an impact worldwide, as many tech companies based outside the EU also come under its ambit. Transparency is the key to this Act: every AI system should be transparent enough to meet the safety and ethical standards required to operate in the European Union. The downside of the Act is that it may deter new companies from entering the European Union market, as the compliance cost is high. These regulations could also hinder AI innovation and lead to increased operational costs, and new startups may find it difficult to survive in the EU market as a result of the Act's strict compliance requirements. The first step for every company with AI systems is to identify which category it falls under, i.e., provider, deployer, etc. The next step is to assess whether, and to what extent, its AI system is connected to the European Union for the purposes of the Act's application. It is crucial to understand where your AI system lies on the risk spectrum (e.g., Article 5 for prohibited AI, Article 6 for high-risk AI, etc.), as this will establish specific obligations related to the risks under the AI Act. Apart from some practical issues and restrictions on the use of AI, the overall intent behind the EU AI Act is to provide for a world order in which there is no fear of AI becoming God.
References
- European Union Artificial Intelligence Act, No. 2024/1689, Acts of Parliament, 2024 (EU)
- https://artificialintelligenceact.eu/
- https://www.europarl.europa.eu/topics/en/article/20230601STO93804/eu-ai-act-first-regulation-on-artificial-intelligence
- https://kpmg.com/xx/en/our-insights/eu-tax/decoding-the-eu-artificial-intelligence-act.html
- https://ai-act-law.eu/article/99/
Footnotes
- European Union Artificial Intelligence Act § art.3(1), No. 2024/1689, Acts of Parliament, 2024 (EU) ↩︎
- European Union Artificial Intelligence Act § art.3(2), No. 2024/1689, Acts of Parliament, 2024 (EU) ↩︎
- European Union Artificial Intelligence Act § art.3(3), No. 2024/1689, Acts of Parliament, 2024 (EU) ↩︎
- European Union Artificial Intelligence Act § art.3(4), No. 2024/1689, Acts of Parliament, 2024 (EU) ↩︎
- European Union Artificial Intelligence Act § art.3(5), No. 2024/1689, Acts of Parliament, 2024 (EU) ↩︎
- European Union Artificial Intelligence Act § art.3(6), No. 2024/1689, Acts of Parliament, 2024 (EU) ↩︎
- European Union Artificial Intelligence Act § art.3(7), No. 2024/1689, Acts of Parliament, 2024 (EU) ↩︎
- European Council, Artificial Intelligence Act Trilogue: Press Conference - Part 4 (Dec. 9, 2022) [Video] ↩︎
- European Union Artificial Intelligence Act § art.3(60), No. 2024/1689, Acts of Parliament, 2024 (EU) ↩︎
- European Union Artificial Intelligence Act § art.3(63), No. 2024/1689, Acts of Parliament, 2024 (EU) ↩︎
- TOPICS, https://www.europarl.europa.eu/topics/en/article/20230601STO93804/eu-ai-act-first-regulation-on-artificial-intelligence (last visited Apr.27, 2025) ↩︎
- European Union Artificial Intelligence Act § art.3(20), No. 2024/1689, Acts of Parliament, 2024 (EU) ↩︎
- KPMG, https://kpmg.com/xx/en/our-insights/eu-tax/decoding-the-eu-artificial-intelligence-act.html (last visited Apr. 27, 2025) ↩︎
- INTERSOFT CONSULTING, https://ai-act-law.eu/article/99/ (last visited Apr.27, 2025) ↩︎