Since 2 February 2025, the first binding obligations under the EU Artificial Intelligence Act (Regulation (EU) 2024/1689) have applied. These affect not only developers but also ordinary users of tools such as ChatGPT, generative AI models, and chatbots. From 2 August 2025, the rules extend to general-purpose AI (GPAI) models, and penalty enforcement begins. Prepare early – implementation time is running out.
📍 Who Does the AI Act Apply To?
The Act applies to companies that:
- use AI for customer interaction (e.g., chatbots, voice assistants),
- generate any type of content with AI (text, images, video, audio),
- make decisions based on AI (e.g., HR scoring, credit models),
- embed or connect general-purpose AI models (GPAI) – such as GPT – into their products or services.
📌 What Obligations Does the AI Act Introduce?
The AI Act introduces obligations based on the risk level of systems:
- General users of AI: Must provide training to employees, label AI-generated outputs, and check whether their tools fall under higher regulatory categories.
- GPAI models (e.g., GPT-4, Claude, Gemini): Require technical documentation, transparent labelling of certain outputs, a copyright-compliance policy, and a summary of training content.
- Prohibited systems: Bans AI that manipulates user behaviour, performs social scoring, or exploits the vulnerabilities of children.
- High-risk systems: Include AI used in areas with high impact on individuals and society – such as hiring, employee performance evaluation, credit scoring, allocation of social benefits, medical diagnostics, or judicial and law enforcement applications. These systems are subject to stricter requirements: technical documentation, risk assessment, EU database registration, CE marking, and auditability by regulatory authorities.
✅ What You Need to Do by August 2025
1. Internal Register of AI Tools
- Create an overview of all AI tools used and assess their risk levels.
- Identify GPAI models and any high-risk systems.
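As an illustration only, such a register could start as a simple structured list. The tool names, vendors, and risk categories below are hypothetical examples, not legal classifications – actual classification requires a case-by-case legal assessment under the AI Act.

```python
from dataclasses import dataclass

# Hypothetical risk buckets loosely following the AI Act's tiers;
# the real classification must be made with legal advice.
RISK_LEVELS = ("prohibited", "high-risk", "gpai", "limited", "minimal")

@dataclass
class AITool:
    name: str         # internal or vendor tool name
    vendor: str
    purpose: str      # what the tool is used for in the company
    risk_level: str   # one of RISK_LEVELS (provisional, to be reviewed)
    is_gpai: bool     # built on a general-purpose model such as GPT?

    def __post_init__(self):
        if self.risk_level not in RISK_LEVELS:
            raise ValueError(f"unknown risk level: {self.risk_level}")

# Example entries -- names and classifications are illustrative only.
register = [
    AITool("SupportBot", "ExampleVendor", "customer chat", "limited", True),
    AITool("CVScreener", "ExampleVendor", "HR pre-screening", "high-risk", False),
]

# Flag entries that need deeper review (GPAI or high-risk).
needs_review = [t.name for t in register if t.is_gpai or t.risk_level == "high-risk"]
print(needs_review)
```

Even a spreadsheet works; the point is that every AI tool in use is listed, has an owner, and carries a provisional risk label that can be challenged and updated.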
2. Prepare for GPAI Requirements (Chapter V, Articles 53–55)
- Ensure your GPAI provider (e.g., OpenAI, Anthropic) meets the legal requirements.
- Prepare documentation, output monitoring, and labelling.
3. AI Training for Employees (Article 4)
- Mandatory for all staff using AI systems.
- Focus on risks, limitations, and legal responsibility.
4. Check Compliance of Your AI Systems (Article 5)
- Ensure you do not use prohibited AI: hidden behavioural manipulation, social scoring, predictive policing.
🏷 Labelling of AI Outputs (from 2 August 2026)
- AI-generated outputs that are published without human review must be clearly labelled.
- Exceptions apply for significantly edited or clearly satirical content.
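As a minimal sketch of how this rule could be operationalised in a publishing pipeline: the AI Act does not prescribe a specific label format, so the wording and the function below are hypothetical examples, not a legally vetted disclosure.

```python
def label_ai_output(text: str, model: str, reviewed_by_human: bool) -> str:
    """Append a plain-text AI disclosure unless a human has reviewed and
    substantially edited the content (the Act's exception for edited material).
    The label wording is illustrative only."""
    if reviewed_by_human:
        return text
    return f"{text}\n\n[AI-generated content – produced with {model}]"

# Unreviewed draft gets the disclosure; a human-edited one does not.
print(label_ai_output("Quarterly market outlook...", "gpt-4o", reviewed_by_human=False))
print(label_ai_output("Quarterly market outlook...", "gpt-4o", reviewed_by_human=True))
```

The design point is to make the labelling decision an explicit, auditable step in the workflow rather than something left to individual authors.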
🔗 Unsure Which AI Tools in Your Company Are Regulated?
We can help you:
- map and classify your AI tools,
- prepare employee training,
- set up internal policies in line with the AI Act.
Contact us at office@paulqlaw.com.
🧑‍⚖️ Summary for Legal Teams
Timeline of Main Obligations
| Date | Key Obligation | Legal Reference |
|---|---|---|
| 2 Feb 2025 | AI training (Art. 4); ban on prohibited AI practices (Art. 5) | Art. 4, 5 |
| 2 Aug 2025 | GPAI and systemic-risk model requirements (documentation, labelling, disclosures); start of penalty enforcement | Ch. V, Art. 53–55; Art. 99 |
| 2 Aug 2026 | Transparency: labelling AI-generated content, user notification when interacting with AI | Art. 50 |
| 2 Aug 2026 | Full compliance for high-risk systems (documentation, FRIA, CE marking, registration) | Ch. III & IV |
| 2 Aug 2027 | Obligations for high-risk AI systems that are safety components of products covered by EU harmonisation legislation | Art. 6(1) |
Key Facts
| Parameter | Details |
|---|---|
| Legal Basis | Regulation (EU) 2024/1689 ("AI Act") |
| Who It Applies To | Developers, deployers, distributors, general users of AI |
| Penalties | Up to €35 million or 7% of global turnover; up to €15 million or 3% for less severe breaches (Art. 99) |
| Supervisory Authority in Slovakia | In the process of designation; obligations apply regardless of national implementation status |