Does This Even Apply to Me?
Short answer: yes. The EU AI Act doesn't apply only to AI developers and tech giants. Every company that uses AI systems (even if it's just ChatGPT for drafting emails or an AI tool for accounting) falls under the regulation. And the clock is already ticking.
Since 2 February 2025, Article 4 has been in effect: the AI literacy obligation. On 2 August 2026, the full requirements for high-risk AI systems take effect. The fines are significant: up to €35 million or 7% of global annual turnover.
The Four Risk Categories: Simply Explained
The EU AI Act classifies AI systems by risk:
Unacceptable risk: Banned. Social scoring, manipulative AI, real-time biometric surveillance in public spaces. Rarely relevant for most SMEs.
High risk: Strict requirements. AI in recruitment, credit scoring, safety components. Extensive documentation, testing, and monitoring obligations apply.
Limited risk: Transparency obligations. Chatbots and AI-generated content must be labelled as such. Relevant for many SMEs using AI in customer-facing roles.
Minimal risk: No specific requirements. Spam filters, AI-powered spell checkers, internal automations. But: the AI literacy obligation still applies.
Article 4: The Obligation Many Overlook
The AI literacy obligation has been in force since February 2025 and is frequently underestimated. In essence: every employee who operates AI systems or works with their outputs must have a sufficient level of AI knowledge.
In practical terms, this means:
- Understanding how the AI system works: What can it do, what can't it?
- Being able to assess outputs: When is an AI result reliable, when isn't it?
- Knowing the risks: Data privacy, bias, hallucinations
- Understanding responsibility: Who is liable when things go wrong?
The AI Office has clarified: Formal certifications are not required. But training measures must be documented.
Your 5-Point Checklist
1. Create an AI Inventory
List all AI systems in use across your company. This includes SaaS tools with embedded AI. Many businesses use more AI than they realise.
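Even a lightweight structured record beats a mental list. The sketch below shows one possible shape for an inventory entry; the class name, fields, and example tools are all illustrative assumptions, not prescribed by the Act.

```python
from dataclasses import dataclass, field

@dataclass
class AITool:
    """One entry in a company AI inventory (illustrative fields only)."""
    name: str                                        # tool name
    vendor: str                                      # who provides it
    use_case: str                                    # what it is used for in practice
    users: list[str] = field(default_factory=list)   # teams or roles that use it
    embedded: bool = False                           # AI embedded inside a larger SaaS product?

# A hypothetical SME inventory, including an embedded-AI SaaS feature
inventory = [
    AITool("ChatGPT", "OpenAI", "drafting emails", ["Sales", "HR"]),
    AITool("Invoice OCR", "AcmeSoft", "accounting automation", ["Finance"], embedded=True),
]

for tool in inventory:
    print(f"{tool.name}: {tool.use_case} (embedded={tool.embedded})")
```

Capturing the `embedded` flag explicitly helps surface exactly the AI that "many businesses use more of than they realise".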
2. Assess Risk Levels
Assign each system to a risk category. Most applications in mid-sized businesses fall under "limited" or "minimal" risk. But check carefully: an AI tool that assists with candidate screening is already "high risk".
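A rough first-pass triage can be scripted before the legal review. This sketch uses simple keyword matching over hypothetical use-case descriptions; it is emphatically not legal advice and no substitute for checking the Act's Annex III, but it shows the triage logic.

```python
# Illustrative keyword lists; the authoritative list is Annex III of the EU AI Act.
HIGH_RISK_USES = {"recruitment", "credit scoring", "candidate screening",
                  "safety component"}
LIMITED_RISK_USES = {"chatbot", "content generation"}

def risk_category(use_case: str) -> str:
    """Very rough first-pass triage of an AI use case (not legal advice)."""
    text = use_case.lower()
    if any(term in text for term in HIGH_RISK_USES):
        return "high"
    if any(term in text for term in LIMITED_RISK_USES):
        return "limited"
    # Default bucket; the AI literacy obligation still applies here.
    return "minimal"

print(risk_category("AI-assisted candidate screening"))  # high
print(risk_category("customer-facing chatbot"))          # limited
print(risk_category("spam filter"))                      # minimal
```

The point of the default branch matters: "minimal" is not "out of scope", since Article 4 applies regardless of category.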
3. Develop a Training Plan
Identify who works with AI and what knowledge gaps exist. Combine general AI awareness training with role-specific sessions. This doesn't need to be a major project; half-day workshops with supporting materials are often sufficient.
4. Establish Documentation
Record which systems are in use, who was trained and when, and what risk assessments were conducted. This documentation is your safeguard in case of an audit.
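The training log can be as simple as a dated table exported to CSV. The records and field names below are hypothetical; the point is that "who, what, when" is captured in a form you can hand over during an audit.

```python
import csv
import io
from datetime import date

# Hypothetical training log: who was trained, on what topic, and when.
training_log = [
    {"employee": "A. Smith", "topic": "AI awareness basics", "date": date(2025, 3, 10)},
    {"employee": "B. Jones", "topic": "Reviewing AI outputs in HR", "date": date(2025, 4, 2)},
]

# Serialise to CSV so the record can be stored or shared outside any one tool.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["employee", "topic", "date"])
writer.writeheader()
writer.writerows(training_log)
print(buf.getvalue())
```

The same pattern extends naturally to the system inventory and risk assessments: one row per event, with a date on every row.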
5. Set Up Monitoring
AI compliance isn't a one-off exercise. New tools, updates, changed use cases: all require regular review. Define a rhythm, such as quarterly check-ins.
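A fixed cadence is easy to automate. This minimal sketch assumes a quarterly interval (the 90-day constant is an arbitrary choice, not a legal requirement) and computes the next review date from the last one.

```python
from datetime import date, timedelta

REVIEW_INTERVAL_DAYS = 90  # quarterly cadence; an assumption, not mandated by the Act

def next_review(last_review: date) -> date:
    """Return the date the next compliance check-in is due."""
    return last_review + timedelta(days=REVIEW_INTERVAL_DAYS)

print(next_review(date(2025, 1, 2)))  # 2025-04-02
```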
Good News: SME Exemptions
The EU AI Act includes provisions for SMEs that many don't know about:
- Simplified documentation requirements: less bureaucratic burden than for large enterprises
- Reduced fines: proportional to company size
- Regulatory sandboxes: test environments where AI systems can be trialled under supervision
- Free guidance resources: The European Commission has published a repository with over 40 best-practice examples for AI literacy
Timeline: What's Due When?
| Deadline | What applies |
|----------|--------------|
| Since Feb 2025 | AI literacy obligation (Art. 4), all companies; ban on unacceptable AI systems |
| Since Aug 2025 | Obligations for general-purpose AI models and governance rules |
| Aug 2026 | Full requirements for high-risk AI |
| Dec 2027 | Extended high-risk obligations (Digital Omnibus postponement) |
Bottom Line
The EU AI Act isn't a brake; it's a framework that builds trust. SMEs that act now don't just avoid fines, they position themselves as responsible technology users. The five steps above can be implemented in a matter of weeks. Don't wait for August 2026. The AI literacy obligation already applies today.


