
EU AI Act Deadline Delayed to December 2027? What the Omnibus Vote Means


7 min read

Update (March 28, 2026): On March 26, 2026, the European Parliament voted 569-45 to approve its position on the Digital Omnibus, aligning with the Council on the December 2027 deadline for high-risk AI and adding a ban on AI nudifier apps. Trilogue negotiations are expected to begin in April 2026. Read our full coverage: Parliament Approves AI Act Delay and Nudifier Ban.

Update (March 17, 2026): On March 13, 2026, the EU Council formally adopted its negotiating mandate on the Digital Omnibus. The Council position proposes delaying stand-alone high-risk AI obligations to 2 December 2027 and product-embedded high-risk AI to 2 August 2028. The Council also added a ban on AI nudification tools and extended SME exemptions to small mid-caps. However, this is still the Council's position only — trilogue negotiations with the European Parliament must follow before any changes become law. We cover this in detail in our dedicated article on the Council vote.


If you have searched "EU AI Act deadline postponed" lately, you are not alone. Interest has surged because of the discussions around the European Commission's Digital Omnibus proposal. The short answer is simple:

No, the deadline has not been officially postponed yet.

There is a proposal in circulation and serious policy debate, but no legally adopted change is in force unless and until the full EU legislative process is completed.

For SMEs, this matters because waiting for political certainty is risky. Some obligations are already active now, and most implementation work takes months — not weeks.

Quick answer: Is the EU AI Act delayed?

  • Current legal baseline: Regulation (EU) 2024/1689 (the AI Act) remains in force.
  • Digital Omnibus status: A policy proposal and negotiations track, not final law.
  • What this means in practice: Treat any delay headlines as "possible scenario," not "confirmed fact."

If you're an SME, the safest posture is: plan against current law, while monitoring proposals.

What is the Digital Omnibus in this context?

In late 2025, the Commission floated a simplification package often described in policy circles as part of a broader "Digital Omnibus" effort. The core political intent: reduce implementation friction and sequence obligations more realistically for market participants.

In AI Act discussions, this has been interpreted as a possible shift of some high-risk compliance milestones from August 2026 toward later dates (commonly discussed: December 2027 or August 2028 windows).

Important distinction:
- Proposal language and political statements can shape expectations.
- Binding obligations only change after formal legislative adoption (Council + Parliament, then publication in the Official Journal where applicable).

So yes, the proposal is real and the policy debate is loud. No, it is not final legal relief.

Why confusion is so high

Three things are happening at once:

  1. Headline simplification politics makes it sound like a broad freeze is imminent.
  2. Different AI Act obligations apply on different dates, so people mix active and upcoming duties.
  3. Vendors, consultants, and commentators use shorthand like "delayed" even when they mean "proposed delay."

This leads to dangerous behavior: companies pausing all compliance work because they assume nothing applies yet.

What is already active regardless of delay talks

Even if future milestones move, key obligations are already live (or treated as active by the Commission's implementation materials). SMEs should not ignore these.

1) Prohibited AI practices (Article 5)

The AI Act's prohibitions under Article 5 have applied since 2 February 2025. If a use case falls into a banned category, this is not a "wait for 2026" issue.

For SMEs, this means every new AI use case should pass a no-go screening before rollout.

2) AI literacy obligations (Article 4)

AI literacy is not a "nice to have." Article 4 has also applied since 2 February 2025: organizations providing or deploying AI are expected to ensure sufficient AI literacy among relevant staff.

If your teams are using copilots, chat assistants, AI scoring, or generative workflows, literacy should be operationalized now: role-based training, minimum usage rules, escalation paths, and documentation.

General-purpose AI and transparency obligations have their own timeline logic and implementation guidance. Even where exact obligations vary by role and use case, the direction is clear: governance and transparency cannot be deferred to the last minute.

Source anchors you can rely on

When evaluating claims, start from primary EU sources: the Official Journal text of Regulation (EU) 2024/1689 and the Commission's AI Act implementation pages. For Omnibus-specific claims, verify against the Commission's official press material and legislative procedure updates, not social media summaries.

What could happen next procedurally

If delay language advances, the legislative path still matters:

  1. Proposal drafting/refinement at Commission level
  2. Council and Parliament positions
  3. Trilogue negotiations (if required)
  4. Formal adoption
  5. Publication and entry into force of amending provisions

Until that chain completes, current obligations remain your legal ground truth.

"Should we pause compliance work until this is clear?"

In most SMEs: No.

Pausing now creates two risks:
- Regulatory risk: you miss already-active obligations.
- Execution risk: you compress 6-12 months of governance work into a panic sprint.

Even if high-risk deadlines move, the work you do now is not wasted. Inventory, ownership, policies, oversight, and documentation improve operations and sales readiness anyway.

What SMEs should do now (regardless of delay outcome)

Here is a practical action plan you can execute immediately.

Step 1: Build or refresh your AI inventory (this week)

Document every AI-supported workflow:
- tool/model/vendor,
- purpose,
- business owner,
- affected users,
- decision impact,
- current safeguards.

No inventory = no compliance strategy.
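The fields above fit in a spreadsheet, but even a tiny script keeps entries consistent. A minimal sketch in Python (the class name, field names, and the example entry are illustrative, not prescribed by the AI Act):

```python
from dataclasses import dataclass, field

@dataclass
class AIUseCase:
    """One row of the AI inventory; fields mirror the checklist above."""
    tool: str                 # tool/model/vendor
    purpose: str
    business_owner: str
    affected_users: str
    decision_impact: str      # e.g. "informational", "advisory", "determinative"
    safeguards: list[str] = field(default_factory=list)

inventory = [
    AIUseCase(
        tool="Vendor support chatbot (example)",
        purpose="Draft first-line customer support replies",
        business_owner="Head of Support",
        affected_users="Support agents, customers",
        decision_impact="advisory",
        safeguards=["human review before sending", "no personal data in prompts"],
    ),
]

# A use case without a named owner or any safeguard is an immediate gap.
gaps = [u.tool for u in inventory if not u.business_owner or not u.safeguards]
print(f"{len(inventory)} use case(s) recorded, {len(gaps)} with gaps")
```

The point is not the tooling; it is that every workflow gets an owner and a safeguard entry, and blank fields become visible.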

Step 2: Run prohibited-practice screening (Article 5 gate)

Before launch of any new use case, run a simple red-flag check:
- manipulative/exploitative behavior risk,
- sensitive biometric/emotion contexts,
- social-scoring-like outcomes,
- unjustified rights impact.

Escalate unclear cases before deployment.
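The gate can be operationalized so that an unanswered question is never treated as a pass. A sketch under assumed names (the flag keys paraphrase the red-flag list above and are not legal categories):

```python
# Hypothetical pre-launch gate mirroring the red-flag list above.
RED_FLAGS = {
    "manipulative_or_exploitative": "Manipulative/exploitative behavior risk",
    "sensitive_biometric_emotion": "Sensitive biometric/emotion contexts",
    "social_scoring_like": "Social-scoring-like outcomes",
    "unjustified_rights_impact": "Unjustified rights impact",
}

def screen_use_case(answers: dict[str, bool]) -> str:
    """Return a routing decision: 'block', 'escalate', or 'proceed'.

    `answers` maps each flag to True (risk present) or False (absent);
    any missing answer counts as unclear and forces escalation.
    """
    if any(answers.get(flag) for flag in RED_FLAGS):
        return "block"      # clear red flag: do not deploy
    if any(flag not in answers for flag in RED_FLAGS):
        return "escalate"   # unclear cases go to review before deployment
    return "proceed"

# Example: one question was never answered, so the case escalates.
print(screen_use_case({"manipulative_or_exploitative": False,
                       "sensitive_biometric_emotion": False,
                       "social_scoring_like": False}))  # → escalate
```

The design choice worth copying is the default: incomplete screening routes to escalation, never to "proceed".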

Step 3: Launch AI literacy baseline (Article 4)

Do not wait for perfect curriculum. Start with role-specific minimums:
- what AI tools are approved,
- where human review is mandatory,
- what data cannot be entered,
- how to escalate uncertain outputs,
- who is accountable.

Track completion and refresh cadence.

Step 4: Prioritize high-impact use cases for deeper controls

Focus first on workflows affecting:
- hiring and workforce decisions,
- eligibility/access outcomes,
- customer rights or safety,
- pricing or service denial decisions.

For these, add stronger oversight, logging, and documentation.

Step 5: Prepare evidence pack for customers and auditors

Keep one shared repository with:
- inventory,
- risk notes,
- policies,
- training records,
- incident log,
- control owners and review dates.

This helps with both compliance and procurement due diligence.
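A low-tech way to keep the repository honest is an automated completeness check. The folder and file names below are an assumed layout for illustration, not a mandated structure:

```python
from pathlib import Path

# Assumed layout of the shared evidence repository (illustrative names).
REQUIRED = [
    "inventory.csv",
    "risk-notes",
    "policies",
    "training-records",
    "incident-log.csv",
    "owners-and-reviews.csv",
]

def missing_artifacts(root: str) -> list[str]:
    """List required evidence-pack entries that do not exist under `root`."""
    base = Path(root)
    return [name for name in REQUIRED if not (base / name).exists()]

if __name__ == "__main__":
    gaps = missing_artifacts("evidence-pack")
    print("complete" if not gaps else f"missing: {', '.join(gaps)}")
```

Run it before customer due-diligence calls; an empty "missing" list is a quick proxy for audit readiness.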

If you need a fast baseline, start here:

  • AI risk self-check quiz: /quiz
  • Potential fine exposure estimator: /fine-calculator
  • Timeline overview: /timeline
  • Practical control templates: /compliance-checklists

Scenario planning: how to decide in uncertainty

Use a two-track plan:

  • Track A (legal certainty): continue planning against current, in-force text.
  • Track B (policy flexibility): maintain a "delay scenario" roadmap so you can resequence budgets if law changes.

This avoids both overreaction and paralysis.

Common mistakes to avoid right now

  1. Treating "proposal" as "law".
  2. Stopping all work because one deadline might move.
  3. Ignoring already-active obligations (Article 4 and Article 5 implications).
  4. Outsourcing judgment entirely to vendors.
  5. Failing to appoint internal owners.

Final verdict

So, has the EU AI Act deadline been postponed?

Not yet formally, but the Council has taken a concrete step. On March 13, 2026, the Council adopted its negotiating position proposing to delay high-risk deadlines to December 2027. This is a strong political signal, but it requires Parliament agreement and formal adoption before becoming law.

For SMEs, the winning strategy is clear:
- keep moving on foundational compliance,
- monitor official EU sources,
- treat delay headlines as scenarios, not excuses.

If the deadline moves, you gain more breathing room.
If it doesn't, you avoid last-minute chaos.
Either way, you are in a better position than teams waiting for certainty.



Related articles

Which AI Systems Are Banned Under the EU AI Act?

Article 5 prohibited AI practices explained: social scoring, manipulative systems, and limits on biometric surveillance.

Read article →

What the EU AI Act Means for Small Businesses

A plain-English breakdown of why SMEs should prepare early for the August 2026 deadline.

Read article →
