Risk level: high | Title III — High-Risk AI Systems — Conformity Assessment

Article 46 — Derogation from conformity assessment procedure


1. By way of derogation from Article 43 and upon a duly justified request, any market surveillance authority may authorise the placing on the market or the putting into service of specific high-risk AI systems within the territory of the Member State concerned, for exceptional reasons of public security or the protection of life and health of persons, environmental protection or the protection of key industrial and infrastructural assets. That authorisation shall be for a limited period while the necessary conformity assessment procedures are being carried out, taking into account the exceptional reasons justifying the derogation. The completion of those procedures shall be undertaken without undue delay.

2. In a duly justified situation of urgency for exceptional reasons of public security or in the case of specific, substantial and imminent threat to the life or physical safety of natural persons, law enforcement authorities or civil protection authorities may put a specific high-risk AI system into service without the authorisation referred to in paragraph 1, provided that such authorisation is requested during or after the use without undue delay. If the authorisation referred to in paragraph 1 is refused, the use of the high-risk AI system shall be stopped with immediate effect and all the results and outputs of such use shall be immediately discarded.

3. The authorisation referred to in paragraph 1 shall be issued only if the market surveillance authority concludes that the high-risk AI system complies with the requirements of Section 2. The market surveillance authority shall inform the Commission and the other Member States of any authorisation issued pursuant to paragraphs 1 and 2. This obligation shall not cover sensitive operational data in relation to the activities of law enforcement authorities.

4. Where, within 15 calendar days of receipt of the information referred to in paragraph 3, no objection has been raised by either a Member State or the Commission in respect of an authorisation issued by a market surveillance authority of a Member State in accordance with paragraph 1, that authorisation shall be deemed justified.

5. Where, within 15 calendar days of receipt of the notification referred to in paragraph 3, objections are raised by a Member State against an authorisation issued by a market surveillance authority of another Member State, or where the Commission considers the authorisation to be contrary to Union law, or the conclusion of the Member States regarding the compliance of the system as referred to in paragraph 3 to be unfounded, the Commission shall, without delay, enter into consultations with the relevant Member State. The operators concerned shall be consulted and have the possibility to present their views. Having regard thereto, the Commission shall decide whether the authorisation is justified. The Commission shall address its decision to the Member State concerned and to the relevant operators.

6. Where the Commission considers the authorisation unjustified, it shall be withdrawn by the market surveillance authority of the Member State concerned.

7. For high-risk AI systems related to products covered by Union harmonisation legislation listed in Section A of Annex I, only the derogations from the conformity assessment established in that Union harmonisation legislation shall apply.
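The notification and objection procedure in paragraphs 3 to 6 follows a simple timeline: the authorising authority notifies the Commission and the other Member States; if no objection arrives within 15 calendar days, the authorisation is deemed justified; if an objection is raised in time, the Commission enters consultations and decides. As a purely illustrative aid (this sketch, including the class and method names, is not part of the Regulation and is our own hypothetical model), the window logic can be expressed as:

```python
from dataclasses import dataclass, field
from datetime import date, timedelta
from enum import Enum, auto

OBJECTION_WINDOW_DAYS = 15  # Art. 46(4)-(5): 15 calendar days from notification


class Status(Enum):
    PENDING = auto()             # window still open, no objection yet
    DEEMED_JUSTIFIED = auto()    # Art. 46(4): window closed without objection
    UNDER_CONSULTATION = auto()  # Art. 46(5): objection raised in time


@dataclass
class DerogationAuthorisation:
    """Hypothetical model of one Art. 46(1) authorisation after notification."""
    notified_on: date                                # Art. 46(3) notification date
    status: Status = Status.PENDING
    objections: list[str] = field(default_factory=list)

    def objection_deadline(self) -> date:
        return self.notified_on + timedelta(days=OBJECTION_WINDOW_DAYS)

    def raise_objection(self, member_state: str, on: date) -> None:
        # Only objections within the 15-day window trigger consultations.
        if self.status is Status.PENDING and on <= self.objection_deadline():
            self.objections.append(member_state)
            self.status = Status.UNDER_CONSULTATION

    def evaluate(self, today: date) -> Status:
        # Art. 46(4): no objection once the window has passed -> deemed justified.
        if self.status is Status.PENDING and today > self.objection_deadline():
            self.status = Status.DEEMED_JUSTIFIED
        return self.status


# Example: notified on 1 March; the window runs through 16 March.
auth = DerogationAuthorisation(notified_on=date(2025, 3, 1))
print(auth.evaluate(date(2025, 3, 10)))  # inside the window: still pending
print(auth.evaluate(date(2025, 3, 17)))  # after the window: deemed justified
```

Note that the model stops where the Regulation hands over to the Commission: an objection only opens the consultation phase of paragraph 5, whose outcome (and a possible withdrawal under paragraph 6) is a discretionary decision, not a mechanical rule.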


