Article 10 — Data and Data Governance
1. High-risk AI systems which make use of techniques involving the training of AI models with data shall be developed on the basis of training, validation and testing data sets that meet the quality criteria referred to in paragraphs 2 to 5 whenever such data sets are used.
2. Training, validation and testing data sets shall be subject to data governance and management practices appropriate for the intended purpose of the high-risk AI system. Those practices shall concern in particular:
(a) the relevant design choices;
(b) data collection processes and the origin of data, and in the case of personal data, the original purpose of the data collection;
(c) relevant data-preparation processing operations, such as annotation, labelling, cleaning, updating, enrichment and aggregation;
(d) the formulation of assumptions, in particular with respect to the information that the data are supposed to measure and represent;
(e) an assessment of the availability, quantity and suitability of the data sets that are needed;
(f) examination in view of possible biases that are likely to affect the health and safety of persons, have a negative impact on fundamental rights or lead to discrimination prohibited under Union law, especially where data outputs influence inputs for future operations;
(g) appropriate measures to detect, prevent and mitigate possible biases identified according to point (f);
(h) the identification of relevant data gaps or shortcomings that prevent compliance with this Regulation, and how those gaps and shortcomings can be addressed.
3. Training, validation and testing data sets shall be relevant, sufficiently representative, and to the best extent possible, free of errors and complete in view of the intended purpose. They shall have the appropriate statistical properties, including, where applicable, as regards the persons or groups of persons in relation to whom the high-risk AI system is intended to be used. Those characteristics of the data sets may be met at the level of individual data sets or at the level of a combination thereof.
4. Data sets shall take into account, to the extent required by the intended purpose, the characteristics or elements that are particular to the specific geographical, contextual, behavioural or functional setting within which the high-risk AI system is intended to be used.
EN OJ L, 12.7.2024
5. To the extent that it is strictly necessary for the purpose of ensuring bias detection and correction in relation to the high-risk AI systems in accordance with paragraph 2, points (f) and (g) of this Article, the providers of such systems may exceptionally process special categories of personal data, subject to appropriate safeguards for the fundamental rights and freedoms of natural persons. In addition to the provisions set out in Regulations (EU) 2016/679 and (EU) 2018/1725 and Directive (EU) 2016/680, all the following conditions must be met in order for such processing to occur:
(a) the bias detection and correction cannot be effectively fulfilled by processing other data, including synthetic or anonymised data;
(b) the special categories of personal data are subject to technical limitations on the re-use of the personal data, and state-of-the-art security and privacy-preserving measures, including pseudonymisation;
(c) the special categories of personal data are subject to measures to ensure that the personal data processed are secured, protected, subject to suitable safeguards, including strict controls and documentation of the access, to avoid misuse and ensure that only authorised persons have access to those personal data with appropriate confidentiality obligations;
(d) the special categories of personal data are not to be transmitted, transferred or otherwise accessed by other parties;
(e) the special categories of personal data are deleted once the bias has been corrected or the personal data has reached the end of its retention period, whichever comes first;
(f) the records of processing activities pursuant to Regulations (EU) 2016/679 and (EU) 2018/1725 and Directive (EU) 2016/680 include the reasons why the processing of special categories of personal data was strictly necessary to detect and correct biases, and why that objective could not be achieved by processing other data.
6. For the development of high-risk AI systems not using techniques involving the training of AI models, paragraphs 2 to 5 apply only to the testing data sets.