DAkkS and UKAS publish joint technical bulletin on the use of artificial intelligence (AI)
The German Accreditation Body (DAkkS) and the United Kingdom Accreditation Service (UKAS) have jointly published a Technical Bulletin on the use of Artificial Intelligence (AI) in accredited conformity assessment. It provides informative guidance on the notification requirements that apply when AI is introduced and makes recommendations to accredited conformity assessment bodies (CABs) on key aspects of working with AI systems.
Existing accreditation requirements remain applicable
The Technical Bulletin does not introduce any new accreditation requirements. Rather, it supports CABs in identifying and contextualising existing, technology-agnostic requirements in the context of AI systems. The relevant standard(s) used for accreditation remain the authoritative document(s).
Notification obligation when introducing AI
The adoption of AI in conformity assessment procedures is considered a significant change and, therefore, must be communicated to the respective national accreditation body in a timely manner. This allows potential effects on the accreditation process to be considered in assessment activities.
Three key aspects for the responsible use of AI
The bulletin recommends that CABs consider the following three dimensions to ensure the standard-compliant use of AI systems in conformity assessment:
- Type of AI system (application areas, training data/knowledge, implementation approach)
- Use of the AI system within the conformity assessment process (administrative tasks or integration into the core conformity assessment process)
- Degree of reliance on the AI system (administrative, advisory or decision-support functionality; any attempt to delegate decisions to AI)
Important: Final decisions on conformity may not be delegated to AI systems within the existing accreditation framework.
AI in line with the fundamental principles of accreditation
With the Joint Technical Bulletin, DAkkS and UKAS are sending a clear signal in favour of the responsible and standard-compliant use of AI in conformity assessment. The publication shows how innovation and digitalisation can be reconciled with the established fundamental principles of accreditation – independence, competence, and reliability. This strengthens transparency and trust in the quality infrastructure, even as the use of AI increases.
