---
title: "MDCG Guidance on AI/ML Medical Devices: Key Takeaways for Startups"
description: MDCG guidance for AI/ML medical devices is mostly carried by MDCG 2019-11 Rev.1 on software qualification and classification. Here is what it says.
authors: Tibor Zechmeister, Felix Lenhard
category: AI, ML & Algorithmic Devices
primary_keyword: MDCG guidance AI ML medical devices
canonical_url: https://zechmeister-solutions.com/en/blog/mdcg-guidance-ai-ml-medical-devices
source: zechmeister-solutions.com
license: All rights reserved. Content may be cited with attribution and a link to the canonical URL.
---

# MDCG Guidance on AI/ML Medical Devices: Key Takeaways for Startups

*By Tibor Zechmeister (EU MDR Expert, Notified Body Lead Auditor) and Felix Lenhard.*

> **There is no dedicated MDCG guidance document specifically about AI or machine learning in medical devices. The closest and most directly applicable MDCG guidance is MDCG 2019-11 Rev.1 (June 2025), the Guidance on Qualification and Classification of Software in MDR and IVDR. It does not treat AI as a separate category — it treats AI-driven software the same way it treats any other medical device software, by intended purpose under MDR Article 2(1) and by classification under Annex VIII Rule 11. For founders building AI/ML medical devices in 2026, this is the primary MDCG document to read, and the gaps where MDCG has not yet spoken are as important to understand as the text it contains.**

**Last updated 10 April 2026.**

---

## TL;DR

- There is no dedicated MDCG guidance on AI or machine learning medical devices as of April 2026. The AI-relevant MDCG guidance is carried almost entirely by MDCG 2019-11 Rev.1 on software qualification and classification.
- MDCG 2019-11 Rev.1 (June 2025) applies the same qualification test and the same Rule 11 classification logic to AI software that it applies to any other software. The intended purpose decides qualification; the severity of the decision decides classification.
- The guidance does not set AI-specific requirements for training data, model validation, bias testing, or post-market drift detection. Those obligations exist in practice, but they come from MDR Annex I, EN ISO 14971:2019+A11:2021, and EN 62304:2006+A1:2015 — not from a dedicated MDCG AI document.
- The operational interface between MDR conformity assessment and the EU AI Act for medical devices is still being worked out between the Commission, the Medical Device Coordination Group, and Notified Bodies. A dedicated MDCG guidance on AI could come later; none is in force today.
- For a founder in 2026, the practical move is to read MDCG 2019-11 Rev.1 carefully, apply it to the AI product in writing, and fill the AI-specific gaps with risk management and lifecycle work grounded in EN ISO 14971 and EN 62304 rather than waiting for an AI-specific MDCG document.

---

## What founders are actually asking when they ask for "MDCG guidance on AI"

Almost every time a founder asks where the MDCG guidance on AI medical devices lives, they expect an answer in the form of a document name and a date. They have found MDCG 2019-11 on software, MDCG 2019-16 on cybersecurity, MDCG 2021-24 on classification, and they assume there must be a parallel MDCG document that does for AI what those documents do for their respective topics.

In April 2026, that document does not exist. The Medical Device Coordination Group has not published a dedicated guidance on AI or machine learning in medical devices. The AI-relevant material sits inside the software guidance, because the MDR and the MDCG treat AI software as a subset of medical device software rather than as a new category.

This is not an oversight. It is a deliberate reading of the regulation. MDR (EU) 2017/745 does not distinguish between deterministic algorithms and learned models. Article 2(1) defines a medical device by intended purpose. Annex VIII Rule 11 classifies software by what decision the software informs. Annex I Section 17 sets the requirements for electronic programmable systems and software. None of these provisions mention AI, and the MDCG's operational reading in MDCG 2019-11 Rev.1 follows the same approach: if the software qualifies as a medical device and falls under Rule 11, the rules apply the same whether the software is a rule-based engine or a neural network.

A founder who understands that is already ahead of the founder who is still waiting for an MDCG AI document to drop before they start their qualification work.

## The state of MDCG guidance for AI in 2026

Here is the honest state of play at the time of writing.

The primary applicable MDCG document is MDCG 2019-11 Rev.1, published June 2025. It is the revision of the original October 2019 guidance on qualification and classification of software under MDR and IVDR. The revision reflects several years of accumulated Notified Body and Competent Authority experience and is the current version. Anything you downloaded before June 2025 is superseded and should be replaced.

MDCG 2019-11 Rev.1 is software guidance, not AI guidance. It applies the qualification decision tree and the Rule 11 classification logic to software regardless of the underlying implementation technology. AI and machine learning are not treated as a separate track. A diagnostic AI model and a diagnostic rule-based decision tool go through the same four qualification questions and land in the same Rule 11 class for the same intended use.

Other MDCG documents touch AI only indirectly. MDCG 2021-24 on classification walks through classification rules generally, and its Rule 11 section reads consistently with MDCG 2019-11 Rev.1. MDCG 2019-16 Rev.1 on cybersecurity applies to AI systems the same way it applies to any other software — the attack surface of a model is different from a classical system, but the lifecycle obligations the guidance describes are the same. MDCG 2025-10 on post-market surveillance describes the PMS framework that every device operates under, and for AI devices it is the operational starting point for drift detection work even though the document itself does not use the word drift.

Beyond that, MDCG has not yet published a dedicated AI document. Whether one will appear — and when — is a question for the Commission and the MDCG working groups. For founders in 2026, planning around a document that may or may not arrive is not a strategy. Planning around the guidance that exists is.

## What MDCG 2019-11 Rev.1 says about software with adaptive features

MDCG 2019-11 Rev.1 is explicit about one thing that matters for AI: changes to software after it has been placed on the market. The guidance covers the handling of modules and of changes to MDSW (medical device software, the guidance's own term), and its treatment of changes is the closest the document comes to addressing adaptive algorithms directly.

The framework MDCG 2019-11 Rev.1 describes is consistent with the MDR's overall posture. A device has a defined configuration at the point of placing on the market. Changes to that configuration are handled through the manufacturer's change management process, and significant changes trigger regulatory consequences — notification, re-assessment, or in extreme cases a new conformity assessment. The guidance walks through how to decide which category a change falls into and how to document the change.

For a locked AI algorithm that only updates through discrete, controlled releases, this framework is a clean fit. Each release is a defined change, risk-assessed and documented, and the Notified Body relationship operates the same way it would for any other software release. Most AI medical devices on the market in the EU in 2026 operate under this model.

For a continuously learning algorithm that updates its weights automatically between audits, the MDCG 2019-11 Rev.1 framework does not offer a tailor-made pathway. The guidance is written around the assumption that a change is a discrete event that can be identified, assessed, and logged. A model that retrains silently in the field does not produce discrete events in the same sense. Founders building adaptive systems typically address this gap by pre-defining a change control envelope in the technical documentation — specifying in advance the scope, direction, and boundaries of permitted updates — so that the Notified Body can assess the envelope as a whole rather than assess each micro-update individually. This approach is not spelled out in MDCG 2019-11 Rev.1, but it is compatible with the guidance's general change-handling logic, and it is the practical path that has emerged in the field.
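
As an illustration only, here is a minimal sketch of how such an envelope might be captured in machine-readable form alongside the technical documentation. The field names, bounds, and structure below are our assumptions for the example, not terminology from MDCG 2019-11 Rev.1 or the MDR; the real envelope is whatever the manufacturer defines and justifies in the technical file.

```python
# Illustrative sketch of a pre-defined change control envelope for an
# adaptive model. All identifiers, bounds, and conditions are hypothetical
# examples; the real envelope comes from the manufacturer's risk analysis.
CHANGE_CONTROL_ENVELOPE = {
    "model_id": "example-classifier",           # hypothetical identifier
    "permitted_changes": {
        "retraining_data_sources": ["sites already covered by the clinical evaluation"],
        "architecture_changes": False,           # architecture stays fixed inside the envelope
        "intended_purpose_changes": False,       # any change of intended purpose exits the envelope
    },
    "performance_bounds_on_locked_test_set": {
        "sensitivity_min": 0.92,                 # example floor agreed in advance
        "specificity_min": 0.88,
    },
    "update_cadence": "quarterly, released as a documented version",
    "exit_conditions": [
        "performance outside the bounds above",
        "new input data type",
        "change to the clinical claim",
    ],
}

def update_within_envelope(sensitivity: float, specificity: float) -> bool:
    """Check whether a candidate model update stays inside the pre-agreed bounds."""
    bounds = CHANGE_CONTROL_ENVELOPE["performance_bounds_on_locked_test_set"]
    return sensitivity >= bounds["sensitivity_min"] and specificity >= bounds["specificity_min"]
```

The value of writing the envelope down this explicitly is not the artefact itself; it is that the boundaries are fixed before the first conformity assessment, so every later update can be checked against the same pre-agreed limits.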

The guidance's module sections also matter for AI products. A platform that contains one AI-driven medical module inside a larger non-medical product can often scope the regulatory work to the module rather than to the whole platform, provided the module boundary, intended purpose, and interfaces are defined clearly. This is the same scoping move that MDCG 2019-11 Rev.1 recommends for any modular software, and it works the same way when the module happens to be AI.

## The gaps where MDCG has not yet spoken

Reading MDCG 2019-11 Rev.1 against an AI product reveals the gaps quickly. There are four.

**Training data governance.** The guidance does not specify requirements for how training datasets are assembled, documented, versioned, or evaluated for representativeness. An AI product lives and dies on its training data, and the absence of MDCG direction on this topic does not mean the obligation is absent — it means the obligation flows from MDR Annex I (state of the art, risk management, validation) and from the EU AI Act layer, not from a dedicated MDCG document.

**Bias and subgroup performance.** MDCG 2019-11 Rev.1 does not address how to demonstrate that a model performs consistently across patient subgroups, nor how to document the analysis. The obligation exists through the GSPR and through the clinical evaluation requirements of MDR Article 61, but the operational expectations come from Notified Body practice and from the wider standards landscape, not from MDCG text.

**Drift detection in post-market surveillance.** MDCG 2025-10 describes the general PMS framework. It does not prescribe drift monitoring for AI systems specifically. The operational content — what to monitor, at what cadence, with what thresholds — is not in any MDCG document today and has to be derived from the manufacturer's own risk analysis and the general PMS obligations of MDR Articles 83-86; a minimal sketch of what that derivation can produce appears at the end of this section.

**The MDR and EU AI Act interface for AI medical devices.** The AI Act adds obligations on top of MDR for high-risk AI systems, which most AI medical devices are, because they are themselves products, or safety components of products, covered by MDR and subject to Notified Body conformity assessment. How a Notified Body conducting MDR conformity assessment integrates AI Act obligations into its assessment is a question that is still being clarified in practice. There is no MDCG document in April 2026 that resolves this interface in detail, and any source that claims otherwise is overstating what has actually been published.

These gaps do not mean the requirements are optional. They mean that the work has to be grounded in the MDR text, EN ISO 14971:2019+A11:2021, and EN 62304:2006+A1:2015, and that a founder should expect their Notified Body to form an opinion on each area even though MDCG has not yet issued one.
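
To make the drift gap concrete: the check below is a minimal sketch of the kind of monitoring a manufacturer might derive from its own risk analysis. The metric (a rolling sensitivity estimate against a pre-registered floor), the window size, and the thresholds are assumptions chosen for the example; no MDCG document prescribes this particular form.

```python
# Illustrative drift check: compare rolling field performance against a
# pre-registered floor from the PMS plan. Window size, floor, and minimum
# sample count are hypothetical examples, not regulatory values.
from collections import deque

SENSITIVITY_FLOOR = 0.92     # example: floor pre-registered in the PMS plan
WINDOW = 200                 # example: number of recent confirmed cases to evaluate

recent_outcomes = deque(maxlen=WINDOW)  # (model_flagged_positive, confirmed_positive)

def record_case(model_flagged_positive: bool, confirmed_positive: bool) -> None:
    """Log one confirmed field case into the rolling window."""
    recent_outcomes.append((model_flagged_positive, confirmed_positive))

def drift_alert() -> bool:
    """Raise an alert when rolling sensitivity drops below the pre-registered floor."""
    positives = [flagged for flagged, confirmed in recent_outcomes if confirmed]
    if len(positives) < 30:  # example: too few confirmed positives to judge
        return False
    sensitivity = sum(positives) / len(positives)
    return sensitivity < SENSITIVITY_FLOOR
```

What matters regulatorily is not the code but that the metric, the threshold, and the response to an alert are pre-defined in the PMS plan rather than improvised after the fact.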

## What to do in the meantime

Waiting for a dedicated MDCG AI guidance document is not a plan. Here is the plan for 2026.

Read MDCG 2019-11 Rev.1 in full, once, with the AI product in mind. Apply the four-step qualification decision tree to the AI function in writing. Apply Rule 11 to the classification in writing, using the severity of the clinical decision the information is used to make as the driver. This gives you the defensible qualification and classification foundation that every downstream document depends on.
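
For orientation only, the sketch below paraphrases the Rule 11 severity ladder as it applies to decision-informing software. It is a simplification of the rule text, not a classification tool; the written rationale in your technical documentation is what carries the classification, and borderline cases need the full rule and MDCG 2019-11 Rev.1 read together.

```python
# Simplified paraphrase of the Annex VIII Rule 11 severity ladder for
# software that provides information used to take diagnostic or therapeutic
# decisions. The monitoring limb and the residual "all other software is
# Class I" limb of Rule 11 are omitted here; read the rule text directly.
def rule_11_class_for_decision_software(worst_case_consequence: str) -> str:
    """worst_case_consequence: what the decision informed by the software could lead to."""
    if worst_case_consequence == "death or irreversible deterioration of health":
        return "Class III"
    if worst_case_consequence == "serious deterioration of health or surgical intervention":
        return "Class IIb"
    # Default for decision-informing software under the first limb of Rule 11
    return "Class IIa"
```

The string matching above is deliberately crude; the real work is the clinical reasoning about the worst plausible consequence of the informed decision, documented in prose.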

Ground the AI-specific work — training data governance, bias analysis, drift monitoring — in EN ISO 14971:2019+A11:2021. The risk management file is the place where AI-specific hazards are identified and controlled. Failure modes like bias, distribution shift, adversarial robustness, and explainability gaps are hazards in the ISO 14971 sense. Document them there, trace the controls, and the absence of an MDCG AI document stops being a problem — the risk file carries the argument.

Ground the software lifecycle work in EN 62304:2006+A1:2015. The standard pre-dates the modern AI era and does not contain an AI track, but its core discipline — requirements, architecture, integration, verification — still applies. Layer AI-specific practices (dataset versioning, test set isolation, model validation protocols) on top of the EN 62304 backbone. Competent Notified Bodies expect this layering in 2026 even though it is not spelled out in any MDCG document.
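
As one example of what layering on top of the EN 62304 backbone can mean in practice, the sketch below fingerprints a training dataset snapshot and checks that the locked test set stays disjoint from the training data. The file layout and identifier column are assumptions for the example; the underlying idea, a versioned dataset record and a verifiably isolated test set, is the kind of evidence this layering is meant to produce.

```python
# Illustrative dataset versioning and test set isolation checks.
# Paths and the identifier column name are hypothetical; adapt to your data layout.
import csv
import hashlib
from pathlib import Path

def dataset_fingerprint(data_dir: str) -> str:
    """Hash every file in the dataset snapshot to get a reproducible version id."""
    digest = hashlib.sha256()
    for path in sorted(Path(data_dir).rglob("*")):
        if path.is_file():
            digest.update(path.name.encode())
            digest.update(path.read_bytes())
    return digest.hexdigest()

def load_ids(manifest_csv: str, id_column: str = "patient_id") -> set:
    """Read the case identifiers listed in a dataset manifest."""
    with open(manifest_csv, newline="") as handle:
        return {row[id_column] for row in csv.DictReader(handle)}

def assert_test_set_isolated(train_manifest: str, test_manifest: str) -> None:
    """Fail loudly if any case appears in both the training and the locked test data."""
    overlap = load_ids(train_manifest) & load_ids(test_manifest)
    if overlap:
        raise ValueError(f"Test set contamination: {len(overlap)} shared case ids")
```

Recording the fingerprint of the exact snapshot each released model was trained on, and running the isolation check in the release pipeline, gives the verification evidence a traceable anchor without changing anything about the EN 62304 process itself.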

Track the EU AI Act obligations in parallel as a separate column in your requirements matrix. Do not merge them into your MDR file prematurely, because the operational interface is not yet settled and you may need to unpick them later. Read the AI Act text against your product once, summarise the obligations that apply, and revisit the mapping as the interface clarifies.

## Common founder confusions

**"There must be a MDCG AI document I haven't found yet."** In April 2026, there is not. MDCG 2019-11 Rev.1 is the closest applicable guidance, and it is software guidance that covers AI as a subset. If anyone points you to a document that claims to be the MDCG AI guidance, check the date and the publisher carefully before you rely on it.

**"MDCG 2019-11 is old and must be outdated for AI."** MDCG 2019-11 Rev.1 is dated June 2025. The Rev.1 is the current version. The original October 2019 text is superseded. Make sure you have the correct revision before you rely on any specific passage.

**"The AI Act replaces MDCG guidance for AI medical devices."** It does not. The AI Act is a separate Regulation that layers obligations on top of MDR. MDCG guidance still applies to the MDR layer, and the AI Act does not displace MDCG 2019-11 Rev.1 as the applicable software guidance.

**"If MDCG hasn't spoken on training data governance, I don't have to do it."** You do. The obligation flows from MDR Annex I, Article 61, and the risk management and software lifecycle standards. The absence of a dedicated MDCG document does not remove the obligation — it only means the operational framing is yours to build from the underlying sources.

**"Rule 11 doesn't really apply to AI because AI is different."** Rule 11 applies to AI software the same way it applies to any other software, because MDCG 2019-11 Rev.1 reads it that way and Notified Bodies assess in line with MDCG. Assuming AI is outside Rule 11 is a common and expensive misread.

## The Subtract to Ship angle

The absence of a dedicated MDCG AI document is, viewed through the [Subtract to Ship](/blog/subtract-to-ship-framework-mdr) lens, useful. It removes a document from the reading list that founders thought they had to track and replaces it with a clearer target: MDCG 2019-11 Rev.1 plus the AI-specific grounding in EN ISO 14971 and EN 62304. Less surface to learn, more certainty about what applies.

The Purpose Pass still asks whether the AI function has to be regulated at all. Many AI features that ship alongside a medical device are not themselves medical devices — an AI that drafts internal documents, an AI that generates marketing copy, an AI that categorises customer support tickets. The Purpose Pass cuts these out of the regulated perimeter. The Classification Pass applies Rule 11 via MDCG 2019-11 Rev.1 to the AI functions that remain. The Evidence Pass asks what minimum evidence the class actually requires and avoids the trap of running a full clinical investigation where a retrospective performance study on an independent dataset would suffice at the relevant class. The Operations Pass builds a QMS that includes AI-specific processes without bloating the wider QMS for a hardware device that does not have them.

Across all four passes, the missing MDCG AI guidance is not a blocker. The existing MDCG 2019-11 Rev.1 plus the underlying MDR articles and standards carry the weight.

## Reality Check — Where do you stand?

1. Do you have the June 2025 revision of MDCG 2019-11, not an older copy?
2. Have you walked your AI product through the MDCG 2019-11 Rev.1 qualification decision tree in writing?
3. Have you applied Annex VIII Rule 11 explicitly to your AI product, with the class and the reasoning documented?
4. If your AI module sits inside a larger non-medical platform, have you drawn the module boundary in the way MDCG 2019-11 Rev.1 describes?
5. Is your algorithm locked, or have you defined a change control envelope for updates in advance of the first conformity assessment?
6. Does your risk management file under EN ISO 14971:2019+A11:2021 include AI-specific failure modes (bias, drift, adversarial robustness, explainability gaps)?
7. Does your software lifecycle file under EN 62304:2006+A1:2015 include AI-specific practices layered on top of the standard process backbone?
8. Have you confirmed which AI-related claims in your regulatory plan rest on MDCG guidance and which rest on MDR articles, standards, or your own risk reasoning?
9. Are you relying on any MDCG document you have not verified against the Commission's published list of MDCG guidance — does it actually exist, and is your copy the current revision?

## Frequently Asked Questions

**Is there a dedicated MDCG guidance document on AI or machine learning medical devices?**
No, not as of April 2026. The Medical Device Coordination Group has not published a guidance document specifically about AI or machine learning in medical devices. The closest and most directly applicable MDCG guidance is MDCG 2019-11 Rev.1 (June 2025), which covers software qualification and classification generally and treats AI software as a subset of medical device software.

**Does MDCG 2019-11 Rev.1 distinguish between AI software and classical software?**
No. The document applies the same qualification decision tree and the same Annex VIII Rule 11 classification logic regardless of whether the software is a deterministic algorithm or a learned model. The intended purpose drives qualification and the severity of the clinical decision the information is used for drives classification.

**Where do the requirements for AI training data governance come from if not from an MDCG document?**
They come from MDR Annex I (state of the art, risk management, validation), from the clinical evaluation obligations of MDR Article 61, from EN ISO 14971:2019+A11:2021 risk management applied to AI-specific failure modes, and from the EU AI Act layer. MDCG has not published a dedicated document on training data governance for AI medical devices.

**Can I ship a continuously learning AI medical device under MDCG 2019-11 Rev.1?**
Not cleanly without additional structure. The guidance's change-handling framework assumes discrete, identifiable changes. Most AI medical devices in the EU today ship locked or with a predefined change control envelope documented in the technical file in advance of the first conformity assessment, so that the Notified Body assesses the envelope rather than individual silent updates.

**Is MDCG 2019-11 the same as the IMDRF SaMD framework?**
No. The IMDRF publishes international SaMD guidance, which MDCG 2019-11 Rev.1 references as background. MDCG 2019-11 Rev.1 is the MDCG's operational reading of MDR Article 2(1) and Annex VIII Rule 11, and it uses the term MDSW rather than SaMD. When you write for an EU Notified Body, use MDCG 2019-11 Rev.1 and MDSW terminology.

**Will there be a dedicated MDCG AI document in the future?**
Possibly. Whether and when such a document appears is a decision for the Commission and the MDCG working groups, and there is no published timeline in April 2026. Building a regulatory plan on the assumption that a document is coming is a poor bet. Build on MDCG 2019-11 Rev.1, the MDR text, and the applicable harmonised standards, and adjust if and when new guidance arrives.

## Related reading

- [MDR Classification Rule 11 for Software — The Short Version](/blog/mdr-classification-rule-11-software) — the entry-level walkthrough of the rule that classifies most AI medical devices.
- [Rule 11 Borderline Cases — Where Classification Gets Hard](/blog/rule-11-borderline-cases) — the edge cases where AI decision-support classification is hardest to pin down.
- [MDCG 2019-11: Guidance on Qualification and Classification of Software — Key Takeaways](/blog/mdcg-2019-11-software-guidance) — the parent post on the document this one relies on.
- [AI in Medical Devices Under MDR: The Regulatory Landscape in 2026](/blog/ai-medical-devices-mdr-regulatory-landscape) — the pillar post for this category.
- [Machine Learning Medical Devices Under MDR](/blog/machine-learning-medical-devices-mdr) — the companion post on ML model development under MDR discipline.
- [Classification of AI and ML Software Under Rule 11](/blog/classification-ai-ml-software-rule-11) — practical class boundaries for AI products.
- [Locked Versus Adaptive AI Algorithms Under MDR](/blog/locked-vs-adaptive-ai-algorithms-mdr) — the change-control side of the adaptive algorithm question.
- [Clinical Evaluation for AI and ML Medical Devices](/blog/clinical-evaluation-ai-ml-medical-devices) — the evidence expectations specific to AI.
- [Post-Market Surveillance for AI Medical Devices](/blog/post-market-surveillance-ai-devices) — drift detection and the operational PMS patterns.
- [The EU AI Act and MDR: How the Two Regulations Interact](/blog/eu-ai-act-and-mdr) — the AI Act layer that sits on top of MDCG 2019-11 Rev.1.
- [Training Data Governance for AI Medical Devices](/blog/training-data-governance-ai-medical-devices) — how to build a dataset governance file without a dedicated MDCG document to lean on.
- [The Subtract to Ship Framework for MDR Compliance](/blog/subtract-to-ship-framework-mdr) — the methodology behind the four passes referenced here.

## Sources

1. MDCG 2019-11 — Guidance on Qualification and Classification of Software in Regulation (EU) 2017/745 — MDR and Regulation (EU) 2017/746 — IVDR. First published October 2019; Revision 1, June 2025. Published by the Medical Device Coordination Group, European Commission.
2. Regulation (EU) 2017/745 of the European Parliament and of the Council of 5 April 2017 on medical devices, Article 2(1) (definition of medical device); Annex VIII, Rule 11 (software classification). Official Journal L 117, 5.5.2017.
3. EN 62304:2006 + A1:2015 — Medical device software — Software life-cycle processes (IEC 62304:2006 + IEC 62304:2006/A1:2015).
4. EN ISO 14971:2019 + A11:2021 — Medical devices — Application of risk management to medical devices.

---

*This post is part of the AI, Machine Learning and Algorithmic Devices series in the Subtract to Ship: MDR blog. Authored by Felix Lenhard and Tibor Zechmeister. The absence of a dedicated MDCG guidance on AI medical devices in 2026 is not a gap in the regulatory work — it is a reminder that the MDR text, MDCG 2019-11 Rev.1, and the applicable harmonised standards already carry the weight. If your AI product sits in the space where MDCG has not yet spoken and the general framing here does not resolve your specific case, that is exactly where a sparring partner who has walked other AI MedTech founders through the same decisions earns their keep.*

---

*This post is part of the [AI, ML & Algorithmic Devices](https://zechmeister-solutions.com/en/blog/category/ai-ml-devices) cluster in the [Subtract to Ship: MDR Blog](https://zechmeister-solutions.com/en/blog). For EU MDR certification consulting, see [zechmeister-solutions.com](https://zechmeister-solutions.com).*
