MDR classification tells you what kind of medical device you have. AI Act classification tells you what kind of AI system you have. For medical AI, both apply simultaneously. The AI Act treats medical device AI that requires a notified body under the MDR as high-risk, which means you inherit a second set of obligations layered on top of MDR — not instead of it.

By Tibor Zechmeister and Felix Lenhard.

TL;DR

  • MDR and the AI Act (Regulation (EU) 2024/1689) are separate regulations that both apply to medical-device AI.
  • MDR classifies devices by risk to patient health using Annex VIII rules; Rule 11 is the key rule for software.
  • The AI Act classifies AI systems by a different logic — prohibited, high-risk, limited-risk, minimal-risk.
  • Medical device software that requires a notified body conformity assessment under MDR is treated as high-risk under the AI Act.
  • This produces a dual-obligation regime: MDR technical documentation plus AI Act requirements, assessed together where possible.
  • The MDR classification is the gating question. Get it right first, then layer the AI Act obligations on top.

Why this matters

Founders building medical AI in Europe in 2026 face two regulations pointing at the same product. The MDR has been the single source of truth for medical devices since 2021. The AI Act adds a second framework that applies to AI systems generally, including those inside medical devices.

The result is not two separate product lines and not one merged regime. It is a dual classification exercise — you answer the MDR question, you answer the AI Act question, and you handle the overlap.

Most of the confusion in the market right now comes from founders trying to treat the AI Act as either a replacement for MDR (it is not) or as a separate, parallel track with no connection to MDR (it is not that either). The right mental model is layered: MDR first, AI Act bolted on top where it applies.

What MDR actually says

MDR classification — Annex VIII Rule 11. Software intended to provide information which is used to take decisions with diagnostic or therapeutic purposes is Class IIa; if those decisions may cause death or irreversible deterioration of health, Class III; serious deterioration or surgical intervention, Class IIb. Software intended to monitor physiological processes is Class IIa (Class IIb for vital physiological parameters where variations could result in immediate danger). All other software is Class I.
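The branching logic of Rule 11 can be sketched as a small function. This is an illustration of the rule's structure, not a classification tool: the harm categories are simplified labels for the Annex VIII wording, and borderline cases need MDCG 2019-11 and regulatory judgment.

```python
def classify_rule_11(provides_decision_info: bool,
                     worst_harm: str = "other",
                     monitors_physiology: bool = False,
                     vital_params_immediate_danger: bool = False) -> str:
    """Simplified sketch of MDR Annex VIII Rule 11 for software.

    worst_harm is the worst plausible consequence of a decision taken on
    the software's output: "death_or_irreversible", "serious_or_surgical",
    or "other".
    """
    if provides_decision_info:
        if worst_harm == "death_or_irreversible":
            return "III"
        if worst_harm == "serious_or_surgical":
            return "IIb"
        return "IIa"  # default for decision-support software
    if monitors_physiology:
        # Vital parameters whose variation could cause immediate danger: IIb
        return "IIb" if vital_params_immediate_danger else "IIa"
    return "I"  # all other software
```

For the monitoring branch, `classify_rule_11(False, monitors_physiology=True)` returns `"IIa"`; add `vital_params_immediate_danger=True` and it returns `"IIb"`.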

Article 52 — conformity assessment routes. Class IIa, IIb and III devices require notified body involvement in conformity assessment. Class I devices (other than sterile, measuring, or reusable surgical instruments) do not — they are self-declared.

This matters for the AI Act because the AI Act uses MDR notified-body involvement as one of its triggers for high-risk classification.

What the AI Act says

Regulation (EU) 2024/1689 — the AI Act. The Act creates a risk-based framework for AI systems placed on the EU market. The four buckets are, broadly: prohibited practices, high-risk AI systems, AI systems with transparency obligations, and minimal-risk systems that are largely unregulated.

For medical devices, the relevant bucket is high-risk. Under Article 6(1), the AI Act treats AI systems that are safety components of, or are themselves, products covered by the Union harmonisation legislation listed in Annex I of the Act (which includes the MDR) as high-risk where those products are required to undergo a third-party conformity assessment under that legislation.

The practical reading of this rule: if your medical AI is Class IIa, IIb, or III under MDR — that is, if a notified body is involved — you are a high-risk AI system under the AI Act. If your medical AI is genuinely Class I self-declared under MDR, the AI Act high-risk trigger through this route does not fire, though other AI Act obligations (such as transparency requirements) may still apply.
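That gating logic is mechanical enough to sketch in code. A hypothetical helper, assuming the only high-risk trigger in play is the Annex I route via MDR notified-body involvement; note that the special Class I sub-types (sterile, measuring, reusable surgical) involve a notified body for limited aspects only, and whether that fires the AI Act trigger is a judgment call for your regulatory advisor.

```python
def mdr_requires_notified_body(mdr_class: str, special_class_i: bool = False) -> bool:
    """Article 52 sketch: Class IIa, IIb and III need a notified body.
    special_class_i covers the Is/Im/Ir sub-types (limited NB involvement)."""
    return mdr_class in {"IIa", "IIb", "III"} or special_class_i

def ai_act_high_risk_via_mdr(mdr_class: str, special_class_i: bool = False) -> bool:
    # AI Act Article 6(1) route: high-risk if the product falls under Annex I
    # legislation (here, the MDR) and requires third-party conformity assessment.
    # Other AI Act triggers and transparency obligations are out of scope here.
    return mdr_requires_notified_body(mdr_class, special_class_i)
```

So `ai_act_high_risk_via_mdr("IIa")` is true, while a genuine self-declared Class I device does not fire this particular trigger.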

Dual conformity assessment. The AI Act is designed to ride on top of existing sectoral conformity assessment rather than create a parallel process. For medical AI, the intent is that notified bodies assessing MDR conformity also assess relevant AI Act requirements, and that AI Act documentation is integrated with MDR technical documentation.

A worked example

A startup is building a Class IIa decision-support tool that uses machine learning to flag patients at risk of sepsis in the ICU. The output is used by the clinical team to prioritise assessment.

MDR path. Rule 11: software providing information used to take decisions with diagnostic or therapeutic purposes is Class IIa by default. On the stated intended purpose, the output prioritises clinical assessment rather than directing treatment, so the exceptions for decisions that may cause death or irreversible deterioration (Class III) or serious deterioration or surgical intervention (Class IIb) are argued not to apply: Class IIa. Conformity assessment route under Article 52: Annex IX QMS assessment with technical documentation review, notified body involved.

AI Act path. The product is an AI system (machine learning, autonomous inference, output informs clinical decisions). It is a safety component of, or is itself, a product covered by MDR that requires notified body assessment. Under the AI Act, this makes it high-risk.

What this means in practice. The founder does not run two separate projects. They run one project with two overlays:

  • The MDR technical documentation (Annex II) is built as usual, and includes all the AI-specific content that makes sense — training data description, validation data, performance metrics, risk management for model-specific hazards.
  • The AI Act obligations — data governance, transparency to users, human oversight, accuracy and robustness, logging, post-market monitoring — are mapped into the same technical documentation wherever they align with existing MDR requirements.
  • The notified body, during the MDR conformity assessment, also assesses the AI Act items for which it is authorised.

What the founder does not do is build two technical files. What the founder does not do is hope the AI Act will go away. What the founder does not do is assume that passing the MDR assessment automatically covers every AI Act requirement — some obligations (e.g. specific transparency to end users, specific logging obligations) are AI-Act-native and have no direct MDR counterpart.

The Subtract to Ship playbook

Step 1 — Do the MDR classification first. The MDR question is the gate. Until you know whether you are Class I, IIa, IIb, or III, you cannot answer the AI Act question cleanly. Use MDCG 2019-11 Rev.1 as the current authoritative reading for Rule 11.

Step 2 — Do the AI Act classification second. Check whether you meet the high-risk trigger via MDR notified-body involvement. If yes, you are high-risk. If no (genuinely Class I), check whether any other AI Act trigger applies — and check the transparency-only obligations that apply regardless.

Step 3 — Write a dual-classification memo. One page. MDR class, MDR rule, MDR conformity assessment route. AI Act category, AI Act trigger, applicable AI Act obligations. This document goes in the QMS and gets reviewed every time you touch the intended purpose.
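One way to keep that memo honest is to give it a fixed shape. A sketch as a Python dataclass; the field names are illustrative, not a prescribed format.

```python
from dataclasses import dataclass, field

@dataclass
class DualClassificationMemo:
    # MDR side
    mdr_class: str                  # "I", "IIa", "IIb", or "III"
    mdr_rule: str                   # e.g. "Annex VIII Rule 11"
    mdr_conformity_route: str       # e.g. "Article 52, Annex IX Chapters I and III"
    # AI Act side
    ai_act_category: str            # e.g. "high-risk"
    ai_act_trigger: str             # e.g. "Article 6(1) via MDR notified-body assessment"
    ai_act_obligations: list[str] = field(default_factory=list)
```

Re-reviewing the memo whenever you touch the intended purpose then becomes a matter of diffing two instances of this structure.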

Step 4 — Map AI Act obligations into existing MDR structure where possible. Risk management (EN ISO 14971) already covers most of what the AI Act calls "accuracy and robustness" if you do it honestly. Post-market surveillance under MDR covers most of what the AI Act calls "post-market monitoring of AI systems." Don't duplicate — map.
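Step 4 amounts to building a mapping table and reading off its gaps, which then feed Step 5. A sketch with illustrative entries only; the real table depends on your QMS and technical file structure.

```python
# AI Act obligation -> where it already lives in the MDR file (None = gap).
# Entries are illustrative; build the real table from your own documentation.
AI_ACT_TO_MDR = {
    "risk management / accuracy and robustness": "EN ISO 14971 risk management file",
    "data governance": "training and validation data description (Annex II)",
    "post-market monitoring": "MDR post-market surveillance plan",
    "human oversight": "usability engineering file and instructions for use",
    "logging": None,                # AI-Act-native, no direct MDR counterpart
    "transparency to users": None,  # AI-Act-native
}

# The None entries are the delta checklist: handle them explicitly,
# without spinning up a second QMS.
delta_checklist = sorted(k for k, v in AI_ACT_TO_MDR.items() if v is None)
```

Everything with an MDR home gets mapped, not duplicated; only the `None` rows need net-new documentation.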

Step 5 — Identify the AI-Act-native gaps. There are obligations in the AI Act that do not have a direct MDR counterpart. Treat these as a delta checklist, not a second QMS.

Step 6 — Ask the notified body early. Your notified body's readiness to assess AI Act requirements alongside MDR varies. Ask in pre-submission. A 15-minute conversation now prevents a six-month misalignment later.

The Subtract to Ship instinct is right here: resist the temptation to build a separate AI-Act operation. Bolt onto what you already have.

Reality Check

  1. Is the MDR classification of your device written down and traceable to Annex VIII Rule 11?
  2. Does a notified body need to be involved under MDR? (If yes, you are almost certainly high-risk under the AI Act.)
  3. Do you have a one-page dual-classification memo that states both MDR class and AI Act category with reasons?
  4. Have you mapped AI Act obligations onto existing MDR documentation rather than building a parallel file?
  5. Have you identified the AI-Act-native obligations that have no MDR counterpart?
  6. Has your notified body confirmed what AI Act items they will and will not assess?
  7. Is your technical documentation structured so that a reviewer can find the AI Act content without hunting?

Frequently Asked Questions

If my device is Class I under MDR, does the AI Act still apply? Some AI Act obligations apply regardless of MDR class — transparency requirements in particular. The high-risk trigger via MDR third-party conformity assessment does not fire for genuine Class I, but do not assume the AI Act is silent on your product. Check with your regulatory advisor.

Do I need a separate AI Act technical file? Generally no. The expectation is that AI Act documentation is integrated with MDR technical documentation wherever the two overlap, with a clearly identified AI Act section for obligations without an MDR counterpart.

Does my notified body assess AI Act conformity? Often yes, if they are authorised to do so. Ask specifically. Notified body readiness and scope vary, and you cannot assume full coverage.

Is the AI Act already fully in force for medical devices? Not in full. The Act entered into force on 1 August 2024 with staged applicability: prohibitions and general provisions applied first, and the high-risk obligations for AI systems in products covered by Annex I legislation, including MDR devices, apply from 2 August 2027. Check the current transition dates rather than assuming a single deadline.

What happens if MDR classifies me as Class I but the AI Act triggers separately? You do your MDR self-declaration and you handle the applicable AI Act obligations separately. The two regulations are independent legal instruments; one does not excuse the other.

Will MDCG or the Commission publish joint MDR–AI Act guidance? Guidance is expected. As of publication, the working assumption is that MDCG 2019-11 Rev.1 remains the Rule 11 reference and that AI Act implementing acts will clarify the integration over time.

Sources

  1. Regulation (EU) 2017/745 on medical devices, consolidated text. Annex VIII Rule 11, Article 52, Annex IX.
  2. Regulation (EU) 2024/1689 — the AI Act.
  3. MDCG 2019-11 Rev.1 — Guidance on Qualification and Classification of Software in Regulation (EU) 2017/745 – MDR. June 2025.
  4. EN 62304:2006+A1:2015 — Medical device software lifecycle processes.
  5. EN ISO 14971:2019+A11:2021 — Medical devices — Application of risk management.