AI in Medical Devices Under MDR: The Regulatory Landscape in 2026
AI in medical devices sits at the intersection of MDR and the EU AI Act. Here is the 2026 landscape and how startups should think about the overlap.
46 in-depth guides in this cluster
How MDR Rule 11, intended purpose, and significant-change rules apply to machine learning medical devices and adaptive algorithms.
The EU AI Act layers on top of the MDR for AI medical devices. Here is how the two regulations interact — and what startups need to know about the overlap.
How AI Act risk classification and MDR device classification interact for medical AI, where they overlap, and what dual classification means.
Extra obligations for high-risk AI in MedTech under the EU AI Act and how to fold them into an existing MDR technical file.
AI and ML medical software is classified under the same Rule 11 as any other SaMD. Here is how the rule applies to AI specifically.
Clinical decision support software is sometimes a medical device under MDR and sometimes not. Here is the qualification line and what it depends on.
Locked AI algorithms behave the same after release. Adaptive algorithms change over time. Under MDR, the regulatory implications are very different. Here is the full picture.
Continuously learning AI sits at the edge of what MDR can currently govern. Here is where the challenge is unsolved in 2026 and how startups should think about it.
MDCG guidance for AI/ML medical devices is mostly carried by MDCG 2019-11 Rev.1 on software qualification and classification. Here is what it says.
IMDRF AI/ML SaMD documents are informational, not EU law. Here is exactly how to map them into your MDR file without overreach.
Training data for AI medical devices must be representative, documented, and controlled. Here is what MDR and MDCG guidance expect — and what auditors check.
Data quality and bias are core regulatory concerns for AI medical devices. Here is what MDR auditors expect for dataset characterisation and bias testing.
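As a quick, regulator-agnostic illustration of what a subgroup performance comparison can look like (the function name, toy labels, and subgroup codes below are invented for this sketch, not taken from MDCG guidance):

```python
import numpy as np

def subgroup_accuracy(y_true, y_pred, groups):
    """Per-subgroup accuracy plus the worst-case performance gap.

    A large gap between the best- and worst-performing subgroups is a
    prompt to revisit dataset representativeness, not proof of bias.
    """
    y_true, y_pred, groups = map(np.asarray, (y_true, y_pred, groups))
    per_group = {}
    for g in np.unique(groups):
        mask = groups == g
        per_group[str(g)] = float(np.mean(y_true[mask] == y_pred[mask]))
    gap = max(per_group.values()) - min(per_group.values())
    return per_group, gap

# Toy labels: perfect performance on subgroup "F", one miss on "M".
y_true = [1, 0, 1, 1, 0, 0, 1, 0]
y_pred = [1, 0, 1, 1, 0, 1, 1, 0]
groups = ["F", "F", "F", "F", "M", "M", "M", "M"]
per_group, gap = subgroup_accuracy(y_true, y_pred, groups)
print(per_group, gap)  # {'F': 1.0, 'M': 0.75} 0.25
```

In a real technical file the stratification variables, acceptance thresholds, and sample-size justification would come from the clinical evaluation plan, not from code.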
What MDR's algorithmic transparency obligations actually require: intended purpose, residual risks, performance data, and what you may protect.
Explainability for AI medical devices under MDR: there is no specific clause. Here is what notified bodies actually ask for and how to satisfy them honestly.
AI/ML medical devices need clinical evidence plus performance validation, bias testing, and drift monitoring. Here is how to build the clinical evaluation for an AI device.
How to validate AI medical device performance under MDR: the metrics, dataset hygiene, and study designs that hold up in a notified body review.
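To make the metric side concrete, here is a minimal, hedged sketch of sensitivity and specificity with a percentile-bootstrap confidence interval. The function names and synthetic data are illustrative plumbing only, not a prescribed validation method:

```python
import numpy as np

def sens_spec(y_true, y_pred):
    """Sensitivity and specificity from binary labels and predictions."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    tp = np.sum((y_true == 1) & (y_pred == 1))
    fn = np.sum((y_true == 1) & (y_pred == 0))
    tn = np.sum((y_true == 0) & (y_pred == 0))
    fp = np.sum((y_true == 0) & (y_pred == 1))
    return tp / (tp + fn), tn / (tn + fp)

def bootstrap_sensitivity_ci(y_true, y_pred, n_boot=2000, seed=0):
    """95% percentile-bootstrap confidence interval for sensitivity."""
    rng = np.random.default_rng(seed)
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    n = len(y_true)
    sens = []
    for _ in range(n_boot):
        idx = rng.integers(0, n, n)  # resample cases with replacement
        s, _ = sens_spec(y_true[idx], y_pred[idx])
        sens.append(s)
    return np.percentile(sens, [2.5, 97.5])

# Synthetic test set: roughly 85% of predictions agree with the label.
rng = np.random.default_rng(1)
y_true = rng.integers(0, 2, 200)
y_pred = np.where(rng.random(200) < 0.85, y_true, 1 - y_true)
point, _ = sens_spec(y_true, y_pred)
lo, hi = bootstrap_sensitivity_ci(y_true, y_pred)
print(f"sensitivity {point:.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
```

What holds up in review is not the arithmetic but the surrounding hygiene: a pre-specified protocol, a test set independent of training data, and a reference standard that is defensible clinically.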
AI medical devices need PMS that monitors algorithm performance, not just complaints. Here is how to build a PMS system with drift detection and performance monitoring.
Concept and data drift can degrade AI medical device performance silently. Here are regulatory-aligned drift detection strategies under MDR.
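As one concrete illustration of input-drift monitoring (the function, thresholds, and synthetic feature names here are invented for this sketch), a per-feature two-sample Kolmogorov-Smirnov check of production data against the training-time reference distribution:

```python
import numpy as np
from scipy import stats

def detect_feature_drift(reference, production, alpha=0.05):
    """Per-feature two-sample KS test against reference data.

    Returns the features whose production-time distribution differs
    significantly from the training-time reference distribution.
    """
    drifted = {}
    for name, ref_values in reference.items():
        stat, p = stats.ks_2samp(ref_values, production[name])
        if p < alpha:
            drifted[name] = {"ks_stat": float(stat), "p_value": float(p)}
    return drifted

# Synthetic monitoring window: mean patient age has shifted by 6 years.
rng = np.random.default_rng(0)
ref = {"age": rng.normal(60, 10, 1000), "bmi": rng.normal(27, 4, 1000)}
prod = {"age": rng.normal(66, 10, 1000), "bmi": rng.normal(27, 4, 1000)}
flags = detect_feature_drift(ref, prod)
print(sorted(flags))  # "age" should be flagged
```

A distribution test like this only detects input (covariate) drift; concept drift, where the input-output relationship changes, additionally needs outcome data or proxy performance signals in the PMS plan.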
How to fold AI-specific failure modes — drift, bias, adversarial input, data shift — into an EN ISO 14971 risk management file under MDR.
When retraining an AI/ML medical device counts as a significant change, it can trigger a new conformity assessment. Here is how to decide.
The Predetermined Change Control Plan (PCCP) is FDA terminology. Here is how to build an equivalent change control plan for AI medical devices under the MDR significant change framework.
When a generative AI or LLM tool becomes a medical device under MDR, why hallucination is a safety hazard, and how intended purpose decides everything.
Computer-Aided Detection (CADe) software flags potential findings for clinician review. Here is how MDR classifies CADe and what it requires.
Computer-Aided Diagnosis (CADx) software under MDR sits at Class IIb or higher under Rule 11. Why the evidence bar and notified body scrutiny are stricter than for CADe.
AI radiology software is one of the most common SaMD categories. Here is how MDR classifies it and what evidence the notified body expects.
Digital pathology AI sits in a regulatory grey zone between MDR and IVDR. Classification, scanner validation, and evidence requirements for pathology startups.
When does a predictive analytics model become a regulated medical device under MDR? The intended-purpose threshold and Rule 11 cascade explained.
When does NLP become a medical device under MDR? Clinical scribes, decision support, diagnostic NLP, and Rule 11 classification explained.
AI remote patient monitoring under MDR: Rule 10 vs Rule 11 classification, alarm reliability, data drift, and lifecycle evidence explained.
How federated learning changes AI medical device development under MDR Annex VIII Rule 11, with GDPR data minimisation and EN 62304 evidence expectations.
When synthetic data is acceptable for AI medical device development under MDR Article 61 and Annex XIV, and when it is not a substitute for clinical evidence.
The complete regulatory playbook for AI medical device startups in 2026: MDR plus the AI Act overlay, classification, evidence, lifecycle, and PMS.
AI is reshaping the grunt work of MedTech regulatory operations. Here is what tools like Flinn.ai actually do — and the complacency risk to watch for.
AI can automate parts of regulatory documentation work. Here is what it does well, what it does poorly, and where the human stays in the loop.
How AI tools accelerate MDR clinical evaluation literature reviews under Article 61 and Annex XIV, and where human judgement stays mandatory.
AI can categorise complaints and surface PMS signals at scale. Here is how to use it under MDR without breaking the human-in-the-loop.
How lean MedTech startups use AI in regulatory affairs to match big MedTech throughput under MDR Article 10 without shortcutting compliance.
An auditor's view of seven recurring MDR compliance mistakes AI MedTech startups make, from Rule 11 classification to retraining as significant change.
The complete compliance checklist for AI/ML medical device startups in 2026 — spanning MDR, the AI Act layer, data governance, and post-market monitoring.
How formative usability evaluation helps MedTech startups find use errors early under EN 62366-1:2015+A1:2020 clauses 5.7 and 5.8.
How MDR summative usability evaluation works under EN 62366-1:2015+A1:2020 clause 5.9, with intended users, realistic environments, and recorded evidence.
A practical guide to usability test plans for medical devices under EN 62366-1:2015+A1:2020: protocol, recruitment, environment, data capture, analysis.
The 2026 startup guide to cybersecurity for medical devices under MDR, covering Annex I Section 17, MDCG 2019-16 Rev.1, EN IEC 81001-5-1:2022, and EN ISO 14971.
MDR Annex I Section 17 cybersecurity deep dive, covering §17.2 software lifecycle and §17.4 IT security as General Safety and Performance Requirements.
MDCG 2019-16 cybersecurity guidance walkthrough, mapping the December 2019 document (Rev.1 July 2020) to EN IEC 81001-5-1:2022 for startup notified body files.