Real-world evidence under MDR is clinical information collected from a CE marked device in routine use — through PMCF studies, patient and device registries, user and clinician surveys, complaint data, and structured real-world data pipelines — and analysed to update the clinical evaluation report. The obligation to keep the CER current with post-market clinical data sits in Article 61(11) of Regulation (EU) 2017/745, and the methods for generating that data sit in Annex XIV Part B. Real-world evidence is not a bolt-on. It is the primary way a clinical evaluation stays alive after CE marking.
By Tibor Zechmeister and Felix Lenhard. Last updated 10 April 2026.
TL;DR
- Real-world evidence (RWE) under MDR is post-market clinical data from routine use of a CE marked device, analysed to confirm safety and performance and to update the clinical evaluation.
- The legal anchor is Article 61(11), which requires the clinical evaluation to be updated throughout the life cycle of the device with clinical data obtained from the PMCF plan referred to in Annex XIV Part B and the PMS plan referred to in Article 84.
- The main RWE sources named or implied by Annex XIV Part B are PMCF studies, patient and device registries, user and clinician surveys, structured real-world data from routine clinical use, and screening of scientific literature on the device and on equivalent or similar devices.
- Complaint and vigilance data are not, by themselves, real-world evidence — but clinically relevant complaints must be routed into the PMCF evaluation so they inform the CER and not only the CAPA system.
- MDCG 2025-10 is the current operational guidance on how PMS and PMCF feed the clinical evaluation update cycle, and MDCG 2020-5 remains the reference on equivalence questions that arise when RWE from similar devices is used.
Why real-world evidence matters — the gap pre-market data cannot close
A startup finishes its clinical evaluation, assembles the technical file, clears the notified body review, and ships. The CER at that moment is a snapshot built from pre-market data: the clinical investigations that were performed, the published literature on the device and on equivalent devices, and whatever bench and usability data was relevant. That snapshot is complete for market entry. It is also already out of date the day the device is used on a real patient outside a study protocol.
That is the gap real-world evidence closes. Pre-market data predicts how the device will perform. Real-world data confirms — or refutes — that prediction across the patient populations, clinical settings, and use conditions that actual market use exposes. A device that worked beautifully in twelve trial sites might behave differently in two hundred community hospitals. A connected device that was benchmarked on six-hour usage sessions might reveal new patterns under twenty-hour continuous wear. The only way to know is to look, and looking — structured, proactive, documented — is what Annex XIV Part B calls PMCF and what the clinical evidence community calls real-world evidence.
For startups the temptation is to treat RWE as something bigger companies do when they have budget for it. That reading misses the point. Article 61(11) makes the update of the clinical evaluation with post-market data a legal obligation, not a nice-to-have. The only question is which combination of RWE methods is appropriate for the device, the risk class, and the resources of the manufacturer.
What the MDR actually says about post-market clinical data
Article 61(11) of the MDR ties pre-market clinical evaluation to the post-market lifecycle. The clinical evaluation and its documentation shall be updated throughout the life cycle of the device concerned with clinical data obtained from the implementation of the manufacturer's PMCF plan in accordance with Annex XIV Part B and the post-market surveillance plan referred to in Article 84. For Class III devices and implantable devices, the PMCF evaluation report and, if indicated, the summary of safety and clinical performance referred to in Article 32 shall be updated at least annually with such data. (Regulation (EU) 2017/745, Article 61, paragraph 11.)
Annex XIV Part B defines PMCF as a continuous, proactive process of collecting and evaluating clinical data from the use of a CE marked device on humans within its intended purpose, with the aim of confirming safety and performance throughout the expected lifetime, ensuring the continued acceptability of identified risks, and detecting emerging risks on the basis of factual evidence. (Regulation (EU) 2017/745, Annex XIV Part B.)
Annex III specifies the technical documentation on post-market surveillance, the system established under Article 83 that sits above PMCF. The PMS plan under Article 84 is the document that names every source of post-market data the manufacturer intends to use. The PSUR under Article 86 is where the clinical findings of PMCF are summarised for Class IIa, IIb, and III devices.
Read together, these provisions say the same thing in different words: the clinical evaluation is a living document, and it is kept alive by clinical data gathered from real-world use. That data is what the RWE label points to.
The main sources of real-world evidence under MDR
Annex XIV Part B names the categories of PMCF methods. In RWE terms, they map to the following sources.
PMCF studies. Post-market clinical studies conducted on the CE marked device under EN ISO 14155:2020 + A11:2024 for the aspects that apply. PMCF studies are the strongest form of RWE because they are structured, pre-specified, and controlled in methodology. They are also the most expensive. For novel Class III and implantable devices they are often unavoidable; for lower-class devices they are a tool to reach for when a specific clinical question cannot be answered by other methods.
Patient and device registries. Registries are databases that collect defined data fields on devices, procedures, and patients across many sites over long periods. They are one of the most cost-effective RWE sources, because participation in an existing registry is usually much cheaper than running a study, and the data quality is structured enough to support real clinical conclusions. For cardiac implants, orthopaedic implants, and several other device categories, registries are the pragmatic default. The PMCF plan has to name the registry, the data fields used, and how the data will feed the CER update.
User and clinician surveys. Pre-specified surveys of healthcare professionals and, where applicable, patients, with questions mapped to the clinical claims in the CER. Surveys are feasible for small companies and produce qualitative clinical signals quickly. The weakness is response rate and self-report bias, which is why surveys are usually one RWE source among several, not the only one.
Structured real-world data from routine clinical use. For connected devices, software as a medical device, and any device with telemetry or digital output, routine usage data is already being generated. Turning it into RWE is a question of designing the data pipeline, the clinical endpoints, and the analysis plan before CE marking so the data can be used under GDPR and data minimisation constraints. Real-world data from this source is often the highest-volume RWE stream a startup has access to.
Literature surveillance on the device and on equivalent or similar devices. Annex XIV Part B explicitly names screening of scientific literature as a general PMCF method, and it explicitly requires evaluation of clinical data relating to equivalent or similar devices. Literature surveillance is the cheapest RWE source, the easiest to run on a defined cadence, and the element startups most often under-execute. For equivalence-related RWE questions, MDCG 2020-5 is the reference on how equivalence claims are handled under the MDR.
Complaint data routed into the clinical channel. Complaint handling sits in the QMS and the PMS system, not in Annex XIV Part B directly. But complaints that carry clinical information — adverse events, performance issues in clinical use, off-label use patterns — are RWE signals, and MDCG 2025-10 is explicit that the PMS system must route clinically relevant data into the clinical evaluation update, not only into CAPA. A startup that logs a "skin irritation" complaint only in CAPA and never lets it reach the PMCF evaluation has broken the RWE loop.
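The routing rule in that last point can be sketched in code. A minimal illustration, assuming a hypothetical keyword-based triage; the keyword list, record fields, and channel labels are invented for the example and are not MDR-defined terms:

```python
from dataclasses import dataclass, field

# Hypothetical triage terms a manufacturer might define in its PMS plan.
CLINICAL_KEYWORDS = {"adverse event", "skin irritation", "false positive",
                     "off-label", "performance", "injury"}

@dataclass
class Complaint:
    complaint_id: str
    description: str
    channels: set = field(default_factory=set)

def route_complaint(complaint: Complaint) -> Complaint:
    """Every complaint enters CAPA; clinically relevant complaints are
    additionally routed into the PMCF evaluation channel."""
    complaint.channels.add("CAPA")
    text = complaint.description.lower()
    if any(kw in text for kw in CLINICAL_KEYWORDS):
        complaint.channels.add("PMCF_EVALUATION")
    return complaint

c = route_complaint(Complaint("C-042", "Patient reported skin irritation under the sensor"))
print(sorted(c.channels))  # → ['CAPA', 'PMCF_EVALUATION']
```

The point of the sketch is the dual routing: the clinical complaint still reaches CAPA, but it no longer ends there.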
Worked example — how PMCF data closes a CER gap
A hypothetical example, built from patterns that recur across startup projects. A Class IIa wearable cardiac rhythm monitor receives CE marking with a CER based on two pre-market clinical studies totalling 340 patients in controlled settings, plus literature on equivalent devices. The CER concludes that clinical performance is acceptable and that the main residual risk is false positive alerts at a rate consistent with published rates for the device class. The PMCF plan commits to four RWE sources: quarterly literature surveillance, a partnership with an existing arrhythmia registry, a structured semi-annual clinician survey at the first twenty deploying centres, and analysis of device telemetry on false positive rates against the pre-market benchmark.
Twelve months in, the PMCF evaluation report aggregates the data. Literature surveillance surfaces two new publications on equivalent devices, one of which reports a previously uncharacterised interaction between the sensor and a specific class of implantable pacemakers. The registry data covers 1,800 patients across eleven centres and shows performance consistent with the CER. The clinician survey flags two centres where the false positive rate is well above the pre-market benchmark, correlating with a specific patient sub-population. The telemetry data confirms the elevated rate in that sub-population.
That is a CER gap surfaced by RWE. The clinical evaluation did not predict the sub-population effect because the pre-market studies did not include enough of those patients. The PMCF evaluation report concludes that the risk file needs updating under EN ISO 14971:2019 + A11:2021 to characterise the new sub-population risk, the IFU needs updating to flag the interaction with the pacemaker class, the CER needs updating to reflect the refined false positive profile, and a targeted PMCF study is proposed to quantify the effect more precisely. None of those actions would have happened without the RWE sources the plan committed to. All of them trace back to the Annex XIV Part B objectives: confirming safety and performance, detecting emerging risks, monitoring identified risks, keeping the benefit-risk ratio current.
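The telemetry check in this example can be sketched as a simple benchmark comparison. A hypothetical illustration; the benchmark rate, signal margin, group names, and alert counts are all invented:

```python
# Compare the observed false positive rate (FPR) per patient sub-population
# against the pre-market benchmark and flag groups that exceed a signal margin.
BENCHMARK_FPR = 0.04   # assumed pre-market false positive rate
SIGNAL_MARGIN = 1.5    # flag groups at > 1.5x the benchmark (assumption)

telemetry = {
    # sub-population: (false_positive_alerts, total_alerts)
    "general": (310, 8000),
    "paced_patients": (95, 1100),  # the hypothetical sub-population
}

def flag_subpopulations(data, benchmark=BENCHMARK_FPR, margin=SIGNAL_MARGIN):
    """Return sub-populations whose observed FPR exceeds benchmark * margin."""
    flagged = {}
    for group, (fp, total) in data.items():
        fpr = fp / total
        if fpr > benchmark * margin:
            flagged[group] = round(fpr, 4)
    return flagged

print(flag_subpopulations(telemetry))  # → {'paced_patients': 0.0864}
```

In the scenario above, only the paced sub-population crosses the margin, which is exactly the signal the PMCF evaluation report then has to act on.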
That is what "real-world evidence strengthens clinical evaluation" means in practice. Not a slide. A chain of documented actions from data to CER update to risk file to IFU.
Ship — a lean RWE collection playbook
A resource-constrained startup can build a real RWE programme that satisfies Annex XIV Part B. The playbook is short.
Design the RWE sources before CE marking, inside the PMCF plan, as a subsection of the PMS plan required by Article 84. Name each source, the data fields, the cadence, and the Annex XIV Part B objective each one addresses. If a source does not address at least one Annex XIV Part B objective, cut it.
For every clinical claim in the CER, identify at least one RWE source that will produce data against that claim during the device lifetime. Claims without a matching RWE source are a gap in the plan.
Commit to structured literature surveillance on the device and on equivalent or similar devices, with documented search terms, databases, cadence, and reviewer. This is the baseline RWE source for almost every device and the cheapest to run.
Identify whether a relevant registry exists. If yes, engage with it and name the participation in the PMCF plan. If no, document the search that led to that conclusion and justify the alternative RWE sources.
Build one structured feedback channel — clinician survey, patient survey, or structured user interview — with pre-specified questions that map to the clinical claims and the risk file. One channel that actually runs beats three channels that never ship.
For connected devices and SaMD, design the telemetry and real-world data pipeline at the same time as the clinical endpoints. Retrofitting is expensive or impossible under GDPR. Design before CE marking or do not commit to this RWE source in the plan.
Route clinically relevant complaints from the PMS intake into the PMCF evaluation, not only into CAPA. Document the routing rule in the PMS plan. This is where most of the free signal lives, and most startups let it leak into CAPA and never reach the CER.
Produce the PMCF evaluation report on a defined cadence that matches the CER update cycle — at least annually for Class III and implantables per Article 61(11), justified and proportionate for lower classes. The report must name actions: CER updated, risk file updated, IFU updated, or a documented decision that no update is needed with the reasoning.
That programme is not elaborate. It is Annex XIV Part B, executed as the minimum real version. It traces line by line. It survives notified body surveillance. For deeper walk-throughs see the full PMCF guide under MDR and PMCF surveys and registries for startups.
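The claim-to-source mapping step in the playbook can be sketched as a traceability check. A minimal illustration with invented claim texts and source names:

```python
# Every clinical claim in the CER should map to at least one RWE source
# that will produce evidence against it during the device lifetime.
cer_claims = {
    "CL-1": "Detects arrhythmia episodes with the claimed sensitivity",
    "CL-2": "False positive alert rate consistent with the device class",
    "CL-3": "Comfortable for continuous 24-hour wear",
}

rwe_coverage = {
    "CL-1": ["registry", "literature_surveillance"],
    "CL-2": ["telemetry", "clinician_survey"],
    # CL-3 has no source yet -> a gap in the PMCF plan
}

def uncovered_claims(claims, coverage):
    """Return claim IDs with no RWE source mapped against them."""
    return sorted(cid for cid in claims if not coverage.get(cid))

print(uncovered_claims(cer_claims, rwe_coverage))  # → ['CL-3']
```

Running a check like this at every PMCF plan revision turns "claims without a matching RWE source are a gap" from a principle into a reviewable output.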
Reality Check — where do you stand on RWE?
- For each clinical claim in your CER, can you name at least one post-market data source that will produce evidence against that claim?
- Does your PMCF plan explicitly describe how complaint data is routed into the clinical evaluation channel and not only into CAPA? See also PMS feedback and risk management integration.
- Have you identified whether a relevant patient or device registry exists, and if so, are you using it?
- If you claim equivalence to another device as part of your RWE strategy, have you checked the claim against MDCG 2020-5?
- For each RWE source in your plan, can you trace it to a specific Annex XIV Part B objective?
- Does your PMCF evaluation report cadence match the CER update cadence required by Article 61(11) for your device class?
- For software or connected devices, was the real-world data pipeline designed before CE marking or after?
- When the PMCF evaluation surfaces a signal, is there a defined rule that produces a documented CER update decision within a reasonable timeframe?
Frequently Asked Questions
Is "real-world evidence" a term defined in the MDR?
No. The MDR does not use the phrase "real-world evidence" as a defined term. What the MDR requires is continuous, proactive collection and evaluation of clinical data from the use of the CE marked device under Annex XIV Part B, and the update of the clinical evaluation with that data under Article 61(11). "Real-world evidence" is the common industry label for that clinical data set.
Can complaint data be used as real-world evidence for the CER?
Clinically relevant complaint data must be routed into the PMCF evaluation and, through it, into the CER update. Raw complaint data on its own is not RWE — it becomes RWE when it is clinically analysed, aggregated against the clinical claims, and evaluated under the PMCF methodology. MDCG 2025-10 is explicit that the PMS system must feed clinically relevant signals into the clinical evaluation.
Are patient registries a sufficient RWE source for a Class III device?
For some Class III devices, participation in a well-designed registry is a significant component of the RWE programme, but it is rarely sufficient on its own. Novel Class III and implantable devices usually require a PMCF study in addition to registry participation, and always require the at-least-annual update of the PMCF evaluation report under Article 61(11), reflecting all post-market clinical data.
How is RWE different from a PMCF study?
A PMCF study is one specific type of RWE source — a structured clinical study on the CE marked device, conducted under EN ISO 14155:2020 + A11:2024. RWE is the broader term that also includes registry data, survey data, structured real-world data from routine use, literature surveillance, and equivalent-device monitoring. A PMCF programme almost always combines several RWE sources, not only a study.
Do startups need to build RWE infrastructure before CE marking?
Yes, at least at the plan and pipeline level. The PMCF plan must be written before CE marking and must name the RWE sources, cadence, and methods. For connected devices, the data pipeline must be designed before CE marking because retrofitting is expensive or legally constrained. The data collection itself begins when the device is placed on the market, but the design work is pre-market.
Related reading
- Post-market clinical follow-up (PMCF) under MDR: a complete guide for startups — the pillar on the PMCF framework that RWE feeds into.
- PMCF surveys and registries for startups — deeper walk-through of two of the main RWE sources.
- PMS feedback loops and risk management integration — how complaint and feedback data is routed into the clinical and risk channels.
- What is post-market surveillance under MDR — the broader PMS system that contains the PMCF and RWE activities.
- What is clinical evaluation under MDR — the pillar on Article 61 and Annex XIV Part A that defines the CER that RWE updates.
Sources
- Regulation (EU) 2017/745 of the European Parliament and of the Council of 5 April 2017 on medical devices, Article 61 (clinical evaluation), Article 61(11) (update of clinical evaluation with PMCF data), Article 83 (post-market surveillance system), Article 84 (PMS plan), Article 86 (Periodic Safety Update Report), Annex III (PMS documentation), Annex XIV Part A (clinical evaluation), and Annex XIV Part B (post-market clinical follow-up). Official Journal L 117, 5.5.2017.
- MDCG 2025-10 — Guidance on post-market surveillance of medical devices and in vitro diagnostic medical devices. Medical Device Coordination Group, December 2025.
- MDCG 2020-5 — Clinical Evaluation: Equivalence. A guide for manufacturers and notified bodies. Medical Device Coordination Group, April 2020.
- EN ISO 14155:2020 + A11:2024 — Clinical investigation of medical devices for human subjects — Good clinical practice.
- EN ISO 14971:2019 + A11:2021 — Medical devices — Application of risk management to medical devices.
This post is part of the Clinical Evaluation & Investigations series in the Subtract to Ship: MDR blog. Authored by Felix Lenhard and Tibor Zechmeister. Real-world evidence is not a slide in a pitch deck — it is the clinical data that keeps a CE marked device safe, and the loop that keeps the clinical evaluation honest after the notified body has moved on to the next file.