A medical device that is secure on the day it receives its CE certificate can become insecure six months later when an open-source library publishes a critical vulnerability. MDR Annex I §17.2 and §17.4 require cybersecurity to be maintained across the whole device lifetime, not only at release. EN IEC 81001-5-1:2022 is the standard that makes that obligation operational.

By Tibor Zechmeister and Felix Lenhard.

TL;DR

  • MDR Annex I §17.2 requires software to be developed and manufactured in accordance with the state of the art, including information security, for the whole lifetime of the device.
  • EN IEC 81001-5-1:2022 defines the health software security lifecycle activities that make the MDR obligation auditable: secure development, configuration management, vulnerability management, patch management, decommissioning.
  • The most underestimated aspect of MDR cybersecurity is not the initial threat model. It is the monitoring and patching of third-party components after the device has shipped.
  • A CVE published against a single SOUP component can convert a compliant device into a non-compliant device overnight if no lifecycle process exists to detect and react.
  • MDCG 2019-16 Rev.1 confirms that cybersecurity is part of post-market surveillance under MDR Article 83 and Annex III.
  • The Subtract to Ship minimum viable cybersecurity lifecycle is five artefacts and one weekly ritual. Not a department.

Why this matters

A founder in Graz once told Tibor that the penetration test report was already in the technical documentation, the threat model was signed off, and cybersecurity was "done". Six months after CE marking, a publicly exploited vulnerability landed in one of the Python libraries used in the device backend. Nobody at the company was watching the relevant CVE feeds. The window between public disclosure and the eventual patch was long enough that something could have happened to a patient. Nothing did. Pure luck.

Tibor has seen this pattern repeat at more than a dozen startups. The threat model gets built once, the penetration test gets run once, and the lifecycle side of cybersecurity gets dropped off the roadmap. The regulatory assumption in MDR is the exact opposite. Annex I §17.2 talks about the whole lifetime of the device. A snapshot in time is not lifecycle.

Cybersecurity is not a gate you pass. It is a loop you run.

What MDR actually says

MDR Annex I §17.2 requires that, for devices that incorporate software or for software that are devices in themselves, the software "shall be developed and manufactured in accordance with the state of the art taking into account the principles of development life cycle, risk management, including information security, verification and validation." (The mobile computing platform requirements sit in the adjacent §17.3.)

That one sentence is doing a lot of work. It folds information security into the development lifecycle, ties it back to risk management under EN ISO 14971:2019+A11:2021, and forces verification and validation. Then §17.4 adds the IT hardening expectations: manufacturers must set out minimum requirements concerning hardware, IT networks characteristics and IT security measures, including protection against unauthorised access, necessary to run the software as intended.

MDR Article 83 and Annex III push the loop further into the post-market phase. The post-market surveillance system "shall be suited to actively and systematically gathering, recording and analysing relevant data on the quality, performance and safety of a device throughout its entire lifetime". Cybersecurity data, including newly disclosed vulnerabilities affecting device components, falls inside that scope.

MDCG 2019-16 Rev.1 is the guidance document that makes the cybersecurity obligations concrete for manufacturers and notified bodies. It explicitly calls out that cybersecurity is a lifecycle property. It names EN IEC 81001-5-1:2022 as the operational framework. And it confirms that post-market vigilance, under MDR Article 87 and following, applies to cybersecurity incidents that could cause harm.

EN IEC 81001-5-1:2022 is the harmonised standard for security activities in the health software lifecycle. It maps onto the software lifecycle already defined in EN 62304:2006+A1:2015 and adds the security-specific activities: secure development planning, secure coding, security risk management integrated with ISO 14971, verification of security requirements, secure configuration management, vulnerability monitoring, patch management, and secure decommissioning. Each of those is a clause with deliverables.

A worked example

Consider a Class IIa diabetes decision-support app on a smartphone. The backend runs in the cloud on a managed Kubernetes cluster. The mobile app has three direct dependencies and roughly 180 transitive dependencies, a typical figure for a modern Swift plus JavaScript stack. The backend adds another 400 components.

At release, the manufacturer commissions a pentest, builds a threat model, documents minimum IT requirements for the customer hospital network, and signs off cybersecurity in the technical documentation. A notified body accepts the file. CE mark issued.

Three months later, a CVE with CVSS 9.8 is published against one of the JavaScript dependencies. The vulnerability allows remote code execution in a specific parsing path. The device uses that path to handle user-uploaded data. The device is now, as of the moment of disclosure, non-compliant with MDR Annex I §17.2, because "state of the art" has moved. The question is not whether it is non-compliant. The question is how long until the manufacturer notices, assesses, and patches.

Under an EN IEC 81001-5-1 lifecycle process, the chain of events is short. The SBOM is monitored against CVE feeds, the vulnerability is flagged within 24 hours, the security risk assessment is re-run through the EN ISO 14971:2019+A11:2021 risk file, the benefit-risk determination is recalculated, a patch is prepared, the change control under the QMS drives a software update, and the notified body is informed according to the change classification rules. The window of exposure is measured in days.
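The detection step in that chain is, at its core, a matching loop between the SBOM and published advisories. The sketch below is illustrative only: the component and CVE names are invented, and real tooling (OSV, Dependency-Track, commercial scanners) matches affected version ranges rather than exact versions.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Component:
    # One SBOM entry: package name plus the exact version shipped in the device.
    name: str
    version: str

@dataclass(frozen=True)
class Advisory:
    # One published vulnerability record, reduced to what the match needs.
    cve_id: str
    package: str
    affected_versions: frozenset  # simplified: exact versions, not version ranges

def flag_vulnerable(sbom, advisories):
    """Return every (component, advisory) pair where a shipped version is affected."""
    return [
        (comp, adv)
        for comp in sbom
        for adv in advisories
        if adv.package == comp.name and comp.version in adv.affected_versions
    ]

# Hypothetical SBOM and feed snapshot for the worked example above.
sbom = [Component("fastjson-parse", "2.1.0"), Component("requests", "2.31.0")]
advisories = [Advisory("CVE-2025-0001", "fastjson-parse", frozenset({"2.0.0", "2.1.0"}))]

for comp, adv in flag_vulnerable(sbom, advisories):
    print(f"{adv.cve_id}: {comp.name} {comp.version} is affected")
```

Everything after this loop, the risk file update, the patch, the notified body notification, is process, not code. The point of the sketch is that the trigger is mechanical and therefore automatable.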

Without an EN IEC 81001-5-1 lifecycle process, the chain of events is: someone on the development team reads a tech news article weeks later, panics, opens a ticket, and the notified body finds out at the next surveillance audit. Tibor has sat on the notified-body side of exactly this conversation. It does not go well.

The Subtract to Ship playbook

Felix has coached 44 startups through MDR-adjacent work, and the cybersecurity lifecycle is one of the places founders overbuild when they finally wake up to it. They hire a full security engineer, buy a commercial SIEM, and try to stand up a SOC. None of that is needed for a minimum viable lifecycle. Five artefacts and one ritual are enough to pass a notified body audit and, more importantly, to actually keep the device secure.

Artefact 1. The security plan. One document, five to fifteen pages, that names the roles, the activities from EN IEC 81001-5-1 that apply, the cadence, and the tools. This is the plan clause of the standard. Do not skip it. Auditors read this first.

Artefact 2. The security risk assessment. Integrated into the existing EN ISO 14971:2019+A11:2021 risk file. Not a separate document. The threats, assets, vulnerabilities, and controls live inside the same hazard analysis the rest of the device uses. Felix has written a dedicated walkthrough of this integration if the reader wants the mechanics.

Artefact 3. The SBOM. A living list of every component and version. Tibor's rule from Q5 of his cybersecurity interview: a good SBOM is not a separate document, it is what the EN 62304 configuration item list should have been all along. One source of truth.
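One concrete shape for that single source of truth: serialise the configuration item list into a CycloneDX-style document. This is a stripped-down sketch under assumptions, the item list and names are invented, and a production CycloneDX SBOM carries more fields (suppliers, hashes, licences) than shown here.

```python
import json

# Hypothetical configuration item list, as an EN 62304 process might maintain it.
config_items = [
    {"name": "diabetes-backend", "version": "1.4.2", "type": "application"},
    {"name": "fastjson-parse", "version": "2.1.0", "type": "library"},
]

def to_cyclonedx(items):
    """Serialise the configuration item list as a minimal CycloneDX-style document."""
    return {
        "bomFormat": "CycloneDX",
        "specVersion": "1.5",
        "components": [
            {"type": it["type"], "name": it["name"], "version": it["version"]}
            for it in items
        ],
    }

print(json.dumps(to_cyclonedx(config_items), indent=2))
```

The design point is the direction of generation: the SBOM is derived from the configuration item list, never maintained by hand alongside it, so the two cannot drift.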

Artefact 4. The vulnerability monitoring log. A simple table that records each CVE evaluated against the SBOM, the decision, and the action. Commercial tools exist. A spreadsheet works for a startup with fewer than five components that matter.
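The log itself can be an append-only CSV. A minimal sketch of one row per evaluated CVE follows; the field names are an assumption of this article, not mandated by the standard, and the example entry is invented.

```python
import csv
import datetime
import io

LOG_FIELDS = ["date", "cve_id", "component", "decision", "action"]

def append_log_entry(stream, cve_id, component, decision, action):
    """Append one evaluated CVE to the vulnerability monitoring log."""
    writer = csv.DictWriter(stream, fieldnames=LOG_FIELDS)
    writer.writerow({
        "date": datetime.date.today().isoformat(),
        "cve_id": cve_id,
        "component": component,
        "decision": decision,
        "action": action,
    })

# In-memory stand-in for the log file.
log = io.StringIO()
csv.DictWriter(log, fieldnames=LOG_FIELDS).writeheader()
append_log_entry(log, "CVE-2025-0001", "fastjson-parse 2.1.0",
                 "affected, CVSS 9.8", "patch release opened under change control")
print(log.getvalue())
```

Append-only matters: the auditor wants to see the weeks where the evaluation found nothing, not just the weeks where it did.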

Artefact 5. The patch and change record. Every security-relevant change ties back to the QMS change control process and, where applicable, to the significant change assessment against notified body guidance.

The ritual. Once a week, for 30 minutes, one engineer scans the CVE feeds against the SBOM, updates the monitoring log, and raises any finding that crosses the risk threshold. That is the lifecycle loop. Most weeks it finds nothing. The weeks it finds something are why the ritual exists.
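The "crosses the risk threshold" decision in that ritual is a simple triage split. The sketch below uses a CVSS cut-off of 7.0 as an illustrative choice; the actual threshold belongs in the security plan, and the findings shown are invented.

```python
RISK_THRESHOLD = 7.0  # illustrative: escalate anything rated High or Critical

def triage(findings, threshold=RISK_THRESHOLD):
    """Split this week's findings into escalations and log-only entries."""
    escalate = [f for f in findings if f["cvss"] >= threshold]
    log_only = [f for f in findings if f["cvss"] < threshold]
    return escalate, log_only

weekly_findings = [
    {"cve_id": "CVE-2025-0001", "component": "fastjson-parse 2.1.0", "cvss": 9.8},
    {"cve_id": "CVE-2025-0002", "component": "left-pad-ng 0.3.1", "cvss": 3.1},
]

escalate, log_only = triage(weekly_findings)
for f in escalate:
    print(f"ESCALATE {f['cve_id']} ({f['component']}): re-run the 14971 assessment")
for f in log_only:
    print(f"LOG ONLY {f['cve_id']} ({f['component']})")
```

An escalation hands off to the risk file and change control; a log-only entry still gets a row in the monitoring log, because documented no-action decisions are part of the audit trail.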

The trap Tibor sees most often is founders treating cybersecurity as a cost centre that the company "will get around to" after first revenue. In Tibor's words from the Round 2 interview: every cybersecurity decision deferred now becomes a change control process with notified body re-engagement later. Up-front cost is always lower than post-certification rework. Always.

Reality Check

  1. Does your security plan name which clauses of EN IEC 81001-5-1:2022 you follow, and why the others do not apply?
  2. Is your SBOM generated from the same configuration item list your EN 62304 process uses, or are they two different lists that drift?
  3. When did you last check your dependencies against the CVE feeds, and who did it?
  4. If a critical CVE landed today against one of your top ten dependencies, how many days until a patched release reaches users?
  5. Is cybersecurity data part of your post-market surveillance plan under MDR Article 83 and Annex III, or does PMS only cover clinical and usability feedback?
  6. Does your notified body know what your change classification threshold is for security patches?
  7. If a researcher discloses a vulnerability to you tomorrow by email, who opens the ticket, who triages, who talks to the notified body?
  8. Has anyone outside the development team ever been able to answer those questions without looking at a document?

If any of the above takes you more than 60 seconds to answer, you do not yet have a cybersecurity lifecycle. You have a cybersecurity snapshot.

Frequently Asked Questions

Is the threat model a one-time deliverable or does it need updating? It needs updating. MDR Annex I §17.2 refers to the whole lifetime of the device. EN IEC 81001-5-1:2022 makes this explicit through its vulnerability management and patch management activities. The threat model and asset inventory should be revisited at least yearly, and after any significant architecture change.

Does a Class I non-connected device still need an EN IEC 81001-5-1 lifecycle? The standard applies where health software processes or communicates data. Tibor's Q4 point: a device is not automatically out of scope just because wireless is "disabled". If the wireless hardware is physically present and could be re-enabled, the manufacturer needs evidence that it cannot be re-enabled maliciously. A short justification in the security plan is usually enough, but a justification is still required.

Who signs off on a security patch release? The same role that signs off on any other software release under the QMS, typically the regulatory affairs lead or the software release manager named in the EN 62304 release procedure. The difference is that security-motivated releases must also trigger the security risk assessment update and the significant change evaluation.

Is penetration testing required every year? MDR does not prescribe a cadence. MDCG 2019-16 Rev.1 recommends ongoing verification. A common industry pattern is an annual external pentest plus continuous internal monitoring through the lifecycle activities in EN IEC 81001-5-1. Tibor's experience on the notified body side is that an annual pentest plus a credible vulnerability monitoring log is the minimum bar that survives surveillance audits.

What happens when a CVE cannot be patched because the dependency is abandoned? This is a SOUP decision. EN 62304 already requires evaluation and re-evaluation of unmaintained software components. If no patch is possible, the manufacturer must either apply compensating controls, replace the component, or accept and document the residual risk through the EN ISO 14971 process. The notified body will want to see the decision logic.

Sources

  1. Regulation (EU) 2017/745 on medical devices, consolidated text. Annex I §17.2 and §17.4. Article 83. Annex III.
  2. MDCG 2019-16 Rev.1, Guidance on cybersecurity for medical devices, July 2020.
  3. EN IEC 81001-5-1:2022, Health software and health IT systems safety, effectiveness and security. Part 5-1: Security. Activities in the product lifecycle.
  4. EN ISO 14971:2019+A11:2021, Medical devices. Application of risk management to medical devices.
  5. EN 62304:2006+A1:2015, Medical device software. Software lifecycle processes.