---
title: Cybersecurity for Connected Medical Devices: IoT, IoMT
description: Why IoT and IoMT medical devices need a dedicated cybersecurity lifecycle and why 'not connected' is rarely a complete defence under MDR Annex I.
authors: Tibor Zechmeister, Felix Lenhard
category: Electrical Safety & Systems Engineering
primary_keyword: cybersecurity IoT IoMT medical device
canonical_url: https://zechmeister-solutions.com/en/blog/cybersecurity-iot-iomt-medical-devices
source: zechmeister-solutions.com
license: All rights reserved. Content may be cited with attribution and a link to the canonical URL.
---

# Cybersecurity for Connected Medical Devices: IoT, IoMT

*By Tibor Zechmeister (EU MDR Expert, Notified Body Lead Auditor) and Felix Lenhard.*

> **Connected medical devices inherit every cybersecurity problem of consumer IoT and add a regulatory lifecycle that runs for a decade or more. Constrained compute, firmware updates at scale, secure provisioning, and long in-field lifetimes collide with MDR Annex I §17, EN IEC 81001-5-1:2022, and MDCG 2019-16 Rev.1. And as Tibor puts it, 'not connected' is rarely a complete defence when a latent Bluetooth or Wi-Fi interface is sitting on the microcontroller.**


## TL;DR
- IoMT (Internet of Medical Things) devices are subject to the same MDR Annex I §17 cybersecurity requirements as any other software-containing device, with the added challenge of constrained compute and long field life.
- Firmware updates at scale are not a convenience feature. Under EN IEC 81001-5-1:2022 and MDCG 2019-16 Rev.1, a device that cannot be patched in the field has a cybersecurity lifecycle gap.
- Secure provisioning (the first boot of each device) is where most IoMT security fails quietly. Default credentials and shared keys are still common findings.
- A device lifetime of ten to fifteen years outlasts most cryptographic algorithms and most SOUP component support windows. The PMS system has to watch both.
- "Not connected" is rarely a complete defence. As Tibor puts it: if the microcontroller supports Bluetooth or Wi-Fi, even disabled, proof is needed that the radio cannot be re-enabled maliciously.
- MDR Annex I §18.8 specifically addresses electronic programmable systems and protection against unauthorised access.

## Why IoMT cybersecurity is a different animal

Tibor has audited connected insulin pumps, connected hearing aids, connected wound-care sensors, and connected rehabilitation wearables. The pattern that separates IoMT from pure SaaS is that the threat surface is physical, persistent, and often unattended. An IoMT device lives in a patient's home, on their body, or in a clinic, for years. It is rarely rebooted. It is rarely inspected. It often has no user interface to show a warning. And it usually has a radio.

Felix has coached startups that assumed "IoMT" was just "SaaS with a sensor attached". It is not. The security lifecycle of a cloud backend can be measured in days, with instant patching and rollbacks. The security lifecycle of a field device is measured in quarters, with careful over-the-air update campaigns, battery-aware rollouts, and unpredictable connectivity windows. Both ends have to be secured together. Neither end is optional.

MDR Annex I §17.2 requires software to be developed considering the state of the art for information security. Annex I §17.4 requires minimum IT network and security requirements to be documented. Annex I §18.8 covers electronic programmable systems and demands repeatability, reliability, and performance in line with intended use, plus protection against unauthorised access. These three provisions together cover the entire IoMT stack: the silicon, the firmware, the radio, the network, and the backend.

## The state of the art for IoMT cybersecurity

EN IEC 81001-5-1:2022 is the health software security lifecycle standard. For IoMT it applies end to end: security risk management, secure design, secure implementation, secure verification, secure release, and security maintenance. The standard is explicit that maintenance is not a nice-to-have. It is part of the lifecycle.

MDCG 2019-16 Rev.1 is the notified body guidance on cybersecurity for medical devices. Tibor says the most underestimated aspect of MDCG 2019-16 in practice is the lifecycle piece. A device secure at launch can become insecure later when a SOUP component or library publishes a CVE. Cybersecurity is continuous, not one-time. The notified body will ask how the manufacturer monitors vulnerabilities in deployed firmware, how they decide when to release a patch, how they deliver the patch, and how they verify that the patch actually reached the field.

EN 62304:2006+A1:2015 provides the software lifecycle backbone. According to Tibor, the SBOM (software bill of materials) is not a separate document: a good SBOM follows naturally from the IEC 62304 configuration item list, with libraries, SOUP components, and versions all tracked and updated on every software change. The SBOM is what the 62304 configuration item list should have been all along.
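The SBOM-to-CVE matching step at the heart of that loop can be sketched in a few lines. A minimal illustration only, assuming a hypothetical SBOM shape and advisory feed; neither reflects a required MDR or IEC 62304 format:

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class SbomEntry:
    # One IEC 62304 configuration item: a SOUP library pinned to a version.
    name: str
    version: str


def affected_items(sbom, cve_advisories):
    """Return the SBOM entries matched by any advisory.

    `cve_advisories` maps a CVE id to the set of (name, version)
    pairs it affects -- a stand-in for a real vulnerability feed.
    Each hit is a triage trigger for the PMS process.
    """
    hits = {}
    for cve_id, affected in cve_advisories.items():
        for entry in sbom:
            if (entry.name, entry.version) in affected:
                hits.setdefault(cve_id, []).append(entry)
    return hits


# Example: a two-component SBOM checked against one advisory.
sbom = [SbomEntry("mbedtls", "2.28.0"), SbomEntry("lwip", "2.1.3")]
advisories = {"CVE-2024-0001": {("mbedtls", "2.28.0")}}
print(affected_items(sbom, advisories))
```

The point of the exercise is that the check is mechanical once the configuration item list is machine-readable; the hard part is keeping the list current on every software change.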

EN ISO 14971:2019+A11:2021 handles safety risk management. Cybersecurity risks must be integrated into the same risk file, or a linked one, because confidentiality, integrity, and availability failures translate into clinical harms.

## "Not connected is rarely a complete defence"

Tibor's point is worth stating plainly because it is a finding he has repeated across several files. When founders say "our device is not connected so cybersecurity does not apply", his answer is that this is partially true only if the device has zero wireless capability whatsoever. If the microcontroller supports Bluetooth or Wi-Fi, even disabled, proof is needed that the radio cannot be re-enabled maliciously.

Why this matters: modern microcontrollers ship with radios as a default. A cost-optimised BOM often picks a chip with Bluetooth Low Energy because it is cheaper than the radio-free variant. The firmware may disable the radio stack. The datasheet may confirm it is "off". The board may have no antenna. None of these facts, on their own, satisfy a notified body auditor. The question is whether an attacker with physical access or supply-chain access could re-enable the interface.

The answer that satisfies is a chain of evidence. Fuse bits blown to physically disable the radio. Firmware validation that refuses to boot if the radio is enabled. Hardware design without the RF matching components. A threat model that addresses the "what if the radio is re-enabled" path and documents the residual risk.
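The firmware-validation link in that chain is simple to express. A sketch of the idea only, written in Python for readability; real firmware would implement this in C against the actual peripheral registers, and `RADIO_ENABLE_REG` is an invented stand-in name:

```python
RADIO_ENABLE_REG = 0x0  # invented stand-in for the radio peripheral's enable register


def boot_allowed(radio_enable_reg: int) -> bool:
    """Refuse to boot if any radio-enable bit is set.

    The check runs before the application starts, so a maliciously
    re-enabled radio halts the device instead of going unnoticed.
    """
    return radio_enable_reg == 0


print(boot_allowed(0x0))  # radio fully disabled: boot proceeds
print(boot_allowed(0x1))  # an enable bit is set: refuse to boot
```

The matching design verification activity is equally small: set the enable bit in a test build and confirm the device halts.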

The shortcut of writing "not connected" on the cybersecurity cover sheet and moving on is the single most common finding Tibor sees in this area. The fix is not expensive. It is a paragraph in the threat model and a design verification activity in the test plan.

## Firmware updates at scale

An IoMT device that cannot be updated in the field is a time bomb. The question is not if a vulnerability will appear in a SOUP component. The question is when. MDCG 2019-16 Rev.1 and EN IEC 81001-5-1:2022 both assume that the manufacturer can deliver security patches during the product lifetime.

For a field device, delivering a patch means solving a long list of sub-problems. The device must authenticate the update as legitimate, usually via a signed image. The device must verify the integrity of the download, usually via a hash. The device must apply the update without bricking itself if power is lost, usually via an A/B partition or a bootloader recovery path. The device must report back to the backend that the update succeeded, so the manufacturer knows coverage. The device must do all of this with a battery that may be halfway empty.

None of this is novel in the IoT world. Secure boot, signed updates, A/B partitions, and over-the-air rollouts are well understood. What is novel for medical devices is the regulatory expectation that each element is verified, documented, and maintained. The design verification test plan should include an update that is maliciously signed and confirm the device rejects it. The test plan should include an update that fails mid-install and confirm the device recovers. The test plan should include a rollback scenario and confirm it works.
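The reject-then-flip control flow those tests exercise can be sketched compactly. An illustrative sketch only: HMAC-SHA256 stands in for the asymmetric signature a real secure-boot chain would use, and the slot dictionary stands in for flash partitions:

```python
import hashlib
import hmac


def verify_update(image: bytes, signature: bytes, key: bytes) -> bool:
    """Accept an update only if its signature checks out.

    HMAC stands in for an asymmetric signature here; the control
    flow, not the primitive, is the point of the sketch.
    """
    expected = hmac.new(key, image, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)


def apply_update(image, signature, key, slots):
    """A/B update: write the inactive slot, flip only on success."""
    if not verify_update(image, signature, key):
        return slots  # reject a badly signed image, keep the current slot
    inactive = "b" if slots["active"] == "a" else "a"
    slots[inactive] = image
    slots["active"] = inactive  # the flip is the last, atomic step
    return slots


key = b"factory-provisioned-demo-key"
image = b"firmware v2"
good_sig = hmac.new(key, image, hashlib.sha256).digest()

slots = {"active": "a", "a": b"firmware v1", "b": None}
slots = apply_update(image, b"forged", key, slots)
print(slots["active"])  # → a  (forged signature rejected, old firmware keeps running)
slots = apply_update(image, good_sig, key, slots)
print(slots["active"])  # → b  (valid update applied, flip to slot B)
```

Because the flip is the final step, a power loss mid-write leaves the active slot untouched, which is exactly the failure scenario the test plan should provoke deliberately.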

A worked example: a Class IIa continuous glucose monitor with a ten-year expected use life. The manufacturer picks an ARM Cortex-M microcontroller with secure boot, provisions a device-unique key in the factory, signs firmware images with a hardware-backed key in the build pipeline, and uses an A/B partition scheme for updates. The update campaign runs through the paired smartphone app, so the radio interface is Bluetooth to the phone and the phone handles the wide-area connectivity. The PMS plan includes a monthly CVE review of every SOUP component in the SBOM. When a CVE hits, the team evaluates it against the threat model, decides whether to patch, builds and signs an update, rolls it out in waves, and reports coverage to the notified body in the PSUR.

That example is not science fiction. It is what a well-run IoMT product looks like. The cost of building it once is modest compared to the cost of dealing with an unpatchable fleet of 50,000 devices.

## Secure provisioning

Secure provisioning is the step where each device gets its unique identity and keys. It happens once, in the factory, and it is the quietest source of security failures. The classic mistakes: every device shares the same key, every device is shipped with a default password, every device is provisioned over an unauthenticated channel.

The good pattern: each device gets a unique asymmetric key pair generated inside the secure element, the public key is enrolled with the backend during manufacturing, and the private key never leaves the hardware. Mutual TLS with these keys then authenticates the device to the backend for its entire life. Secure boot locks the firmware image to the device identity. No shared secrets, no default passwords, no factory keys that can be dumped from a single stolen device.
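The factory-station side of that pattern can be sketched with stdlib primitives. A loose illustration, since Python has no secure-element API: `secrets.token_bytes` stands in for key generation inside the secure element, and the SHA-256 fingerprint enrolled with the backend stands in for the public half of the asymmetric pair:

```python
import hashlib
import secrets


def provision(device_id: str, registry: dict) -> bytes:
    """Give one device a unique secret at the factory station.

    The secret never enters the backend registry; only its
    fingerprint does -- mirroring the rule that the private key
    never leaves the hardware.
    """
    device_key = secrets.token_bytes(32)
    registry[device_id] = hashlib.sha256(device_key).hexdigest()
    return device_key


registry = {}
key_a = provision("SN-0001", registry)
key_b = provision("SN-0002", registry)
print(key_a != key_b)                    # → True  (no shared secrets)
print(len(set(registry.values())) == 2)  # → True  (distinct enrolments per unit)
```

A stolen device then yields one key, not the fleet's, which is the property the classic mistakes all fail to deliver.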

Tibor recommends that the provisioning process be documented in the same design file as the firmware, because the factory is where cybersecurity intersects with manufacturing controls under EN ISO 13485.

## The long lifetime problem

A medical device expected to run for ten or fifteen years will outlive the cryptographic algorithms it launches with. SHA-1 is already retired. RSA-2048 is heading that way. TLS 1.2 will eventually be deprecated. The SOUP components in the firmware will stop receiving security updates long before the device stops being used.

The PMS plan under MDR Articles 83 to 86 has to watch for all of this. The cybersecurity risk file has to be updated when the state of the art moves. The device has to be designed with cryptographic agility: the ability to roll new algorithms and new certificate chains over the air without hardware changes. Devices without cryptographic agility become unpatchable the moment their primary algorithm is broken.

Felix coaches founders to ask one brutal question during architecture review: if the CA signing key has to rotate in year seven, can the device still validate updates? If the answer is no, the architecture needs to change before launch.
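One architecture that answers that question with a yes is a trust store holding several roots at once, so a new signing key can be shipped while the old one still works. A hedged sketch of the agility pattern only, again with HMAC keys standing in for CA certificates:

```python
import hashlib
import hmac


class TrustStore:
    """Several roots trusted simultaneously, so signing keys can rotate.

    The pattern: ship the new root in an update signed by the old
    one, confirm coverage, then retire the old root.
    """

    def __init__(self, roots):
        self.roots = dict(roots)

    def verify(self, image: bytes, signature: bytes) -> bool:
        # An image is trusted if any currently enrolled root signed it.
        return any(
            hmac.compare_digest(hmac.new(k, image, hashlib.sha256).digest(), signature)
            for k in self.roots.values()
        )

    def rotate_in(self, name: str, key: bytes):
        self.roots[name] = key  # delivered via an update signed by an existing root

    def retire(self, name: str):
        self.roots.pop(name)


old, new = b"root-2024", b"root-2031"
store = TrustStore({"old": old})
image = b"firmware v9"
sig_new = hmac.new(new, image, hashlib.sha256).digest()
print(store.verify(image, sig_new))  # → False (new root not yet trusted)
store.rotate_in("new", new)
print(store.verify(image, sig_new))  # → True  (device survives the rotation)
store.retire("old")
```

A device with only a single immutable root has no such path: the year-seven rotation bricks its update channel, and the architecture has to change before launch.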

## The Subtract to Ship playbook

**Step one: write the threat model before the schematic.** Do it on one page. Device, radio, firmware, update channel, backend, factory. For each, list the threats. Feed the model into EN ISO 14971 as a section of the risk file.
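The one-pager can even live as structured data so it feeds the risk file mechanically. A hypothetical shape, not a prescribed format; the surfaces are the six from step one, and the listed threats are illustrative:

```python
# A hypothetical one-page threat model: each attack surface maps to
# the threats considered for it. The structure, not the content, is
# the point -- it becomes one section of the EN ISO 14971 risk file.
THREAT_MODEL = {
    "device":         ["physical tampering", "debug port access"],
    "radio":          ["latent interface re-enabled", "pairing hijack"],
    "firmware":       ["unsigned image accepted", "rollback to vulnerable version"],
    "update channel": ["man-in-the-middle", "blocked delivery"],
    "backend":        ["credential theft", "API abuse"],
    "factory":        ["shared provisioning keys", "key exfiltration"],
}


def risk_file_section(model):
    """Render the model as risk-file line items (surface: threat)."""
    return [f"{surface}: {threat}"
            for surface, threats in model.items()
            for threat in threats]


for line in risk_file_section(THREAT_MODEL):
    print(line)
```

Keeping the model as data means the schematic review, the verification test plan, and the risk file all pull from one source instead of drifting apart.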

**Step two: pick a chip with a secure element and secure boot.** The cost delta is small. The regulatory and security benefit is large. Document the decision in the design file.

**Step three: plan the update path on day one.** A/B partitions, signed images, bootloader recovery, coverage reporting. Prototype the update before the hardware is finalised.

**Step four: provision unique keys in the factory.** No shared secrets, no default passwords. Document the provisioning procedure as part of the manufacturing controls.

**Step five: address latent radios explicitly.** If the microcontroller has Bluetooth or Wi-Fi and the product does not use them, write the paragraph that explains how the radio is disabled and why re-enabling is prevented. Add the test.

**Step six: connect the SBOM to the PMS.** The SBOM is the IEC 62304 configuration item list. The PMS plan watches it for CVEs. The feedback loop drives firmware releases.

**Step seven: budget for a penetration test before CE marking.** Tibor notes that even startups that missed EN IEC 81001-5-1 during development can produce a credible pentest result to give the notified body external evidence that development was responsible. Pentesting is external proof where internal process was weak.

## Reality Check

1. Has the team drawn a one-page threat model covering the device, the radio, the firmware, the update channel, the backend, and the factory?
2. Does the microcontroller include a secure element, secure boot, and the ability to verify signed firmware updates?
3. If the device is marketed as "not connected", is there documented evidence that any latent Bluetooth or Wi-Fi interface on the chosen chip cannot be re-enabled maliciously?
4. Is there a signed, verified over-the-air update path that has been tested end to end, including a deliberate failure scenario?
5. Does each unit receive unique provisioning keys in the factory, with no shared secrets or default passwords?
6. Is the SBOM maintained as part of the EN 62304 configuration item list and watched by the PMS system for CVEs?
7. Does the cybersecurity plan include a penetration test before CE marking and periodic re-testing during the lifetime?
8. Has anyone asked the question "can this device still validate updates in year seven if the CA rotates" and answered it in writing?

## Frequently Asked Questions

**Is a consumer IoT checklist enough for IoMT?**
No. Consumer checklists miss the MDR obligations: clinical risk linkage under EN ISO 14971, lifecycle discipline under EN IEC 81001-5-1:2022, notified body evidence per MDCG 2019-16 Rev.1, and PMS integration.

**Can firmware be outsourced to a contract developer?**
Yes, provided the developer is qualified as a supplier under EN ISO 13485 clause 7.4 and security requirements flow down into the statement of work. The notified body holds the manufacturer accountable, not the subcontractor.

**How often should connected devices be patched?**
There is no fixed cadence. A CVE affecting the device should trigger triage within a defined time and a patch release proportional to the risk. The PMS plan should state the commitment and the release history should match.

**Does a pentest replace EN IEC 81001-5-1?**
No. A pentest is a point-in-time external check. EN IEC 81001-5-1 is a lifecycle. A pentest is good supporting evidence but does not substitute for structured security engineering.

**What happens when a SOUP component reaches end of life?**
The manufacturer must replace it, fork and maintain it, or document a compensating control and risk acceptance. The decision must be traceable in the design history file.

## Related reading
- [SBOM for Medical Devices under MDR](/blog/sbom-medical-devices-mdr) details the software bill of materials that anchors the IoMT vulnerability workflow.
- [Privacy by Design for Medical Devices](/blog/privacy-by-design-medical-devices) covers the GDPR half of connected device design.
- [Cloud Security for Medical Devices: AWS, Azure, and GCP](/blog/cloud-security-medical-devices-aws-azure-gcp) covers the backend half of the IoMT stack.
- [Firmware Development for Medical Devices under EN 62304](/blog/firmware-development-iec-62304-hardware) connects the lifecycle discipline to the hardware context.
- [Wireless Device Compliance under the Radio Equipment Directive](/blog/wireless-device-compliance-red) addresses the regulatory lens on the radio itself.

## Sources
1. Regulation (EU) 2017/745 on medical devices, consolidated text. Annex I §17.2, §17.4, §18.8.
2. EN IEC 81001-5-1:2022, Health software and health IT systems safety, effectiveness and security, Part 5-1: Security, Activities in the product lifecycle.
3. MDCG 2019-16 Rev.1 (December 2019, Rev.1 July 2020), Guidance on Cybersecurity for medical devices.
4. EN 62304:2006+A1:2015, Medical device software, Software lifecycle processes.
5. EN ISO 14971:2019+A11:2021, Medical devices, Application of risk management to medical devices.

---

*This post is part of the [Electrical Safety & Systems Engineering](https://zechmeister-solutions.com/en/blog/category/electrical-safety) cluster in the [Subtract to Ship: MDR Blog](https://zechmeister-solutions.com/en/blog). For EU MDR certification consulting, see [zechmeister-solutions.com](https://zechmeister-solutions.com).*
