Medical devices that process patient data must protect it with encryption at rest, in transit, and in backups. MDR Annex I §17.4 makes the security of software a general safety and performance requirement, and GDPR applies to every device that touches personally identifiable information. Manufacturers who design encryption and key management as a core control of the device, not as an IT afterthought, pass cybersecurity audits without drama.

By Tibor Zechmeister and Felix Lenhard.

TL;DR

  • MDR Annex I §17.4 requires manufacturers to set out minimum requirements for IT security, including protection against unauthorised access.
  • EN IEC 81001-5-1:2022 is the standard that operationalises cybersecurity across the software lifecycle, and it expects encryption decisions to be driven by threat modelling and risk analysis.
  • GDPR applies in parallel to MDR whenever a device processes personal data. Many manufacturers overlook this in both their privacy policies and their cybersecurity risk file.
  • Encryption without key management is theatre. Key generation, storage, rotation, and destruction have to be documented and testable.
  • Backups are data too. If the live database is encrypted but the backup bucket is not, the device is not protected.
  • Notified bodies ask for evidence, not marketing. The evidence chain runs from threat model to control to test to ongoing PMS.

Why encryption is a device safety topic, not an IT topic

Tibor has audited enough medical device software to see the same pattern repeat. A startup builds a clever SaMD or a connected wearable, picks a cloud provider, enables the default encryption checkbox, and assumes the cybersecurity section of the technical documentation is done. Then the notified body asks one question about key management, and the whole cybersecurity posture unravels in front of the auditor.

Encryption is not an infrastructure feature. It is a safety control. When a medical device processes diagnostic data, treatment parameters, or physiological measurements, unauthorised access to that data can change clinical decisions. That puts encryption on the same footing as any other risk control in the EN ISO 14971:2019+A11:2021 risk file.

Felix sees the startup version of this problem constantly. Founders treat cybersecurity as something the CTO handles on the side. The Subtract to Ship view is different. Anything that, if it fails, can harm a patient or invalidate a clinical decision belongs in the device design, not in the ops backlog. Encryption sits firmly in that category.

What MDR actually says about encryption

MDR Annex I §17.2 requires that software developed for incorporation into medical devices, or software that is itself a medical device, is developed and manufactured according to the state of the art, taking into account the principles of development lifecycle, risk management, including information security, verification, and validation.

MDR Annex I §17.4 goes further. It requires manufacturers to set out the minimum requirements concerning hardware, IT networks characteristics, and IT security measures, including protection against unauthorised access, necessary to run the software as intended.

The MDR does not prescribe AES-256 or TLS 1.3 by name. It demands that the manufacturer identifies what protection is needed, justifies the choice, and implements it in line with the state of the art. The state of the art for software security is captured in EN IEC 81001-5-1:2022, which the European harmonisation process treats as the reference standard for health software security activities across the lifecycle. MDCG 2019-16 Rev.1 interprets these obligations for the MDR context and expects threat modelling, secure development, and verification to connect directly to the risk file.

The practical reading: the notified body will look for a documented rationale, linked to a threat model, that explains why the encryption choices made are sufficient for the intended purpose and the identified threats. "We use the defaults" is not a rationale.

Where GDPR enters the room

In Tibor's experience, the intersection between MDR Annex I §17.2 and GDPR is less a conflict than an oversight. Manufacturers design software that processes personally identifiable information and then forget that GDPR applies. They miss it in their privacy policies. They miss it as an asset worth protecting in their cybersecurity risk file. The fix is awareness, not legal complexity.

Any device that collects patient identifiers, diagnostic results linked to individuals, device telemetry tied to a user account, or clinician interactions handles personal data in the GDPR sense. Health data is a special category of personal data under GDPR, with elevated protection obligations. Encryption is explicitly named in GDPR as one of the technical measures that can demonstrate appropriate protection.

The overlap means that encryption decisions have two audiences: the notified body auditing the device file, and the data protection authority that might one day investigate a breach. Both audiences want to see the same thing. A manufacturer who treats the two regimes as one integrated control set writes the documentation once. A manufacturer who treats them separately writes it twice, badly, and gets caught in the gaps.

Encryption at rest, in transit, and in backups

Three places where patient data lives, three places where encryption has to be engineered and verified.

At rest. Databases, file stores, device flash memory, and mobile app caches all hold data at rest. The risk control has to survive disk theft, cloud snapshot exposure, and compromised service accounts. In practice this means full-disk encryption for infrastructure plus field-level or column-level encryption for the most sensitive attributes. The field-level layer is what protects against the common case of an overly broad database read permission.
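The layering can be sketched in a few lines. The column names and the `encrypt` callable below are illustrative assumptions; in a real system `encrypt` would be an AEAD cipher such as AES-256-GCM using a KMS data key, applied on top of whatever full-disk layer the infrastructure provides:

```python
from typing import Callable

# Illustrative column names; a real system derives this set from the threat model.
SENSITIVE_FIELDS = {"patient_id", "signal_value"}

def encrypt_sensitive(row: dict, encrypt: Callable[[bytes], bytes]) -> dict:
    """Apply field-level encryption to the sensitive columns only.

    `encrypt` stands in for a real AEAD cipher (e.g. AES-256-GCM with a
    KMS-managed data key); non-sensitive columns pass through untouched.
    """
    return {k: encrypt(v.encode()) if k in SENSITIVE_FIELDS else v
            for k, v in row.items()}
```

The point of the extra layer is visible in the return value: a service account with a broad database read permission sees ciphertext in the sensitive columns, not patient data.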

In transit. Every network hop inside the device system has to be protected. That includes the obvious public internet hops, but also the less visible ones: microservice-to-microservice calls inside a cloud VPC, messages on an internal queue, traffic between a mobile app and a BLE gateway, and over-the-air update channels. TLS with strong cipher suites is the current state of the art for general transport. Device-to-gateway links use whatever cryptography the protocol supports, with an explicit justification if the protocol is weaker than TLS.
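As a minimal sketch of "TLS with strong cipher suites", a client-side context that refuses anything below TLS 1.3 can be built with Python's standard `ssl` module; the function name is ours, the module calls are standard library:

```python
import ssl

def strict_client_context() -> ssl.SSLContext:
    """Build a client TLS context that only negotiates TLS 1.3."""
    # create_default_context enables certificate verification and hostname checks
    ctx = ssl.create_default_context()
    # Refuse TLS 1.2 and below rather than silently downgrading
    ctx.minimum_version = ssl.TLSVersion.TLSv1_3
    return ctx
```

Pinning this in code, rather than relying on server defaults, is what turns "we use TLS" into a verifiable design decision the technical file can point at.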

In backups. Tibor has seen this one fail more than any other. The live system is encrypted, audited, and tested. The nightly backup goes to a bucket with default settings and a separate access control list that nobody reviewed. When a researcher or a malicious actor finds the bucket, the encryption on the live system is irrelevant. The rule: backups are copies of the data and inherit the same protection requirements. The backup encryption keys must be managed to the same standard as the production keys, and the restore process must be tested, because an encrypted backup that cannot be restored is not a backup.
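A restore test is cheap to automate. The sketch below compares a restored backup against the source data and emits an evidence record; the record fields are illustrative, and real PMS evidence would also name the backup set, the operator, and the environment the restore ran in:

```python
import hashlib
from datetime import datetime, timezone

def restore_test_record(original: bytes, restored: bytes) -> dict:
    """Compare a restored backup against the source data and emit PMS evidence."""
    src = hashlib.sha256(original).hexdigest()
    dst = hashlib.sha256(restored).hexdigest()
    return {
        "timestamp_utc": datetime.now(timezone.utc).isoformat(),
        "source_sha256": src,
        "restored_sha256": dst,
        "restore_ok": src == dst,  # the line the auditor actually reads
    }
```

Run monthly, the accumulated records are exactly the restore-test evidence trail a notified body asks for.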

Key management is where audits succeed or fail

The cryptographic algorithm is the easy part. Key management is the hard part and the audit part. The manufacturer has to answer, in writing, every one of these questions.

  • Who generates keys, how, and with what entropy source.
  • Where keys are stored, and whether that storage is separated from the encrypted data.
  • How access to keys is authenticated and logged.
  • How keys are rotated, and whether rotation is routine or triggered by incidents.
  • What happens to old keys after rotation.
  • How keys are destroyed at end of life.
  • How key compromise is detected and responded to.

For a startup, a hardware-backed key management service from a major cloud provider covers most of this out of the box, but only if the manufacturer actually uses it correctly and documents the configuration in the technical file. A KMS that nobody points at in the documentation is not evidence of control.
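The bookkeeping a managed KMS automates can be made concrete with a small sketch. The record shape, field names, and 90-day default below are our assumptions for illustration; the crypto-relevant parts (`secrets` for key material, a hard rotation cutoff) are the substance:

```python
import secrets
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class KeyRecord:
    """Audit metadata for one key; the key material itself lives elsewhere."""
    key_id: str
    created: date
    destroyed: bool = False

def generate_key_record(today: date) -> tuple[KeyRecord, bytes]:
    """Generate a 256-bit key from the OS CSPRNG, plus its audit record."""
    material = secrets.token_bytes(32)  # 32 bytes = 256 bits of entropy
    return KeyRecord(key_id=secrets.token_hex(8), created=today), material

def due_for_rotation(keys: list[KeyRecord], today: date,
                     period_days: int = 90) -> list[str]:
    """Return ids of live keys older than the rotation period."""
    cutoff = today - timedelta(days=period_days)
    return [k.key_id for k in keys if not k.destroyed and k.created <= cutoff]
```

A cloud KMS does all of this for you; the point of the sketch is that each function maps to a question in the list above, and the technical file has to answer each one regardless of who runs the code.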

A worked example: a Class IIa connected wearable

Consider a startup building a Class IIa wearable that measures a physiological signal, streams it through a mobile app to a cloud backend, and produces a clinician dashboard. Felix has coached several startups through exactly this architecture.

The threat model identifies three asset classes: the raw signal, the clinician's diagnostic interpretation, and the patient identity linking them. All three are personal health data under GDPR and all three are safety-relevant under the MDR because altering any of them could change clinical decisions.

The encryption design: device-to-gateway link uses the BLE pairing mechanism with a documented justification of its limits and a compensating control at the application layer that re-encrypts the signal before it leaves the phone. App-to-backend uses TLS 1.3 with certificate pinning. Backend storage uses full-disk encryption plus field-level encryption on patient identifiers and the signal values. Keys live in a managed KMS with automatic rotation every 90 days and audit logging routed to the security monitoring pipeline. Backups inherit the same KMS, are tested monthly, and the test evidence lives in the PMS file.

The GDPR side reuses the same controls. The privacy policy names encryption as the technical measure under the relevant GDPR article. The data protection impact assessment references the same threat model the cybersecurity risk file uses. One piece of thinking, two audiences satisfied.

The Subtract to Ship playbook

Startups who want to pass cybersecurity audits without burning three months on rework follow a short sequence. Felix has watched this sequence work across several coached cohorts.

Start from the threat model, not the tech stack. Write the list of assets, threats, and attackers before touching the cloud console. EN IEC 81001-5-1:2022 activities and MDCG 2019-16 Rev.1 give the structure.

Integrate cybersecurity into the EN ISO 14971:2019+A11:2021 risk file. A separate cybersecurity document that does not feed into the main risk file is a red flag for any experienced auditor. The same risk language, the same risk acceptance criteria, the same linkage to clinical harm.

Acknowledge GDPR explicitly as a protected asset class. Name personal data in the risk file. Name encryption as a technical measure in the privacy policy. Name the data protection officer (or the person performing the role) in both documents.

Pick boring, well-reviewed encryption. TLS 1.3, AES-256 GCM, a managed KMS, and a documented key rotation policy. Save creativity for the clinical problem, not the cryptography.

Test the backups. Every cybersecurity audit eventually asks for a restore test record. Manufacturers who run the test before the audit pass. Manufacturers who do not, fail. It is almost that simple.

Hook encryption posture into PMS. Key rotations, certificate expiries, failed decryption events, and KMS audit logs all flow into the post-market surveillance system. When something changes, the risk file gets updated, not just the ops runbook.
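One of those PMS feeds, certificate expiry, fits in a few lines. The function and the 30-day warning window are illustrative assumptions; the input would come from the certificate inventory, keyed by service name with each certificate's notAfter date:

```python
from datetime import date, timedelta

def expiring_certificates(certs: dict[str, date], today: date,
                          window_days: int = 30) -> list[str]:
    """Flag certificate names whose notAfter date falls within the warning window."""
    horizon = today + timedelta(days=window_days)
    return sorted(name for name, not_after in certs.items()
                  if not_after <= horizon)
```

Routing the output into the same monitoring pipeline as the KMS audit logs keeps the encryption posture visible after the audit, not just during it.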

Reality Check

  • Can you name every place personal data is stored in the device system, including caches and backups?
  • Does your technical file explain why your encryption choices are appropriate for the identified threats, or does it just list algorithm names?
  • Is GDPR named explicitly in your cybersecurity risk file as a source of protected asset classes?
  • Have you tested a full backup restore in the last three months, and is the evidence in the PMS file?
  • If your cloud provider rotated its default encryption keys tomorrow, would you know?
  • Can you produce the key management procedure without logging into the cloud console to reconstruct it?
  • Is the person responsible for key compromise response named, reachable, and trained?

Frequently Asked Questions

Does MDR require AES-256 specifically? No. MDR Annex I §17.4 requires appropriate IT security measures and leaves the specific algorithms to the state of the art. EN IEC 81001-5-1:2022 is the reference standard, and current state of the art includes algorithms like AES-256 for symmetric encryption, but the obligation is to justify the choice against the threat model.

Is default cloud encryption enough? Usually not on its own. Default cloud encryption protects against some threats (physical disk theft from the data centre) but not others (overly broad read permissions, application-level bugs, exposed backups). The manufacturer has to analyse what it covers and what it does not, and add layers where needed.

Do I need a separate cybersecurity risk file from the ISO 14971 risk file? No, and separating them is a common source of audit findings. Cybersecurity risks belong in the EN ISO 14971:2019+A11:2021 file because they can lead to patient harm. MDCG 2019-16 Rev.1 supports this integrated approach.

Does GDPR apply to Class I devices too? Yes. Device classification under MDR does not affect GDPR applicability. Any device that processes personal data is in scope, from Class I to Class III, and health data is a special category with elevated protection obligations.

How does encryption interact with software updates? Update packages must be signed and their integrity verified, and the channels used to deliver them must be encrypted in transit. The update mechanism is its own cybersecurity control with its own threats, covered separately but using the same risk framework.
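The integrity half of that answer reduces to a digest check. The sketch below is deliberately simplified: it verifies a package against a digest taken from a manifest, and a real OTA pipeline would first verify an asymmetric signature (for example Ed25519) over that manifest before trusting any digest in it:

```python
import hashlib

def verify_update_package(package: bytes, expected_sha256: str) -> bool:
    """Check an update package against the digest published in a signed manifest.

    Illustrative only: trusting `expected_sha256` is safe only after the
    manifest's asymmetric signature has been verified.
    """
    return hashlib.sha256(package).hexdigest() == expected_sha256
```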

What if my notified body asks for a penetration test report? A pentest is external evidence that complements the internal process. Even if your development followed EN IEC 81001-5-1:2022 rigorously, a clean pentest report is hard to argue with and rarely wasted money.

Sources

  1. Regulation (EU) 2017/745 on medical devices, consolidated text. Annex I §17.2 and §17.4.
  2. MDCG 2019-16 Rev.1, Guidance on Cybersecurity for Medical Devices, July 2020.
  3. EN IEC 81001-5-1:2022, Health software and health IT systems safety, effectiveness and security, Part 5-1: Security, activities in the product lifecycle.
  4. EN ISO 14971:2019+A11:2021, Medical devices, Application of risk management to medical devices.
  5. Regulation (EU) 2016/679 (GDPR), relevant articles on technical measures and special categories of personal data.