---
title: Privacy by Design for Medical Devices
description: How GDPR Article 25 privacy by design and by default maps onto MDR design controls, turning data minimisation into a concrete medical device constraint.
authors: Tibor Zechmeister, Felix Lenhard
category: Electrical Safety & Systems Engineering
primary_keyword: privacy by design medical device GDPR
canonical_url: https://zechmeister-solutions.com/en/blog/privacy-by-design-medical-devices
source: zechmeister-solutions.com
license: All rights reserved. Content may be cited with attribution and a link to the canonical URL.
---

# Privacy by Design for Medical Devices

*By Tibor Zechmeister (EU MDR Expert, Notified Body Lead Auditor) and Felix Lenhard.*

> **Privacy by design for medical devices means treating GDPR Article 25 as a source of design inputs the same way MDR Annex I GSPR generates safety design inputs. Data minimisation, purpose limitation, and storage limitation stop being legal language and become concrete constraints on architecture, data models, and default settings.**


## TL;DR
- GDPR Article 25 requires data protection by design and by default for any processing of personal data, and health data is a special category under GDPR Article 9.
- Medical device manufacturers routinely process patient data yet forget that GDPR applies to the very data flows documented in the cybersecurity risk file.
- Data minimisation, purpose limitation, and storage limitation translate into concrete architectural constraints: smaller data models, shorter retention, narrower default sharing.
- MDR Annex I §17.2 requires software to be developed according to the state of the art considering information security principles, which in practice means pairing EN 62304 with EN IEC 81001-5-1:2022 and GDPR Article 25.
- Privacy and cybersecurity are two sides of the same coin: a device that leaks PII is simultaneously a cybersecurity failure and a GDPR incident.
- Tibor has seen notified body findings where the privacy policy, the cybersecurity risk analysis, and the data model disagreed with each other. That is a unified problem, not three separate ones.

## Why privacy by design is a medical device question

Tibor has audited more than fifty MDR technical files. A pattern appears in almost every software file. Manufacturers design an application that processes personal health data, write a clinical evaluation around it, document a cybersecurity risk analysis, and then forget that the very same data is also regulated by GDPR. In Tibor's experience, the privacy policy lives on the marketing website, the cybersecurity file lives in the technical documentation, and nobody notices that the two documents describe different data flows.

That split is the problem this post solves. Privacy and cybersecurity are not separate disciplines for a connected medical device. They are two perspectives on the same threat surface. A device that leaks personal data is a cybersecurity failure and a GDPR incident at once. A device that collects more data than it needs is a GDPR violation and an inflated cybersecurity attack surface at once.

Felix has coached 44 startups through the early product phase. The founders who avoid privacy debt are the ones who treat GDPR as a design input, not a legal deliverable. They draw their data model before the lawyers draft their privacy policy. They pick defaults before the marketing team writes copy. They subtract fields before they add them.

## What the regulations actually say

GDPR Article 25 is titled "Data protection by design and by default". The article is widely cited but often paraphrased. The core obligation, in plain terms: the controller, taking into account the state of the art and the cost of implementation, must implement appropriate technical and organisational measures that give effect to the data protection principles and integrate the necessary safeguards into the processing. The second paragraph of Article 25 requires that, by default, only personal data which are necessary for each specific purpose of the processing are processed.

The GDPR principles that flow into Article 25 are listed in Article 5: lawfulness, fairness and transparency, purpose limitation, data minimisation, accuracy, storage limitation, integrity and confidentiality, and accountability. Health data is a special category under Article 9, which adds an additional layer of restriction.

MDR Annex I §17.2 requires that software be developed and manufactured in accordance with the state of the art, taking into account the principles of the development lifecycle, risk management, including information security, verification and validation. Annex I §17.4 requires manufacturers to set out minimum requirements concerning hardware, IT networks characteristics and IT security measures, including protection against unauthorised access, necessary to run the software as intended. Annex I §17.1 requires electronic programmable systems to be designed to ensure repeatability, reliability and performance in line with their intended use, and Annex I §18.8 requires devices to be designed and manufactured to protect, as far as possible, against unauthorised access that could hamper them from functioning as intended.

The state of the art for information security, in practice, is EN IEC 81001-5-1:2022 for the security lifecycle of health software, and MDCG 2019-16 Rev.1 for the notified body expectation on how a cybersecurity file is assembled. Neither document replaces GDPR. Both interlock with it.

## Data minimisation as a design constraint

The single most powerful move in privacy by design is to shrink the data model. Felix calls this subtraction. Instead of starting with "what data could we collect" and filtering down, start with "what is the smallest data set that lets the device achieve its intended purpose". Every field beyond that minimum is a liability: a GDPR risk, a cybersecurity risk, a storage cost, and an audit burden.

A worked example. A remote cardiac rhythm monitor needs ECG trace, timestamp, and a device identifier to deliver its clinical function. The product team initially proposed collecting GPS location, ambient temperature, accelerometer data, and a free-text diary field "in case it is useful later". None of that was necessary for intended purpose. Felix coached the team to cut the collection surface to the clinical minimum, park the "nice to have" fields behind an explicit future change request, and document the decision in the design history file. The result: a smaller database, a smaller breach surface, a simpler privacy notice, and faster notified body review.

Data minimisation also means field-level decisions. A date of birth can often be replaced by an age bracket. A full postal address can be replaced by a region code. A patient identifier can be a pseudonymous hash rather than an email address. Each substitution is a concrete design input, traceable through the requirements specification under EN 62304.
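The field-level substitutions above can be sketched in code. This is a minimal illustration, not a production scheme: the helper names, the truncated HMAC pseudonym, and the two-digit region code are all assumptions chosen for the example, and the pseudonymisation key would in practice live in a managed secret store, separate from the clinical database.

```python
import hmac
import hashlib
from datetime import date

# Hypothetical secret, assumed to be held outside the clinical database
# (e.g. in a key management service), so the pseudonym cannot be reversed
# by anyone holding only the data set.
PSEUDONYM_KEY = b"replace-with-managed-secret"

def pseudonym(patient_email: str) -> str:
    """Keyed hash as a pseudonymous identifier, replacing the email address."""
    digest = hmac.new(PSEUDONYM_KEY, patient_email.lower().encode(), hashlib.sha256)
    return digest.hexdigest()[:16]

def age_bracket(dob: date, today: date) -> str:
    """Ten-year age bracket, replacing a full date of birth."""
    age = today.year - dob.year - ((today.month, today.day) < (dob.month, dob.day))
    low = (age // 10) * 10
    return f"{low}-{low + 9}"

def region_code(postal_address: str) -> str:
    """Leading postcode digits only, replacing a full postal address."""
    digits = "".join(ch for ch in postal_address if ch.isdigit())
    return digits[:2]
```

Each helper is a candidate software requirement: the requirements specification states that the system stores `pseudonym(...)`, never the email, and traceability under EN 62304 follows from there.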

## Purpose limitation as a versioning discipline

GDPR Article 5 says personal data shall be collected for specified, explicit and legitimate purposes and not further processed in a manner incompatible with those purposes. For a medical device, this is almost the same discipline as MDR intended purpose under Article 2(12): "intended purpose means the use for which a device is intended according to the data supplied by the manufacturer on the label, in the instructions for use or in promotional or sales materials or statements and as specified by the manufacturer in the clinical evaluation".

The two definitions rhyme on purpose. When the intended purpose is tight, the lawful basis for processing is clear, the data map is bounded, and the notified body review is cleaner. When founders later say "and we also want to use the data for research", that is a new purpose. Under GDPR it requires a new lawful basis or compatible-use analysis. Under MDR it may trigger a significant change. Tibor has seen both fail at once when a startup added research use without updating either the intended purpose or the privacy notice.

The playbook: version the purpose. Write it down in the clinical evaluation plan, in the privacy notice, and in the software requirements specification. When any of those change, the others get a change request.

## Storage limitation and retention

Storage limitation is the principle that data should be kept in identifiable form no longer than necessary. For medical devices, this collides with other obligations. MDR Article 10(8) requires manufacturers to keep technical documentation and the declaration of conformity for at least ten years after the last device has been placed on the market, fifteen years for implantable devices. Clinical data retention under Annex XIV and vigilance under Articles 87 to 92 create their own timelines.

The privacy by design move is to separate identifiable personal data from technical and regulatory records. Aggregated, de-identified, or pseudonymised data can be retained for the regulatory period. Directly identifying personal data can be deleted or pseudonymised on a shorter cycle, driven by clinical need and consent. The point is to make the decision explicitly, document it in the design file, and wire it into the system. Retention becomes a configurable parameter, not an accident.
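Making retention "a configurable parameter, not an accident" can look like this sketch. The data classes, periods, and actions here are illustrative placeholders; the real horizons are design decisions to be documented in the design file, and the actual deletion or pseudonymisation job would run against the production data store.

```python
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass(frozen=True)
class RetentionRule:
    data_class: str   # e.g. "identifying", "pseudonymised", "regulatory"
    keep_days: int    # retention horizon in days (illustrative values below)
    action_after: str # "delete" or "pseudonymise"

# Identifiable data on a short clinical cycle; de-identified and regulatory
# records kept for the MDR retention period. Periods are assumptions.
RULES = {
    "identifying":   RetentionRule("identifying", 365, "pseudonymise"),
    "pseudonymised": RetentionRule("pseudonymised", 10 * 365, "delete"),
    "regulatory":    RetentionRule("regulatory", 10 * 365, "delete"),
}

def due_action(data_class: str, created: date, today: date):
    """Return the action now due for a record, or None while still in retention."""
    rule = RULES[data_class]
    if today - created > timedelta(days=rule.keep_days):
        return rule.action_after
    return None
```

The point of the table is that an auditor, a privacy officer, and a developer all read the same three rows: the decision is explicit and versioned rather than buried in a cron job.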

## The Subtract to Ship playbook

Felix has seen privacy by design work when a small team treats it as an architectural decision that happens once, early, and in writing. Here is the five-step playbook.

**Step one: draw the data map first.** Before any code, before any privacy notice, list every data element the device will touch. For each, write the purpose it serves, the lawful basis under GDPR, the retention horizon, and the transfer surface. If a row cannot be justified, delete the field.
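A data map of this shape can live next to the code, so it can be validated rather than merely read. The sketch below assumes hypothetical field names from the cardiac-monitor example; the lawful-basis strings are placeholders to be confirmed with counsel, not legal determinations.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class DataMapRow:
    field: str
    purpose: str           # clinical purpose the field serves
    lawful_basis: str      # placeholder, e.g. "Art. 9(2)(h)"; confirm with counsel
    retention: str         # retention horizon for this field
    transfer_surface: str  # who receives it, e.g. "none", "EU cloud processor"

DATA_MAP = [
    DataMapRow("ecg_trace", "rhythm analysis", "Art. 9(2)(h)",
               "90 days then pseudonymise", "EU cloud processor"),
    DataMapRow("timestamp", "trend analysis", "Art. 9(2)(h)",
               "90 days then pseudonymise", "EU cloud processor"),
    DataMapRow("device_id", "device traceability", "Art. 6(1)(c)",
               "10 years", "none"),
]

def unjustified(rows):
    """Rows missing a purpose or lawful basis: candidates for deletion."""
    return [r.field for r in rows if not r.purpose or not r.lawful_basis]
```

Running `unjustified` in CI makes "if a row cannot be justified, delete the field" an enforced rule instead of a review-time aspiration.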

**Step two: pick defaults that are private.** GDPR Article 25(2) says defaults must limit processing to what is necessary. For a medical device, this means optional sharing is opt-in, logging verbosity is minimum by default, and identifying metadata is not included in diagnostic exports unless the user asks for it.
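Article 25(2) defaults translate directly into a settings object whose zero-argument constructor is the most private configuration. The setting names and the identifying keys below are assumptions for illustration.

```python
from dataclasses import dataclass

@dataclass
class PrivacySettings:
    # Defaults follow GDPR Art. 25(2): process only what is necessary
    # unless the user explicitly opts in. Field names are illustrative.
    share_with_research: bool = False         # optional sharing is opt-in
    log_level: str = "minimum"                # verbose logging is a deliberate act
    include_ids_in_diagnostics: bool = False  # no identifying metadata by default

def diagnostic_export(settings: PrivacySettings, record: dict) -> dict:
    """Strip identifying metadata from a diagnostic export unless opted in."""
    if settings.include_ids_in_diagnostics:
        return dict(record)
    return {k: v for k, v in record.items() if k not in {"patient_id", "email"}}
```

Because the private configuration is the constructor default, any code path that forgets to configure the settings still ships the compliant behaviour.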

**Step three: fold privacy into the risk file.** EN ISO 14971 does not cover information security by default. MDCG 2019-16 Rev.1 and EN IEC 81001-5-1:2022 do. Privacy harms (loss of confidentiality, re-identification, discrimination) should appear in the cybersecurity risk analysis alongside classic CIA harms. Tibor recommends a single integrated risk file with a privacy-harm column.

**Step four: write the privacy notice from the data map, not the other way round.** The privacy notice is a downstream artifact. When it is written by legal counsel without the data map, it describes a fiction. When it is generated from the actual field list, it describes reality. Tibor has rejected files where the privacy notice claimed the app did not collect location while the source code did.
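The "diff" between notice and reality can be automated with a small check. This sketch assumes the declared fields can be extracted from the notice and the collected fields from the actual data model; both extraction steps are left out here.

```python
def notice_drift(declared: set, collected: set) -> dict:
    """Diff the privacy notice's declared fields against what the code collects."""
    return {
        # Collected but never declared: the finding Tibor describes, GDPR exposure.
        "collected_but_not_declared": sorted(collected - declared),
        # Declared but never collected: a stale notice describing a fiction.
        "declared_but_not_collected": sorted(declared - collected),
    }
```

An empty drift report in CI is cheap evidence for question 2 of the Reality Check below.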

**Step five: link the privacy change process to the software change process.** EN 62304 configuration management and EN IEC 81001-5-1 security lifecycle updates should trigger a privacy review. Any new field, new integration, or new recipient of data gets a data protection impact assessment before the merge, not after release.
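The DPIA trigger can be a gate in the change-control tooling. The change-request keys below are hypothetical; the real trigger list belongs in the written procedure that links EN 62304 change control to the privacy review.

```python
# Illustrative trigger keys on a change-request record; the real list is
# defined in the manufacturer's change-control procedure.
DPIA_TRIGGERS = ("new_data_field", "new_integration", "new_recipient")

def dpia_required(change: dict) -> bool:
    """Flag a software change for a data protection impact assessment."""
    return any(change.get(trigger, False) for trigger in DPIA_TRIGGERS)
```

Wired into the merge pipeline, this blocks the change before the merge rather than discovering it after release.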

This playbook subtracts three things. It subtracts data. It subtracts overlapping documents. It subtracts the legal-versus-engineering split that wastes weeks during notified body review.

## Reality Check

1. Can the founding team produce a one-page data map listing every personal data element the device processes, with purpose, lawful basis, and retention for each row?
2. Does the privacy notice describe the same data flows as the source code and the cybersecurity risk file? Has anyone diffed them in the last six months?
3. Are the default settings of the device the most privacy-preserving of the options offered, per GDPR Article 25(2)?
4. Does the cybersecurity risk file include privacy harms, or only classic confidentiality-integrity-availability harms?
5. Can the team demonstrate that collected data is the minimum necessary for the intended purpose under MDR Article 2(12), and can each field be deleted on request without breaking regulatory retention?
6. Is there a written procedure linking software change control under EN 62304 to a data protection impact assessment trigger?
7. Has a data protection officer or qualified privacy expert reviewed the design before the notified body submission, not after?

## Frequently Asked Questions

**Does GDPR apply to a medical device that runs only on the device itself and never sends data anywhere?**
If personal data is processed inside the device, GDPR applies to the controller, which is usually the clinical user or the healthcare provider. The manufacturer is often a data processor or a joint controller depending on the architecture. Even fully offline devices trigger GDPR obligations the moment personal data is written to disk.

**Is pseudonymisation enough to exit GDPR scope?**
No. Pseudonymised data is still personal data under GDPR Recital 26 because re-identification remains possible with additional information. Only fully anonymised data, where re-identification is impossible with reasonable means, falls outside GDPR.

**Who is the data controller for a connected medical device?**
It depends on who decides the purposes and means of processing. In most B2B medical device deployments the hospital or clinic is the controller and the manufacturer is a processor. In direct-to-patient apps the manufacturer is usually the controller. Contracts under GDPR Article 28 should make the roles explicit.

**Does privacy by design have to be documented for the notified body?**
The notified body reviews MDR compliance, not GDPR compliance directly. However, the cybersecurity file under MDR Annex I §17.2 and §17.4 overlaps heavily with GDPR obligations. A well documented privacy by design process strengthens the cybersecurity file and reduces notified body questions.

**What happens if the privacy notice and the data model disagree?**
Tibor treats that as a serious finding. It indicates that the design control process is broken. It also creates GDPR exposure because the privacy notice is a binding representation to data subjects. Fix the source of truth, then regenerate the notice.

**Can data be shared with a cloud provider outside the EU?**
International transfers under GDPR Chapter V require an adequacy decision, standard contractual clauses, or another lawful mechanism, plus a transfer impact assessment. For health data this is a high-scrutiny area. The cloud infrastructure choice affects both MDR cybersecurity obligations and GDPR transfer obligations at once.

## Related reading
- [Cloud Services for Medical Devices under MDR](/blog/cloud-services-medical-devices-mdr) covers the regulatory frame for cloud hosting and complements the privacy analysis here.
- [Cloud Security for Medical Devices: AWS, Azure, and GCP](/blog/cloud-security-medical-devices-aws-azure-gcp) translates privacy-by-design data residency questions into concrete cloud vendor decisions.
- [Cybersecurity for Connected Medical Devices: IoT and IoMT Challenges](/blog/cybersecurity-iot-iomt-medical-devices) extends this article into the connected hardware layer.
- [SBOM for Medical Devices under MDR](/blog/sbom-medical-devices-mdr) shows how software bill of materials supports both privacy and cybersecurity evidence.
- [MDR Software Lifecycle under EN 62304](/blog/mdr-software-lifecycle-iec-62304) anchors the change-control discipline that privacy by design depends on.

## Sources
1. Regulation (EU) 2017/745 on medical devices, consolidated text. Annex I §17.2, §17.4, §18.8.
2. Regulation (EU) 2016/679 (GDPR), Articles 5, 9, 25, 28, Chapter V. 
3. EN IEC 81001-5-1:2022, Health software and health IT systems safety, effectiveness and security, Part 5-1: Security, Activities in the product lifecycle.
4. MDCG 2019-16 Rev.1 (December 2019, Rev.1 July 2020), Guidance on Cybersecurity for medical devices.
5. EN 62304:2006+A1:2015, Medical device software, Software lifecycle processes.

---

*This post is part of the [Electrical Safety & Systems Engineering](https://zechmeister-solutions.com/en/blog/category/electrical-safety) cluster in the [Subtract to Ship: MDR Blog](https://zechmeister-solutions.com/en/blog). For EU MDR certification consulting, see [zechmeister-solutions.com](https://zechmeister-solutions.com).*
