Health data sits in the GDPR special category tier, which means the ordinary lawful bases are not sufficient on their own. A MedTech startup must find a specific lifting condition for each processing activity involving clinical data, document it, and reflect it in the same asset inventory that drives the cybersecurity risk file. All GDPR article numbers in this post are flagged for qualified legal review.

By Tibor Zechmeister and Felix Lenhard.

TL;DR

  • Health data is treated as special category data under the GDPR.
  • Special category data cannot be processed under the ordinary lawful bases alone. A specific lifting condition is required on top of an ordinary lawful basis.
  • In Tibor's audit experience, startups consistently default to "explicit consent" as the lifting condition without checking whether it is the most appropriate one for their use case. For some activities it is, for many it is not.
  • Data derived from a medical device, even when the device is just a thermometer or a step counter used clinically, is health data the moment it is linked to an identifiable person and used in a clinical context.
  • The lifting condition has to be picked before the device is designed, not afterwards. It drives consent flows, retention periods, sharing grants, and withdrawal handling, all of which become features in the product.
  • As with every GDPR topic on this blog, the MDR remains the single regulatory source of truth for this project. Everything said below about GDPR should be verified by a qualified data protection lawyer.

Why this matters (Hook)

A remote patient monitoring startup is preparing the first commercial rollout of a Class IIa device. The device measures vital signs continuously, uploads them to the cloud, and shows a dashboard to the treating clinician. The cybersecurity documentation is clean. The MDR submission is in with a notified body. Legal sends a draft privacy notice for review. The notice says that processing is based on "consent". The founder ticks the box.

Six months into the rollout, a hospital procurement officer refuses to sign the deployment contract. The reason is that consent in the clinical context is almost impossible to make truly free, because the patient cannot meaningfully refuse without affecting care. The hospital's own data protection officer insists on a different lifting condition rooted in healthcare provision, not consent. The startup now has to rewrite the privacy notice, rewrite the onboarding flow, rewrite the withdrawal mechanism, and explain the change to its notified body because the intended purpose statement now references a different legal framing.

This is not a legal technicality. It is a product design decision that was made in the privacy policy draft and turned out to be wrong. Tibor has watched this exact pattern derail multiple startup rollouts. The fix is to pick the lifting condition at the scoping stage and let it shape the product.

What MDR actually says (Surface)

MDR Annex I Section 17.2 requires that device software be developed in accordance with the state of the art, taking into account the principles of the development life cycle and risk management, including information security. The assets protected by information security include personal data. When the personal data is clinical, the protection obligation meets a second body of law, the GDPR, which treats health data as special category data and raises the bar.

This post cannot quote GDPR as authoritatively as it quotes MDR, because GDPR is not the regulatory source of truth for this blog. The statements that follow reflect Tibor and Felix's working understanding and must be verified by a qualified data protection lawyer before reliance. All are tagged [MDR VERIFY] to make that hierarchy explicit.

[MDR VERIFY] GDPR defines special category data to include data concerning health, genetic data, and biometric data used to uniquely identify a natural person. Processing of these categories is prohibited in principle and is only permitted when one of a set of specific lifting conditions applies on top of an ordinary lawful basis for processing.

[MDR VERIFY] The most commonly invoked lifting conditions for a MedTech startup are: explicit consent of the data subject, processing necessary for the provision of healthcare or treatment, processing necessary for reasons of public interest in the area of public health, and processing necessary for scientific research subject to appropriate safeguards.

[MDR VERIFY] Member State law can add conditions and limitations to the processing of health data. This is why a pan-European rollout cannot rely on a single national privacy notice.

The practical MDR link is that the lifting condition chosen for a processing activity becomes a constraint on the cybersecurity threat model under EN IEC 81001-5-1:2022 and on the ISO 14971 risk file. If consent is the basis, withdrawal must be a working feature and must propagate to backups, dashboards, and any downstream analytics. If healthcare provision is the basis, retention is usually driven by clinical record-keeping duties that are longer than a patient might expect from a consumer app.
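
To make that constraint concrete, here is a minimal Python sketch of how a team might encode the link between lifting condition and product behaviour. The profile names and fields are our illustrative assumptions, not terms from the GDPR or any standard.

    # Hypothetical sketch: the chosen lifting condition, encoded as a profile
    # that drives concrete product constraints. Illustrative names only.
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class LiftingConditionProfile:
        name: str
        withdrawal_must_propagate: bool  # consent only: withdrawal must reach backups, dashboards, analytics
        retention_driver: str            # what actually sets the retention period

    PROFILES = {
        "explicit_consent": LiftingConditionProfile(
            "explicit_consent",
            withdrawal_must_propagate=True,
            retention_driver="consent scope; delete on withdrawal",
        ),
        "healthcare_provision": LiftingConditionProfile(
            "healthcare_provision",
            withdrawal_must_propagate=False,
            retention_driver="clinical record-keeping duties under Member State law",
        ),
    }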

A worked example (Test)

Consider a digital therapeutic for chronic pain that uses a SaMD mobile app plus a wearable. The startup initially defaults to consent for everything: onboarding collects consent for processing, storage, sharing with the treating physician, and anonymised research. The consent flow is a single checkbox on the account creation screen.

A clinical partner raises concerns. The pain clinic that will prescribe the therapeutic argues that consent for core clinical processing is not meaningful because the patient cannot benefit from the therapeutic without accepting. The partner proposes that core processing runs on a different lifting condition more suited to healthcare provision, and that consent is reserved for the research use case where the patient can genuinely decline without losing access to treatment.

The startup rewrites the flow. There are now two distinct data activities:

  • Activity A: processing of clinical measurements and therapy progress to deliver the therapeutic effect. Lifting condition [MDR VERIFY]: necessary for the provision of healthcare, combined with the appropriate ordinary lawful basis such as contract or a clinical legal obligation, subject to Member State specifics.
  • Activity B: processing of the same clinical data in anonymised or pseudonymised form for research into treatment efficacy. Lifting condition: explicit consent, with genuine ability to decline.

The consent flow on the onboarding screen collapses to a single opt-in for Activity B. Core processing no longer asks for consent because consent is not the legal basis for it. The privacy notice is rewritten to tell the patient the truth: here is what we do to treat you, here is what we would like to do with your data for research, you can say no to the second without losing access to the first.
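
What the rewritten onboarding logic might look like in code is sketched below, assuming a hypothetical consent_store service of the startup's own design. The point it encodes: Activity A never prompts for consent, and the Activity B opt-in is genuinely declinable.

    # Hypothetical onboarding sketch. Core clinical processing (Activity A)
    # creates no consent record because consent is not its legal basis; its
    # basis lives in the processing inventory. Only Activity B asks.
    from datetime import datetime, timezone

    def onboard_patient(patient_id: str, research_opt_in: bool, consent_store) -> None:
        if research_opt_in:
            consent_store.record(
                patient_id=patient_id,
                activity="B_research",
                granted_at=datetime.now(timezone.utc),
            )
        # Declining means Activity B never sees this patient's data;
        # access to the therapeutic (Activity A) is unaffected either way.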

The cybersecurity threat model is updated. Asset confidentiality gets a new acceptance criterion: research exports must be pseudonymised before leaving the clinical data store. The ISO 14971 risk file gets a new hazard: if a research export accidentally contains identifying data, a confidentiality breach occurs and the therapeutic relationship with the patient could be damaged. The safety consequence is tracked.
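
One way to enforce the new acceptance criterion in code, sketched under two assumptions: a keyed-hash pseudonym and a field allowlist, with invented field names. The key must never leave the clinical data store, otherwise the pseudonyms become reversible downstream.

    # Hypothetical export guard: research records leave the clinical data
    # store only through this function, which swaps the identifier for a
    # keyed pseudonym and drops every field not on the allowlist.
    import hashlib
    import hmac

    ALLOWED_FIELDS = {"pain_score", "therapy_week", "adherence_pct"}  # assumed schema

    def pseudonymise_for_export(record: dict, secret_key: bytes) -> dict:
        pseudonym = hmac.new(secret_key, record["patient_id"].encode(), hashlib.sha256).hexdigest()
        export = {k: v for k, v in record.items() if k in ALLOWED_FIELDS}
        export["subject_pseudonym"] = pseudonym
        return export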

One product decision, taken at the scoping stage, simplified the entire downstream stack.

The Subtract to Ship playbook (Ship)

Felix's default playbook for a startup processing health data is to subtract the easy answer and do the harder upfront thinking.

Step 1. Inventory every processing activity separately, not the device as a whole. A device is not one processing activity. It is usually five to fifteen. Onboarding, clinical measurement, clinician review, support, research, analytics, marketing lookalike audiences, backup, disaster recovery, account deletion. Each activity gets its own row.

Step 2. For each activity, pick an ordinary lawful basis and a special category lifting condition. The lifting condition is not filled in once for the whole product; it is filled in per activity. This is the single most common shortcut Tibor has seen founders take, and the one that comes back to bite them.
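
Steps 1 and 2 together can be as simple as a machine-readable table. The sketch below uses invented activity names, and every lifting condition entry still needs legal verification [MDR VERIFY].

    # Hypothetical inventory: one row per processing activity, each with its
    # own ordinary lawful basis and lifting condition. Illustrative values.
    PROCESSING_INVENTORY = [
        {"activity": "clinical_measurement", "lawful_basis": "contract",
         "lifting_condition": "healthcare_provision"},
        {"activity": "clinician_review", "lawful_basis": "contract",
         "lifting_condition": "healthcare_provision"},
        {"activity": "research_export", "lawful_basis": "consent",
         "lifting_condition": "explicit_consent"},
        {"activity": "marketing_analytics", "lawful_basis": "consent",
         "lifting_condition": None},  # undecided: a None forces the question
    ]

    # A health data activity with no lifting condition is a gap, not a pass.
    gaps = [row["activity"] for row in PROCESSING_INVENTORY
            if row["lifting_condition"] is None]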

Step 3. Verify the picked condition against the clinical reality. Consent in a care relationship is fragile. Healthcare provision is usually stronger for core clinical activities. Public health is narrow. Research has its own safeguards. A qualified data protection lawyer should verify each choice.

Step 4. Let the chosen conditions shape the product. If consent is the basis for Activity B, the app needs a real withdrawal button and a real propagation of that withdrawal into the research pipeline. If healthcare provision is the basis for Activity A, retention is governed by clinical record-keeping duties, which must be reflected in the retention policy, in the database design, and in the privacy notice.
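
What "real propagation" might look like for Activity B is sketched below; the service names are assumptions about the startup's own stack, not any library API. The design point is that withdrawal is one call that fans out, not a manual ticket.

    # Hypothetical Step 4 sketch: consent withdrawal fans out to every
    # downstream copy of the research data, not just the primary table.
    def withdraw_research_consent(patient_id: str, stores) -> None:
        stores.consent.revoke(patient_id, activity="B_research")
        stores.research_pipeline.purge(patient_id)    # live analytics datasets
        stores.backups.schedule_purge(patient_id)     # honoured on restore/rotation
        stores.audit_log.append("consent_withdrawn", patient_id)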

Step 5. Feed every activity into the same asset inventory that drives the cybersecurity threat model. Special category data is an asset with a heightened sensitivity label. The asset inventory is the single document that keeps GDPR thinking and MDR cybersecurity thinking in the same room.
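
Step 5 can be turned from narrative into a checkable rule. In this sketch, asset and control names are illustrative; the idea is that a sensitivity label implies a minimum control set that a script can audit.

    # Hypothetical asset inventory entries with a sensitivity label, plus a
    # check that every special category asset carries the stricter controls.
    ASSETS = [
        {"asset": "vital_signs_db", "sensitivity": "special_category",
         "controls": {"encryption_at_rest", "access_logging", "role_based_access"}},
        {"asset": "app_crash_logs", "sensitivity": "internal",
         "controls": {"access_logging"}},
    ]

    REQUIRED_FOR_SPECIAL = {"encryption_at_rest", "access_logging", "role_based_access"}

    def control_gaps(assets: list) -> dict:
        # Maps each special category asset to any controls it is missing.
        return {a["asset"]: REQUIRED_FOR_SPECIAL - a["controls"]
                for a in assets
                if a["sensitivity"] == "special_category"
                and not REQUIRED_FOR_SPECIAL <= a["controls"]}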

Step 6. Document the picked conditions and the reasoning, not just the outcome. Supervisory authorities expect a manufacturer to be able to explain why a particular lifting condition was chosen, and a one-line answer is not enough. A short memo per activity, written at scoping time, is what the accountability obligation to demonstrate compliance looks like in practice [MDR VERIFY].
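
A skeleton for that per-activity memo is below; the headings are our suggestion, not a regulatory format.

    # Hypothetical scoping memo skeleton for Step 6. Fill one per activity.
    MEMO_TEMPLATE = """\
    Processing activity:            {activity}
    Ordinary lawful basis:          {lawful_basis}
    Lifting condition [MDR VERIFY]: {lifting_condition}
    Why this condition was chosen:  {reasoning}
    Why the alternatives were not:  {alternatives}
    Retention period and driver:    {retention}
    Legal review (who, when):       {review}
    """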

Reality Check

  1. Has every processing activity in the device been listed separately, rather than the device being described as one processing activity?
  2. For each activity, is the special category lifting condition documented and justified?
  3. Has a qualified data protection lawyer reviewed the picked lifting conditions at least once?
  4. If consent is the claimed basis for any core clinical activity, can the patient genuinely decline without losing access to care?
  5. Does the product actually support withdrawal when consent is the basis, including propagation to backups and downstream systems?
  6. Is the retention period for each activity documented and linked to the lifting condition that justifies it?
  7. Is the special category data tagged in the cybersecurity asset inventory with a sensitivity level that drives stricter controls?
  8. Is there a written record of why each lifting condition was chosen, not just which one was chosen?

Frequently Asked Questions

Is step count from a consumer wearable health data? On its own, step count is not automatically special category data. The moment it is linked to an identifiable person and used in a clinical context to inform a care decision, it becomes health data under GDPR and attracts the special category rules [MDR VERIFY]. A wellness step counter and a clinically prescribed rehabilitation therapy using the same sensor can sit on different sides of the line.

Can explicit consent cover every processing activity in a medical device? In theory, yes, and some startups try. In practice, consent in a care relationship is rarely free, because declining affects access to treatment. Most care-related processing should rest on a healthcare-provision lifting condition rather than consent [MDR VERIFY]. Consent is better reserved for activities the patient can genuinely decline without losing access to care.

Does GDPR forbid training an AI model on real patient data? Not as a blanket rule. It requires a specific lifting condition, appropriate safeguards, and usually pseudonymisation. Research-focused lifting conditions and local Member State implementations shape what is permitted [MDR VERIFY]. A qualified data protection lawyer and, for AI, a risk assessment against the EU AI Act are both needed.

What is the single biggest mistake founders make on this topic? Tibor's answer is that founders treat GDPR as a one-off privacy policy exercise at launch. It is not. It is a design-time constraint. Every processing activity baked into the product has a GDPR consequence, and it is cheaper to pick the right lifting condition before code is written than to rewrite the onboarding flow after a hospital procurement officer pushes back.

Does the notified body ask about special category data? The notified body enforces MDR, not GDPR. What it will check is whether the cybersecurity risk file and the intended purpose statement are consistent with how the product actually handles personal data. A privacy notice that contradicts the threat model will generate findings even if no GDPR article is cited.

Is there a European harmonised answer or does every country differ? Core GDPR is harmonised. Health data is an area where Member State law explicitly adds conditions and limitations [MDR VERIFY]. A pan-European rollout cannot rely on a single privacy notice and a single lifting condition across all territories.

Sources

  1. Regulation (EU) 2017/745 on medical devices, consolidated text. Annex I Section 17.2.
  2. EN IEC 81001-5-1:2022, Health software and health IT systems safety, effectiveness and security, Part 5-1: Security, activities in the product life cycle.
  3. EN ISO 14971:2019+A11:2021, Medical devices, Application of risk management to medical devices.
  4. [MDR VERIFY] Regulation (EU) 2016/679 (General Data Protection Regulation), consolidated text. Article 9 and related provisions on special category data are cited from working memory and must be verified by qualified counsel before publication or reliance.