Most DiGA rejections Tibor has heard about are not surprises. They cluster around a small set of predictable failures: CE marking without DiGA-grade clinical evidence, weak or missing economic benefit data, interoperability gaps, and data protection shortfalls. This post is an honest reality check, not a recipe for guaranteed success. DiGA is hard because it asks for evidence startups are not used to producing.

By Tibor Zechmeister and Felix Lenhard.

TL;DR

  • DiGA rejections tend to cluster around four themes: clinical evidence, economic benefit, interoperability, and data protection.
  • A CE mark under MDR is the starting line for DiGA, not the finish line. Minimal Class I evidence is almost never sufficient for the BfArM positive healthcare effect bar.
  • Positive healthcare effect is the core DiGA concept and covers both medical benefit and structural-procedural improvements.
  • Interoperability failures often come from missing or outdated support for German health IT standards. Founders outside Germany consistently underestimate this requirement.
  • Data protection rejections rarely come from a single obvious mistake. They come from small, accumulated gaps that together fall short of the BfArM threshold.

Why this matters

Founders rarely hear about DiGA rejections. They hear about the listings. A new DiGA enters the directory, a LinkedIn post goes up, an investor deck adds a slide. The applications that failed do not usually get the same attention. That creates a distorted picture of what the pathway actually asks for.

Tibor has not taken a product through DiGA directly. What he has is the pattern recognition that comes from talking to manufacturers who did, and from reading the guidance BfArM has published for applicants. The pattern is consistent enough to name, with the caveat that specific rejection-reason statistics at the population level should be verified against the current BfArM DiGA report.

This post is framed honestly. It is not a guaranteed playbook. It is a list of the places where applications tend to fail, and why, so that founders can stress-test their own application before BfArM does.

What BfArM actually asks for

The DiGA pathway sits on top of two things: a CE mark under MDR, and a body of evidence specific to DiGA. Both are necessary. Neither is sufficient on its own.

The CE mark confirms that the product is a medical device that meets the MDR general safety and performance requirements, with a technical file, a risk management process under EN ISO 14971:2019+A11:2021, software lifecycle documentation under EN 62304:2006+A1:2015, and appropriate cybersecurity under EN IEC 81001-5-1:2022 and MDCG 2019-16 Rev.1. This is the MDR layer.

On top of that, BfArM asks for evidence in the following areas, as described in the current BfArM DiGA guide:

  • Safety and functional suitability.
  • Data protection and data security.
  • Quality, including interoperability, robustness, consumer protection, ease of use, support for service providers, quality of medical content, and patient safety.
  • Positive healthcare effects, defined as a medical benefit or a patient-relevant structural and procedural improvement in care.

The fourth item is where most rejections begin. A startup can arrive with a strong CE mark, reasonable data protection, and a clean usability file, and still fail because the positive healthcare effect is not demonstrated at the level BfArM expects.

A worked example of the four failure modes

Consider a seed-stage startup with a digital therapy application for anxiety. The product is CE marked as Class IIa software under MDR Annex VIII Rule 11. The technical file is clean. The founder believes DiGA is the natural next step.

Here is how each of the four failure modes typically appears for a product in this profile.

1. Clinical evidence gap

The CE clinical evaluation shows that the product meets its intended purpose based on literature, analogue device data, and a small formative study. This is enough for Class IIa under MDR Article 61 and Annex XIV. It is usually not enough for BfArM.

BfArM expects a comparative study, typically a randomised or at least controlled design, showing that the DiGA improves a patient-relevant endpoint compared to a reference. Founders who submit a CE clinical evaluation as their DiGA evidence are often pointed toward the provisional listing path, where they must complete the study within a defined window, or they are declined outright.

This is not a loophole failure. It is a threshold failure. The bar exists, and it is higher than MDR Class IIa.
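To get a feel for why this bar is expensive, a rough sample-size sketch helps. The formula below is the standard normal-approximation calculation for a two-arm comparison of means; the effect size, power, and alpha values are illustrative assumptions, not BfArM-mandated parameters.

```python
from math import ceil
from statistics import NormalDist

def per_arm_sample_size(effect_size: float, alpha: float = 0.05,
                        power: float = 0.80) -> int:
    """Approximate participants per arm for a two-arm trial comparing
    means (normal approximation; effect_size is Cohen's d)."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided significance
    z_beta = NormalDist().inv_cdf(power)           # desired power
    n = 2 * (z_alpha + z_beta) ** 2 / effect_size ** 2
    return ceil(n)

# Illustrative: a moderate effect (d = 0.5), 80% power, two-sided alpha 0.05
print(per_arm_sample_size(0.5))  # 63 participants per arm
# A smaller effect (d = 0.3) roughly triples the requirement
print(per_arm_sample_size(0.3))  # 175 participants per arm
```

Even under these optimistic assumptions, a comparative study means recruiting, randomising, and retaining well over a hundred participants — a budget line most CE-only evidence plans never contained.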

2. Economic benefit / positive healthcare effect ambiguity

Positive healthcare effect has two flavours under the DiGA framework: medical benefit and structural-procedural improvement. Founders usually argue medical benefit. They argue it in language borrowed from MDR clinical evaluation, which asks whether the device performs its intended purpose. That is not the same question BfArM asks.

BfArM asks whether the product changes a patient-relevant outcome in a way that would justify German statutory health insurance paying for it. "The app works as intended" is an MDR answer. "Users of the app showed a measurable improvement on an accepted clinical scale versus a control group" is closer to the DiGA answer.

Economic benefit is different again. Not every DiGA needs a cost-effectiveness argument, but applications that frame themselves primarily on cost savings need the supporting data to hold up. Vague savings estimates based on internal assumptions rarely survive review.

3. Interoperability gaps

Interoperability is a quiet killer. A startup building for an international market tends to optimise for the lowest common denominator of integration. A DiGA application must support the German health IT landscape, including relevant interoperability standards referenced by BfArM and the interoperability directory.

Common gaps:

  • No support for accepted standards for data export in patient-readable and machine-readable form.
  • No plan for integration with the electronic patient record (elektronische Patientenakte, ePA).
  • German-language support that falls short of the standard BfArM expects.

These are not MDR requirements. They are DiGA requirements. Founders who treat them as optional polish find out during review that they are not.
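As an illustration of what machine-readable export means in practice, here is a minimal sketch in the general shape of a FHIR Observation resource. The field names follow the public FHIR base specification; the concrete profiles a DiGA must actually implement are those referenced by BfArM and the interoperability directory, and the coding value below is a deliberate placeholder, not a real code.

```python
import json

def export_observation(patient_ref: str, code: str, value: float,
                       unit: str) -> str:
    """Sketch of a machine-readable export in the general shape of a
    FHIR Observation. The profiles a DiGA must actually support are
    the ones BfArM references; this structure is illustrative only."""
    resource = {
        "resourceType": "Observation",
        "status": "final",
        # "example-code" below is a placeholder, not a real LOINC code
        "code": {"coding": [{"system": "http://loinc.org", "code": code}]},
        "subject": {"reference": patient_ref},
        "valueQuantity": {"value": value, "unit": unit},
    }
    return json.dumps(resource, indent=2)

print(export_observation("Patient/example", "example-code", 11.0, "score"))
```

The point is not this particular JSON shape. The point is that a patient-readable PDF alone does not satisfy the requirement: BfArM expects a structured, standards-based export alongside it.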

4. Data protection and data security shortfalls

Data protection failures rarely come from one obvious mistake. They come from small gaps that together fall below the threshold. Common patterns:

  • Processors based outside the EU without a defensible legal basis.
  • Incomplete DPIA (data protection impact assessment).
  • Missing or outdated TOMs (technical and organisational measures).
  • Data minimisation claims in the privacy notice that do not match the actual telemetry collected by the app.
  • No clear deletion path on user request.

On top of GDPR, DiGA has its own data protection criteria administered through a structured BfArM checklist, and (depending on the risk profile) may require adherence to BSI-aligned security expectations. Startups that meet GDPR but have not separately mapped against the DiGA criteria frequently find gaps.

The Subtract to Ship playbook

Felix's Subtract to Ship approach is useful here because rejection prevention is a subtraction problem. The goal is not to add more features. It is to remove the predictable failure modes before submission.

Step 1: Map your evidence to BfArM, not to a notified body. Print the current BfArM DiGA guide. For each required evidence area, mark what you have and what is missing. Do this honestly, with someone other than the founder doing the review.

Step 2: If you do not have a comparative clinical study, plan one now. Even if you intend to use the provisional listing window to generate evidence, you need a credible study design and a realistic timeline before you submit. A study that is still being designed during the provisional window is a study that will miss the window.

Step 3: Separate the medical benefit argument from the economic argument. If you are claiming medical benefit, your endpoints must be patient-relevant and measurable. If you are claiming structural-procedural improvement, your argument must be about care process changes, not cost savings. Mixing the two in a single submission is a red flag.

Step 4: Audit your interoperability story against the German stack. Specifically, your data export formats, ePA integration plan, and German-language coverage. If you do not have German-speaking support in-house, get it before submission, not after.

Step 5: Run a structured data protection review against the DiGA criteria, not just GDPR. Commission a DPIA if you have not already. Update your TOMs. Reconcile your privacy notice with your actual data flows. Identify and relocate non-EU processing where necessary.
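One part of Step 5, reconciling the privacy notice with actual data flows, can be mechanised as a simple set difference. The field names below are hypothetical; in practice the declared set comes from the privacy notice and the observed set from a traffic capture or telemetry audit of the running app.

```python
def telemetry_gaps(declared: set[str], observed: set[str]) -> set[str]:
    """Fields the app actually collects that the privacy notice does
    not declare -- each one is a potential review finding."""
    return observed - declared

# Hypothetical field names, for illustration only
declared = {"age", "symptom_score", "session_count"}      # from the privacy notice
observed = {"age", "symptom_score", "session_count",
            "device_model", "ip_address"}                 # from a traffic capture

print(sorted(telemetry_gaps(declared, observed)))  # ['device_model', 'ip_address']
```

Running this comparison before submission turns a vague "our notice is probably accurate" into a concrete, fixable list.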

Step 6: Subtract features that add risk without adding evidence. Every feature is a potential interoperability, usability, or data protection finding. Founders often add features to strengthen the product in the eyes of investors, and in doing so add failure modes in the eyes of BfArM. The smallest product that can demonstrate a positive healthcare effect is usually the strongest application.

Step 7: Speak to rejected applicants, not just successful ones. Tibor's recommendation is clear. Rejection stories are where the real lessons live. A 30-minute conversation with a manufacturer whose application failed is worth more than a year of reading BfArM FAQs.

Reality Check

A diagnostic for founders who believe their application is ready.

  1. Is your clinical evidence a comparative study with patient-relevant endpoints, or is it an MDR clinical evaluation dressed up in DiGA language?
  2. Can you state your positive healthcare effect in one sentence, and does that sentence mention a specific endpoint or a specific care process change?
  3. Have you mapped your evidence against the current BfArM DiGA guide section by section?
  4. Does your application have a German-language version and a documented ePA integration plan?
  5. Has a data protection specialist (not just your general counsel) reviewed the application against the DiGA-specific data protection criteria?
  6. Have you spoken to at least one DiGA applicant whose first submission was rejected or sent back with substantive findings?
  7. If your submission entered the provisional listing window, is your comparative study already designed, ethics-approved, and contracted with a site?
  8. Would you bet your runway on this application being accepted on first submission, or are you budgeting for at least one revision cycle?

If more than two of these questions return an uncomfortable answer, the submission is not ready.
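The diagnostic above can be run as a literal checklist. This sketch encodes the eight questions (wording condensed) and the "more than two uncomfortable answers" rule; it is a self-audit aid, not a predictor of BfArM's decision.

```python
# Condensed from the eight diagnostic questions above
CHECKLIST = [
    "Comparative study with patient-relevant endpoints",
    "One-sentence positive healthcare effect naming an endpoint or process",
    "Evidence mapped section by section against the BfArM DiGA guide",
    "German-language version and documented ePA integration plan",
    "DiGA-specific data protection review by a specialist",
    "Spoken to at least one rejected or revised applicant",
    "Comparative study designed, ethics-approved, site-contracted",
    "Runway budgeted for at least one revision cycle",
]

def submission_ready(answers: dict[str, bool],
                     max_uncomfortable: int = 2) -> bool:
    """Apply the rule above: more than two uncomfortable (False or
    unanswered) items means the submission is not ready."""
    uncomfortable = sum(1 for item in CHECKLIST if not answers.get(item, False))
    return uncomfortable <= max_uncomfortable

print(submission_ready({item: True for item in CHECKLIST}))  # True
print(submission_ready({}))                                  # False
```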

Frequently Asked Questions

What is the most common reason DiGA applications are rejected? Based on patterns Tibor has heard from manufacturers, insufficient clinical evidence for a positive healthcare effect is the most consistently cited failure mode. Specific population-level statistics should be verified against the latest BfArM DiGA report.

Does a CE mark under MDR guarantee DiGA acceptance? No. CE marking is a prerequisite. It confirms MDR conformity. It does not substitute for the DiGA-specific evidence on positive healthcare effect, interoperability, and data protection.

Is a small single-arm study enough for positive healthcare effect? Usually not. BfArM expects a comparative design for medical benefit claims. Structural-procedural improvement claims may have different evidence designs, but the bar is still defined.

What data protection requirements apply beyond GDPR? DiGA has its own data protection and data security criteria, assessed through a structured process, and may reference BSI-aligned security expectations depending on the product's risk profile.

Can we submit to DiGA if we are not a German company? Yes. There is no requirement to be a German legal entity, but the application must meet German interoperability, language, and data protection expectations, which startups outside Germany consistently underestimate.

What should we do if our application is rejected? Read the decision carefully, address the findings systematically, and use the experience before your next submission. Rejections are signal, not failure. Founders who treat them as data improve materially on the second attempt.

Sources

  1. Regulation (EU) 2017/745 on medical devices, consolidated text. Article 2, Article 51, Article 61, Annex VIII Rule 11, Annex XIV.
  2. BfArM, The Fast-Track Process for Digital Health Applications (DiGA) according to § 139e SGB V – A Guide for Manufacturers, Service Providers and Users. Current version.
  3. Digitale Gesundheitsanwendungen-Verordnung (DiGAV), consolidated version.
  4. SGB V (German Social Code, Book V), digital health application provisions.
  5. BfArM DiGA annual report(s) for aggregate rejection and listing statistics.
  6. MDCG 2019-16 Rev.1, Guidance on Cybersecurity for Medical Devices, December 2019, Rev.1 July 2020.
  7. EN IEC 81001-5-1:2022, Health software and health IT systems safety, effectiveness and security.