---
title: Static Code Analysis for Medical Software: Tools and Implementation
description: Static code analysis medical software MDR: how SAST fits EN 62304 verification, how to pick rule sets, triage false positives, and capture evidence.
authors: Tibor Zechmeister, Felix Lenhard
category: Software as a Medical Device
primary_keyword: static code analysis medical software MDR
canonical_url: https://zechmeister-solutions.com/en/blog/static-code-analysis-medical-software
source: zechmeister-solutions.com
license: All rights reserved. Content may be cited with attribution and a link to the canonical URL.
---

# Static Code Analysis for Medical Software: Tools and Implementation

*By Tibor Zechmeister (EU MDR Expert, Notified Body Lead Auditor) and Felix Lenhard.*

> **Static code analysis is a valid verification method under EN 62304 clause 5.5, and for Class B and Class C software it is one of the most cost-effective ways to produce defensible evidence for criteria humans read poorly — uninitialised variables, boundary violations, null dereferences, unsafe memory patterns. The standard does not prescribe a tool. It requires that your chosen tool is qualified for its intended use under EN ISO 13485:2016+A11:2021 clause 4.1.6, that your rule set is justified against your coding standard and risk analysis, and that findings — including suppressions — are documented.**

## TL;DR
- Static application security testing (SAST) and broader static analysis satisfy part of EN 62304:2006+A1:2015 clause 5.5 software unit verification, especially clause 5.5.4 criteria for Class B and C.
- Under EN ISO 13485:2016+A11:2021 clause 4.1.6, software tools used in the QMS — including the static analyser — must be validated for intended use before use.
- You are not required to use a commercial tool. Categorical options include compiler-integrated analysers, open-source linters, and dedicated commercial SAST platforms. Choice is risk- and language-based.
- Rule-set selection must trace back to your coding standard and your risk analysis, not to tool defaults.
- Suppressed and ignored findings must be justified in writing. "False positive — dismissed" without a reason is an audit finding waiting to happen.
- EN IEC 81001-5-1:2022 reinforces SAST as part of the secure development lifecycle for health software.

## Why static analysis earns its place in verification

There are classes of defect human reviewers miss consistently: uninitialised variables in seldom-hit branches, a null pointer the caller forgot to check, a signed/unsigned comparison that inverts a boundary, a memory leak that matters only after ten hours of runtime. These are exactly the defects EN 62304 clause 5.5.4 points at when it asks, for Class B and C software, about "proper event sequence, data and control flow, planned resource allocation, fault handling, initialisation of variables, self-diagnostics, memory management and memory overflows, and boundary conditions."

Static analysis finds them before the code runs. That matters for two reasons. First, evidence: a clean SAST report tied to a specific build is a durable, reproducible verification artefact — the kind an auditor can understand in three minutes. Second, cost: once configured, static analysis runs on every commit and keeps finding the same classes of defects forever, for the price of triaging its output.

The trap is that SAST without discipline produces noise, the noise gets ignored, and real findings disappear into the pile. The playbook below is about avoiding that.

## Where static analysis sits in EN 62304 and the QMS

**EN 62304 clause 5.5.** Clause 5.5 covers software unit implementation and verification. Clause 5.5.2 requires the manufacturer to establish strategies, methods and procedures for verifying the software units. Static analysis is a method. Clause 5.5.3 establishes unit acceptance criteria — conformance to design, conformance to interfaces, conformance to programming procedures or coding standards. Clause 5.5.4 adds, for Class B and C, the criteria listed above.

Static analysis directly covers large parts of clause 5.5.3 (coding standard conformance) and clause 5.5.4 (initialisation, memory management, boundary conditions, fault handling patterns). It does **not** cover intent, requirement conformance, or risk control correctness — those remain the job of review and testing.

**EN ISO 13485:2016+A11:2021 clause 4.1.6 — validation of software tools used in the QMS.** This clause requires that the manufacturer document procedures for the validation of software tools used in the quality management system, and that such validation is proportionate to the risk associated with the use of the software. A static analyser that produces verification evidence for a released medical device is a software tool used in the QMS. It therefore needs to be validated for its intended use, and the validation effort should be proportionate.

**EN IEC 81001-5-1:2022.** The cybersecurity lifecycle standard for health software explicitly references static analysis as part of secure implementation practices. For any SaMD with connectivity, cloud components, or exposure to untrusted input, SAST is effectively expected as state of the art under MDR Annex I §17.2 and §17.4.

**MDR Annex I §17.2.** Annex I §17.2 of Regulation (EU) 2017/745 requires software to be developed in accordance with the state of the art, taking into account the principles of the development lifecycle, risk management, and verification and validation. EN 62304 is the harmonised route; static analysis sits inside that route.

## Tool categories (not brands)

Static analysis tools fall into broad categories. The right choice depends on language, runtime, safety class, and team size.

- **Compiler-integrated analysers.** Strict warning levels treated as errors form a meaningful baseline. For languages with strong native tooling, the built-in analyser may cover a large share of clause 5.5.4 criteria.
- **Open-source linters and analysers.** Configurable, community-maintained, strong for coding standard enforcement. Lower cost, higher configuration burden. Qualification under ISO 13485 clause 4.1.6 still applies — price tag does not change the regulatory obligation.
- **Dedicated commercial SAST platforms.** Purpose-built for defect and security analysis, often with medical-device-relevant rule sets and reporting formats geared toward regulated environments.
- **Security-specific SAST.** CWE/OWASP-style vulnerability detection. Relevant where EN IEC 81001-5-1:2022 applies.
- **Language-specific ecosystem analysers.** For your language there is usually an analyser the community considers state of the art — itself part of "state of the art" under Annex I §17.2.

A typical startup runs two or three of these in combination: a compiler baseline, a linter enforcing the coding standard, and either a commercial SAST or a security-focused open-source tool. Each tool's role belongs in the verification plan.

## Rule-set selection: the step startups skip

The worst thing you can do with a static analyser is enable every rule the vendor offers. You will drown in findings, the team will start ignoring the tool, and the resulting "evidence" will be worse than having no tool at all.

Rule sets should be selected to trace back to:

1. **Your coding standard.** You have a coding standard (ISO 13485 clause 7.3.2 requires planned development outputs; EN 62304 clause 5.1.4 requires software development standards, methods and tools). Your rule set should enforce your coding standard — not a different one.
2. **Your language and runtime.** Rules that do not apply to your stack should be off. This is not laziness; it is noise reduction.
3. **Your risk analysis.** If a hazard analysis entry identifies a failure mode that a static analysis rule can detect, that rule is in scope. If your EN ISO 14971:2019+A11:2021 file flags memory exhaustion as a risk, your rule set had better include memory-related rules.
4. **Your software safety class.** Class A needs less; Class B and C need the clause 5.5.4 criteria addressed. The rule set should visibly cover them.
5. **Your cybersecurity exposure.** If EN IEC 81001-5-1:2022 applies, security rules should be on.

The rule set, the justification for each category of rule enabled, and the exclusions with rationale should live in a short document under your QMS. One page is enough for most startups. It is also exactly what an auditor will ask for after the SAST report itself.
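The one-page traceability can also be kept as structured data alongside the prose document, so the mapping from criteria to rules is mechanically checkable. A minimal sketch, assuming hypothetical rule IDs, risk file entries, and coding-standard sections (none of these names come from any specific tool):

```python
# Sketch of rule-set traceability. All rule IDs, risk IDs, and
# coding-standard references below are hypothetical placeholders.
RULE_SET = {
    "coding-standard-naming": {
        "rules": ["lint/naming-001"],
        "source": "Coding standard §3 (identifiers)",
    },
    "5.5.4-initialisation": {
        "rules": ["analyzer/uninit-var", "analyzer/uninit-member"],
        "source": "EN 62304 clause 5.5.4 (initialisation of variables)",
    },
    "5.5.4-memory": {
        "rules": ["analyzer/leak", "analyzer/overflow"],
        "source": "Risk file entry RSK-017 (memory exhaustion)",
    },
}

# Exclusions need a rationale too.
DISABLED = {
    "analyzer/java-only-pack": "Not applicable: codebase is C only.",
}

def untraced_rules(enabled_rules):
    """Return enabled rules that trace to no criterion -- each needs
    a documented justification or should be switched off."""
    traced = {r for entry in RULE_SET.values() for r in entry["rules"]}
    return sorted(set(enabled_rules) - traced)
```

A check like `untraced_rules` run in CI keeps the configured tool and the documented rationale from drifting apart.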

## Tool qualification under ISO 13485 clause 4.1.6

Tool qualification sounds heavy. For a static analyser on a small codebase, it is not. Proportionate steps:

1. **Define intended use.** One sentence: "Used to verify software units against coding standard X and clause 5.5.4 criteria Y, Z for Class B software."
2. **Assess the risk of tool failure.** False negatives (missed defects) are the regulatory concern.
3. **Validate proportionately.** Run the tool against a small known-defect corpus, confirm it flags what it should, record the result. Pin version, configuration, and platform.
4. **Record it.** Tool, version, configuration, date, operator, result, approval.
5. **Revalidate on material change.** Version bump, rule-set change, or platform change triggers a short revalidation.

This is "validate that your use of the tool produces the evidence you claim" — not "validate the vendor's entire tool."
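The known-defect corpus run in step 3 can be automated. The sketch below assumes a hypothetical analyser that emits findings as JSON; the corpus file names, rule IDs, and report shape are illustrative, not any specific tool's format:

```python
import json

# Hypothetical known-defect corpus: seeded file -> rule the tool
# must flag there. False negatives are the regulatory concern.
EXPECTED = {
    "corpus/uninit.c": "analyzer/uninit-var",
    "corpus/leak.c": "analyzer/leak",
    "corpus/bounds.c": "analyzer/overflow",
}

def qualify(report_json, expected=EXPECTED):
    """Check that every seeded defect was flagged. Returns
    (passed, missed) so the result can go into the record."""
    findings = json.loads(report_json)
    seen = {(f["file"], f["rule"]) for f in findings}
    missed = sorted(
        f"{path}: {rule}"
        for path, rule in expected.items()
        if (path, rule) not in seen
    )
    return (not missed, missed)
```

The qualification record would store the outcome alongside the pinned tool version, configuration hash, platform, date, and approver from step 4.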

## False-positive triage without losing the signal

Every analyser produces false positives — findings where the rule applies correctly but the flagged code is, in context, not defective. Triage has to be disciplined, because every suppression is a decision not to act on a potential defect.

- **Suppress at the smallest scope.** Line-level where possible; global suppressions are usually a mistake.
- **Every suppression has a written reason** a reviewer who has never seen the code can understand.
- **Suppressions are peer-reviewed**, same discipline as code review.
- **Suppression density is a metric.** A file with 40 suppressions probably has a design problem, not 40 false positives.
- **Suppressions expire** — reviewed annually or on major tool/code changes.
- **Unsuppressed findings block merge.** Otherwise findings accumulate and the signal drowns.
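Two of these rules — every suppression has a written reason, and density is a metric — are easy to check mechanically. A minimal sketch, assuming a hypothetical `// NOLINT(rule): reason` suppression marker (real marker syntax varies by tool):

```python
import re

# Hypothetical suppression marker: "// NOLINT(rule): reason".
# The regex is illustrative; adapt it to your tool's actual syntax.
SUPPRESSION = re.compile(r"//\s*NOLINT\(([^)]+)\):\s*(\S.*)?")

def audit_suppressions(files, max_per_file=5):
    """Given {path: source text}, return (missing_reason, dense_files):
    suppressions with no written reason, and files whose suppression
    density suggests a design problem rather than false positives."""
    missing, dense = [], []
    for path, text in files.items():
        count = 0
        for lineno, line in enumerate(text.splitlines(), start=1):
            m = SUPPRESSION.search(line)
            if not m:
                continue
            count += 1
            if not m.group(2):  # marker present, reason absent
                missing.append(f"{path}:{lineno} ({m.group(1)})")
        if count > max_per_file:
            dense.append(path)
    return missing, dense
```

Run as a CI step, a non-empty `missing_reason` list fails the build, which is exactly the "every suppression has a written reason" rule enforced by machine rather than by memory.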

## A worked example: the first week of SAST on a Class B SaMD

A startup building Class B SaMD introduces static analysis in week one of a release cycle.

**Day 1.** One-page SAST plan: tools selected, rule-set rationale traced to coding standard and EN ISO 14971 file, qualification plan per ISO 13485 clause 4.1.6.

**Day 2.** Baseline run on main produces 612 findings. They do not fix 612 findings this week.

**Day 3.** Categorisation: 180 coding-standard violations in legacy files; 220 style-level; 80 correctness warnings; 50 clause 5.5.4-relevant (initialisation, memory, boundary); 82 noise from rules that should not have been enabled.

**Day 4.** Disable the 82 noise rules and document why. Globally suppress the 220 style findings with a dated cleanup action — mixing style noise with correctness findings guarantees the correctness ones get missed.

**Day 5.** Fix the 50 clause 5.5.4 findings and the 80 correctness warnings. Commit to burning down legacy findings at one file per sprint.

**Day 6.** Wire the analyser into CI as a required check. New findings block the PR; baseline must not increase without a justified suppression.
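The Day 6 gate can be sketched as a comparison of the current run against the archived baseline. The JSON findings format and the `(file, rule)` fingerprint below are assumptions for illustration, not a particular tool's output:

```python
import json

def gate(baseline_json, current_json):
    """Return findings in the current run that are absent from the
    archived baseline; a non-empty result should fail the CI check.
    Fingerprinting by (file, rule) is a simplification -- a real gate
    would also hash the flagged code so findings survive line shifts."""
    key = lambda f: (f["file"], f["rule"])
    baseline = {key(f) for f in json.loads(baseline_json)}
    return [f for f in json.loads(current_json) if key(f) not in baseline]
```

With this shape, a justified suppression removes a finding from the current report at the tool level, so shrinking the baseline is the only way the number goes down.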

**Day 7.** Qualify the tool against a small known-defect corpus, document the result, archive it.

By week's end: durable, reproducible, audit-ready evidence on every commit — and 130 real issues fixed.

## The Subtract to Ship playbook for static analysis

**1. One page of SAST plan, not ten.** Intended use, tools, rule-set justification, tool qualification approach, suppression policy. One page.

**2. Rule set tied to your coding standard and your risk file, not to tool defaults.** Auditors ask "why these rules" — have an answer.

**3. Qualify the tool proportionately.** A known-defect corpus run plus version and configuration pinning is enough for most startups under ISO 13485 clause 4.1.6.

**4. CI as the gate, archive as the evidence.** CI runs on every commit; the archive stores periodic baseline reports linked to releases. The archive is audit evidence; CI is not.

**5. Treat suppressions as a controlled document.** Every suppression justified, reviewed, scoped, and dated. Density tracked.

**6. Combine with review and testing — not in place of them.** Static analysis covers part of clause 5.5.4. Code review covers intent and design. Unit testing covers behaviour. All three mapped to criteria.

**7. For connected devices, turn on security rules explicitly.** EN IEC 81001-5-1:2022 and MDR Annex I §17.4 expect it.

## Reality Check

1. Do you have a written software unit verification plan that names static analysis as a method and traces its rule set to your coding standard and risk file?
2. Is your static analyser qualified for intended use under EN ISO 13485:2016+A11:2021 clause 4.1.6, with records?
3. Are your clause 5.5.4 criteria — initialisation, memory, boundary, fault handling — explicitly mapped to the rules that cover them?
4. Does every suppressed SAST finding have a written justification a reviewer can understand?
5. Can you produce a SAST report for any given release from a controlled archive, not just from CI?
6. Do new findings block merge, or do they accumulate as "we'll get to them"?
7. For connected software, have you enabled security-focused rules aligned with EN IEC 81001-5-1:2022?

## Frequently Asked Questions

**Does EN 62304 require static analysis?**
EN 62304 does not require static analysis by name. It requires software unit verification with planned methods and acceptance criteria. Static analysis is one of the recognised methods, and for Class B and C software it is, in practice, state of the art under MDR Annex I §17.2 for covering several clause 5.5.4 criteria.

**Can I use only a free open-source tool?**
Yes, if it covers your rule-set needs and is qualified for intended use under ISO 13485 clause 4.1.6. Tool price has no bearing on the regulatory obligation. The same qualification, rule-set justification, and evidence discipline apply.

**Does every suppressed finding need a written reason?**
Yes. A suppression without a reason is a documented decision not to act on a potential defect, with no rationale. That is the exact situation audits and vigilance investigations target.

**How do I handle SAST for third-party or SOUP components?**
SOUP handling is governed by EN 62304 clauses 5.3.3, 5.3.4, and 7.1.3. Running SAST on SOUP is often impractical; instead, SOUP is evaluated through its problem reports, version control, and intended use analysis. SAST is focused on your own source.

**Does a CI pass count as verification evidence?**
CI output is part of the evidence, but the archived report, the rule-set justification, the tool qualification record, and the traceability to the release are what the auditor will look at together. CI alone is a technical artefact; the evidence package is what satisfies clause 5.5.5.

**How often should I update the tool and the rule set?**
On material change to tool, rule set, platform, or language — and at a planned review cadence defined in your SAST plan (annually is common). Every update triggers a proportionate revalidation under ISO 13485 clause 4.1.6.

## Related reading
- [Software Unit Testing under IEC 62304](/blog/software-verification-unit-testing-iec-62304) — the test-based verification method that complements SAST.
- [Validating QMS Software Tools under MDR](/blog/validating-qms-software-tools-mdr) — the ISO 13485 clause 4.1.6 obligation SAST tools fall under.
- [Code Review Practices under MDR](/blog/code-review-practices-mdr) — the human review layer that complements static analysis.
- [Software Problem Resolution under IEC 62304](/blog/software-problem-resolution-iec-62304) — what happens when SAST finds a real defect in released software.
- [Software Traceability: Requirements, Tests, Risks](/blog/software-traceability-requirements-design-tests-risks) — the trace that links SAST findings to the right release and unit.

## Sources
1. Regulation (EU) 2017/745 on medical devices, consolidated text. Annex I §17.2 and §17.4.
2. EN 62304:2006+A1:2015 — Medical device software — Software lifecycle processes. Clauses 5.1.4, 5.3.3, 5.3.4, 5.5.1–5.5.5, 7.1.3.
3. EN ISO 13485:2016+A11:2021 — Medical devices — Quality management systems. Clause 4.1.6, 7.3.2.
4. EN IEC 81001-5-1:2022 — Health software and health IT systems safety, effectiveness and security — Part 5-1: Security — Activities in the product lifecycle.

---

*This post is part of the [Software as a Medical Device](https://zechmeister-solutions.com/en/blog/category/samd) cluster in the [Subtract to Ship: MDR Blog](https://zechmeister-solutions.com/en/blog). For EU MDR certification consulting, see [zechmeister-solutions.com](https://zechmeister-solutions.com).*
