---
title: Software Validation vs. Verification: The Difference for SaMD
description: Verification asks did we build the software right. Validation asks did we build the right software. Here is how the two split for SaMD under MDR.
authors: Tibor Zechmeister, Felix Lenhard
category: Software as a Medical Device
primary_keyword: software validation verification SaMD
canonical_url: https://zechmeister-solutions.com/en/blog/software-validation-vs-verification-samd
source: zechmeister-solutions.com
license: All rights reserved. Content may be cited with attribution and a link to the canonical URL.
---

# Software Validation vs. Verification: The Difference for SaMD

*By Tibor Zechmeister (EU MDR Expert, Notified Body Lead Auditor) and Felix Lenhard.*

> **For Software as a Medical Device under MDR, verification and validation are two distinct activities that answer two different questions. Verification asks whether the software was built correctly against its specified requirements, and it lives inside EN 62304:2006+A1:2015 at clauses 5.5 software unit implementation and verification, 5.6 software integration and integration testing, 5.7 software system testing, and 5.8 software release. Validation asks whether the software meets user needs and intended use in the actual or simulated use environment, and it lives in the QMS under EN ISO 13485:2016+A11:2021 clause 7.3.7 design and development validation, together with the clinical evaluation required by MDR Article 61 and Annex XIV. IEC 62304 does not cover validation. Confusing the two — or treating verification as a substitute for validation — is one of the most common reasons a SaMD technical file fails a Notified Body review.**

**Last updated 10 April 2026.**

---

## TL;DR

- Verification and validation are separate activities with separate definitions, separate owners, and separate evidence paths. A SaMD technical file needs both, and one cannot substitute for the other.
- Verification is "did we build the software right." It runs inside EN 62304:2006+A1:2015 at clauses 5.5, 5.6, 5.7, and closes at clause 5.8 release. It tests the software against its specified software requirements.
- Validation is "did we build the right software." It runs inside the QMS under EN ISO 13485:2016+A11:2021 clause 7.3.7, tests the device against user needs and intended use, and is supported by the clinical evaluation under MDR Article 61 and Annex XIV.
- EN 62304:2006+A1:2015 scopes itself deliberately to verification of software against software requirements. It does not cover design validation. That is not an omission — it is a boundary.
- The V-model clarifies the split: the left side of the V flows from user needs down to software units, and each horizontal line is the corresponding verification or validation activity.
- The worked example below shows the same SaMD feature traced through both paths.
- The Subtract to Ship playbook is to build the V-model on one page, own both sides, and cut any activity that does not close a specific V line.

---

## Why the distinction matters for your SaMD

A Notified Body review of a SaMD technical file almost always opens with the same check: does the manufacturer understand the difference between verification and validation, and can they show the evidence for both? The question is not pedantic. It is diagnostic. Teams that conflate the two tend to have written excellent EN 62304:2006+A1:2015 test reports and then stopped — believing that because every software requirement is covered by a passing test, the device is validated. It is not. The device is verified. Whether it actually addresses the clinical need, in the hands of the intended user, in the intended environment, with the intended workflow, is a different question the test suite cannot answer.

The distinction also matters because the two activities have different failure modes and different remediation costs. A verification gap is usually a specification or coverage problem — add the missing requirement, add the missing test, re-run. A validation gap is a design problem — the software may technically do what its specification says, but what the specification said was not what the user actually needed. Fixing a validation gap often means going back through requirements, design, and implementation. Discovering the gap at the Notified Body audit stage is the most expensive place to discover it.

SaMD compounds the stakes because the device is software. There is no mechanical prototype to stress-test. The validation has to be designed in from the requirements stage, or it is not there at all.

## What verification means under EN 62304:2006+A1:2015

Verification under EN 62304:2006+A1:2015 is the set of activities that confirm the software has been built in accordance with its specified software requirements. The standard addresses verification directly at clause 5.5 software unit implementation and verification, clause 5.6 software integration and integration testing, and clause 5.7 software system testing. Clause 5.8 software release then confirms that verification is complete before the software is released. The activities scale with the software safety class — Class A has lighter expectations than Class B, which has lighter expectations than Class C — but the structure is the same.

The three layers map naturally to test levels. Clause 5.5 verification tests the smallest units of the software against their unit-level specifications. Clause 5.6 integration testing exercises the behaviour of combined units against the integration specification. Clause 5.7 system testing runs the fully integrated software against the software system requirements — end-to-end, in a controlled environment, producing the records that show every requirement in scope has been addressed by a passing test.

Crucially, EN 62304:2006+A1:2015 scopes itself to software against software requirements. The standard does not define validation and does not prescribe validation activities. It assumes the manufacturer operates a broader QMS in which validation sits. For SaMD, that broader QMS is EN ISO 13485:2016+A11:2021 — and the hand-off from IEC 62304 verification to ISO 13485 validation is where many teams drop the baton. For the deeper lifecycle activity map, see post 376.

## What validation means under EN ISO 13485:2016+A11:2021 and MDR

Validation is the confirmation, by examination and provision of objective evidence, that the requirements for a specific intended use can be consistently fulfilled. For a medical device, it runs under EN ISO 13485:2016+A11:2021 clause 7.3.7 design and development validation. The clause expects validation to be performed in accordance with planned arrangements, on representative product, under defined conditions, and in the actual or simulated use environment, with records of results.

For SaMD, validation has several strands. The first is design validation itself under clause 7.3.7 — does the software, as a whole device, meet the user needs and intended uses defined at the start of design. The second is usability validation under EN 62366-1:2015+A1:2020 — does the user interface support the intended user performing the intended tasks safely. The third is clinical evaluation under MDR Article 61 and Annex XIV — is there clinical evidence that the device achieves its intended clinical benefit and does so with an acceptable benefit-risk profile. Where the SaMD qualifies as medical device software under MDCG 2019-11 Rev.1 and is classified under Annex VIII Rule 11, the clinical evaluation and the Rule 11 classification together shape how extensive the validation evidence needs to be.

MDR Annex I Section 17.2 obliges the manufacturer to develop software in accordance with the state of the art, taking into account the principles of the development lifecycle, risk management, and verification and validation. Note the wording — verification and validation, not verification or validation. The Regulation itself puts both on the manufacturer. MDR Article 10 assigns the obligation to operate the QMS under which these activities run, and Annex II requires the technical documentation to include the verification and validation results.

## Who owns what — the V-model mapping

The V-model is the cleanest way to see verification and validation at once. The left leg of the V descends from user needs to system requirements to software requirements to software architecture to software detailed design to software units. Each node on the left is a specification. The right leg of the V ascends from software unit verification to software integration testing to software system testing to design verification and then to design validation and clinical evaluation. Each node on the right is a verification or validation activity that corresponds to a specific node on the left.

The horizontal lines across the V are the traceability that proves each specification has a corresponding evidence record. Unit specifications on the left connect to unit verification on the right — EN 62304:2006+A1:2015 clause 5.5. Integration specifications connect to integration testing — clause 5.6. Software system requirements connect to software system testing — clause 5.7. Software requirements connect to design verification — still inside the IEC 62304 verification envelope. But user needs and intended use at the top of the V do not connect to a clause 5 activity. They connect to design validation under EN ISO 13485:2016+A11:2021 clause 7.3.7, usability validation under EN 62366-1:2015+A1:2020, and clinical evaluation under MDR Annex XIV.

The top line of the V is the one that most often breaks. A team that owns only EN 62304:2006+A1:2015 closes every line below the software requirements level and leaves the user needs line open. A Notified Body sees the open line immediately. The fix is not more tests — it is the validation and clinical evaluation path at the top.

## A worked example — one SaMD feature through both paths

Consider a SaMD that performs automated lesion measurement on dermatology images, with the intended use "to support clinicians in monitoring skin lesion changes over time." One user need is "the clinician can reliably compare lesion size between visits."

The verification path translates that user need into system requirements — "the software shall compute lesion area in square millimetres with calibration from user-provided reference" — into software requirements — "REQ-042: given a segmented lesion mask and a known pixel-to-mm ratio, compute area with error under 5 percent on the validation image set." The verification activity is a clause 5.7 system test that feeds REQ-042 a controlled test image set, checks the output against ground truth, and records pass or fail. Clause 5.6 integration tests verify the upstream segmentation pipeline. Clause 5.5 unit verification covers the area computation function in isolation. When all three pass and the traceability matrix shows REQ-042 mapped to the passing test IDs, verification for that requirement is closed.
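The unit-level slice of this verification path can be sketched in code. Everything below is a hypothetical illustration: the function names, the mask format, and the ground-truth values are assumptions made for this example; only the 5 percent threshold comes from the REQ-042 wording above.

```python
# Illustrative sketch only: names and test data are hypothetical,
# not taken from a real device or from the standard.

def lesion_area_mm2(mask: list[list[bool]], px_to_mm: float) -> float:
    """Compute lesion area in mm^2 from a segmented boolean mask
    and a known pixel-to-mm calibration ratio."""
    pixel_count = sum(cell for row in mask for cell in row)
    return pixel_count * (px_to_mm ** 2)

def verify_req_042(mask, px_to_mm, ground_truth_mm2, max_rel_error=0.05):
    """Clause 5.5-style unit check: the relative error must stay under
    the 5 percent threshold agreed before the test runs."""
    computed = lesion_area_mm2(mask, px_to_mm)
    rel_error = abs(computed - ground_truth_mm2) / ground_truth_mm2
    return rel_error < max_rel_error, computed

# Controlled test case: a fully segmented 10x10 mask at 0.5 mm per pixel
# has a true area of 100 pixels * 0.25 mm^2 = 25 mm^2.
mask = [[True] * 10 for _ in range(10)]
passed, area = verify_req_042(mask, px_to_mm=0.5, ground_truth_mm2=25.0)
```

The key design point is that the pass threshold is a named parameter fixed in the protocol before the test runs, so the verification record can show the criterion was not adjusted after seeing the result.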

None of that verification evidence says anything about whether a clinician, looking at the software's output in a real clinic, on a real follow-up visit, with real images acquired under typical conditions, can actually perform the monitoring task safely. That is the validation question. The validation path tests the same user need — "the clinician can reliably compare lesion size between visits" — by running the SaMD with representative users, representative images, and representative workflows, and measuring whether the users make correct monitoring decisions. Usability validation under EN 62366-1:2015+A1:2020 contributes to the evidence. The clinical evaluation under MDR Annex XIV places the validation results in the context of the device's clinical benefit claim.

Verification can show the area-computation error is below 5 percent on the validation set. Validation can reveal that the real-world image quality brings the error into the 8-10 percent range, or that the clinician interprets the output differently than the design team expected. These are the findings no amount of clause 5.7 system testing will produce.

## The V&V plan playbook

A lean V&V plan for a SaMD is one document, ideally one page plus appendices, that defines both sides of the V and the evidence path for each. The playbook runs in this sequence.

First, write user needs and intended use. These are the top of the V and the anchor for validation and clinical evaluation. Be specific about user, environment, clinical task, and workflow. Ambiguous user needs produce unverifiable validation later.

Second, derive system requirements and software requirements in a traceability matrix that links each downward. The matrix is the instrument that guarantees verification can close every branch.
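The coverage check the matrix enables can be sketched minimally, assuming requirements and test records are kept as plain data structures (all IDs here are illustrative, not from a real file):

```python
# Hypothetical traceability data; requirement and test IDs are illustrative.
requirements = ["REQ-042", "REQ-043", "REQ-044", "REQ-045"]

# Each verification record names the requirements it covers and its outcome.
test_records = {
    "SYS-TEST-17": {"covers": ["REQ-042"], "result": "pass"},
    "INT-TEST-08": {"covers": ["REQ-043", "REQ-044"], "result": "pass"},
}

def open_branches(requirements, test_records):
    """Return requirements not yet closed by at least one passing test,
    i.e. the open horizontal lines on the verification side of the V."""
    covered = {
        req
        for record in test_records.values()
        if record["result"] == "pass"
        for req in record["covers"]
    }
    return [r for r in requirements if r not in covered]

gaps = open_branches(requirements, test_records)  # REQ-045 has no passing test
```

An empty result means every verification branch is closed; any surviving ID is exactly the kind of open line a Notified Body reviewer scans for first.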

Third, define the verification activities per clause 5.5, 5.6, 5.7 of EN 62304:2006+A1:2015, scaled to the software safety class. Record which tests cover which requirements. Decide the pass thresholds before the tests run.

Fourth, define the design validation activities under EN ISO 13485:2016+A11:2021 clause 7.3.7. Specify the validation conditions — actual or simulated use environment, representative users, representative product, defined protocol, recorded results. For SaMD this includes the summative usability evaluation under EN 62366-1:2015+A1:2020.

Fifth, define the clinical evaluation plan under MDR Article 61 and Annex XIV, and trace it back to the user needs and intended use. The clinical evaluation consumes validation evidence; it is not a substitute for it.

Sixth, define the release boundary under EN 62304:2006+A1:2015 clause 5.8. Release happens only when verification is complete, validation is complete, and clinical evaluation supports the intended use claim. The release record is the single point that says all three lines have closed. For the release activity itself, see post 387. For the CI/CD pattern that runs verification continuously, see post 398.
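The release boundary described above can be expressed as a single gate. This is a hedged sketch under stated assumptions, not a prescribed implementation: the type, field, and function names are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class EvidenceStatus:
    # Illustrative field names, not terminology from the standards.
    verification_complete: bool               # EN 62304 clauses 5.5-5.7 closed
    validation_complete: bool                 # ISO 13485 clause 7.3.7 closed
    clinical_evaluation_supports_claim: bool  # MDR Article 61 / Annex XIV

def release_allowed(status: EvidenceStatus) -> bool:
    """Clause 5.8-style gate: release only when all three evidence lines
    have closed. Verification alone never opens the gate."""
    return (
        status.verification_complete
        and status.validation_complete
        and status.clinical_evaluation_supports_claim
    )

verified_only = EvidenceStatus(True, False, False)
all_closed = EvidenceStatus(True, True, True)
```

The point of modelling the gate this way is that the release record becomes one conjunction with no override path: a build with only verification evidence cannot be released by construction.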

Seventh, wire risk management under EN ISO 14971:2019+A11:2021 into both paths. Verification gaps and validation gaps both feed risk analysis, and the risk file is consulted at every gate.

## Reality Check — Can your SaMD survive a V&V audit?

1. Can you produce, on one page, a V-model that shows user needs at the top, software units at the bottom, and every horizontal verification or validation line in between?
2. Is every software requirement traced to at least one passing test under EN 62304:2006+A1:2015 clause 5.5, 5.6, or 5.7?
3. Do you have design validation evidence under EN ISO 13485:2016+A11:2021 clause 7.3.7 — in actual or simulated use environment, with representative users, on representative product?
4. Is your usability validation under EN 62366-1:2015+A1:2020 complete and traceable to the safety-related use scenarios?
5. Does your clinical evaluation under MDR Article 61 and Annex XIV connect the validation evidence to the clinical benefit claim?
6. If a Notified Body asked whether verification alone satisfies your MDR obligations, could you explain why the answer is no in one sentence?
7. Is your release record under clause 5.8 conditional on verification, validation, and clinical evaluation all closing, or only on verification?
8. When verification and validation disagree — the tests pass but the user study shows the feature does not help — does the team treat the validation finding as authoritative, or try to re-argue the tests?

Any question you cannot answer with a clear yes is a gap a Notified Body will find. The gap is almost always on the validation side.

## Frequently Asked Questions

**What is the difference between software verification and software validation for a SaMD?**
Verification confirms the software has been built correctly against its software requirements, and it lives inside EN 62304:2006+A1:2015 at clauses 5.5, 5.6, and 5.7. Validation confirms the device meets user needs and intended use, and it lives inside EN ISO 13485:2016+A11:2021 clause 7.3.7 together with usability validation under EN 62366-1:2015+A1:2020 and the clinical evaluation under MDR Article 61 and Annex XIV. One answers "did we build it right," the other answers "did we build the right thing."

**Does EN 62304:2006+A1:2015 cover design validation?**
No. EN 62304:2006+A1:2015 is deliberately scoped to the software lifecycle processes that produce software matching its specified requirements. Validation against user needs and intended use sits in the broader QMS under EN ISO 13485:2016+A11:2021 clause 7.3.7 and in the clinical evaluation under MDR Annex XIV. This boundary is explicit in the standard, not an oversight.

**Can passing all EN 62304 tests substitute for design validation?**
No. Passing clause 5.5, 5.6, and 5.7 tests shows the software matches its software requirements. It does not show those software requirements correctly capture the user needs and the clinical benefit. A device can pass every IEC 62304 test and still fail design validation because the requirements themselves missed something the clinical context demanded.

**Where does usability validation fit into the V&V picture for SaMD?**
Usability validation under EN 62366-1:2015+A1:2020 is part of the design validation branch. The summative usability evaluation is one of the strongest evidence sources that the device, in the hands of the intended user, in the intended use environment, supports safe task performance. For most SaMD, usability validation is not optional under MDR Annex I Section 5.

**What role does clinical evaluation play in SaMD validation?**
Clinical evaluation under MDR Article 61 and Annex XIV sits on top of the validation evidence. It places the device's performance and safety in the clinical context, supports the clinical benefit claim, and is required for every medical device under MDR regardless of class. For SaMD classified under Annex VIII Rule 11, the extent of the clinical evidence scales with the classification.

**Who owns verification versus validation inside a startup SaMD team?**
In a small team, one person often writes both plans, but the execution splits. Verification execution is an engineering activity close to the development team. Validation execution is a cross-functional activity involving clinical, usability, and QMS roles. MDR Article 10 assigns the overall QMS obligation to the manufacturer, and the named Person Responsible for Regulatory Compliance under Article 15 is accountable for the documentation being complete and current.

## Related reading

- [MDR Software Lifecycle Requirements: How IEC 62304 Helps You Demonstrate Conformity](/blog/mdr-software-lifecycle-iec-62304) — the EN 62304:2006+A1:2015 lifecycle map that the verification side of the V sits inside.
- [DevOps for Medical Software: Continuous Integration and Deployment Under MDR](/blog/devops-medical-software-mdr) — the CI/CD pattern that runs verification continuously as the evidence engine.
- [Software Architecture Documentation Under IEC 62304](/blog/software-architecture-documentation-iec-62304) — the clause 5.3 design artefact the verification branch tests against.
- [MDR Design Validation Under ISO 13485](/blog/mdr-design-validation-iso-13485) — the ISO 13485 clause 7.3.7 validation activity in depth.
- [Software Integration and Integration Testing Under EN 62304](/blog/software-integration-testing-iec-62304) — the clause 5.6 verification activity on the right leg of the V.
- [Software Release Process Under EN 62304](/blog/software-release-process-iec-62304) — the clause 5.8 release activity that closes verification and expects validation to be closed alongside.
- [The Subtract to Ship Framework for MDR Compliance](/blog/subtract-to-ship-framework-mdr) — the methodology pillar this post applies to V&V planning.

## Sources

1. Regulation (EU) 2017/745 of the European Parliament and of the Council of 5 April 2017 on medical devices, Annex I Section 17 (software and IT security requirements), Article 10 (manufacturer obligations), Article 61 and Annex XIV (clinical evaluation), Annex II (technical documentation including V&V results). Official Journal L 117, 5.5.2017.
2. EN 62304:2006+A1:2015 — Medical device software — Software life-cycle processes (IEC 62304:2006 + IEC 62304:2006/A1:2015). Harmonised standard covering clauses 5.5 software unit implementation and verification, 5.6 software integration and integration testing, 5.7 software system testing, and 5.8 software release.
3. EN ISO 13485:2016+A11:2021 — Medical devices — Quality management systems — Requirements for regulatory purposes, clause 7.3.7 design and development validation.
4. EN ISO 14971:2019+A11:2021 — Medical devices — Application of risk management to medical devices.
5. EN 62366-1:2015+A1:2020 — Medical devices — Part 1: Application of usability engineering to medical devices.
6. MDCG 2019-11 Rev.1 — Guidance on Qualification and Classification of Software in Regulation (EU) 2017/745 — MDR and Regulation (EU) 2017/746 — IVDR. October 2019; Rev.1 June 2025.

---

*This post is a category-9 spoke in the Subtract to Ship: MDR blog, focused on the verification/validation boundary that every SaMD technical file has to cross. Authored by Felix Lenhard and Tibor Zechmeister. The MDR is the North Star for every claim in this post — EN 62304:2006+A1:2015, EN ISO 13485:2016+A11:2021, EN ISO 14971:2019+A11:2021, and EN 62366-1:2015+A1:2020 are harmonised tools that provide presumption of conformity with specific MDR obligations, not independent authorities. For startup-specific regulatory support on SaMD V&V planning, V-model design, and validation evidence strategy, Zechmeister Strategic Solutions is where this work is done in practice.*

---

*This post is part of the [Software as a Medical Device](https://zechmeister-solutions.com/en/blog/category/samd) cluster in the [Subtract to Ship: MDR Blog](https://zechmeister-solutions.com/en/blog). For EU MDR certification consulting, see [zechmeister-solutions.com](https://zechmeister-solutions.com).*
