MDR design verification is the planned activity that proves your design outputs meet your design inputs: that what you built matches what you specified. Under EN ISO 13485:2016+A11:2021 clause 7.3.6, verification must be planned, performed, recorded, and tied back to the inputs it tested. It is the "did we build the device right" half of design controls, distinct from validation, which asks "did we build the right device." The legal hook is MDR Article 10(9), which requires a QMS covering product realisation. The evidence produced under clause 7.3.6 lives in Annex II Section 6 of Regulation (EU) 2017/745. Verification is not testing in the abstract. It is a specific test against a specific specification with a specific acceptance criterion, documented under document control, and traceable to the design input it answers.
By Tibor Zechmeister and Felix Lenhard. Last updated 10 April 2026.
TL;DR
- MDR design verification is governed by clause 7.3.6 of EN ISO 13485:2016+A11:2021, applied under the design and development plan required by clause 7.3.2, all traceable to MDR Article 10(9).
- Clause 7.3.6 requires that design and development verification be performed in accordance with planned and documented arrangements to ensure that design and development outputs meet the design and development inputs.
- Every verification activity has a specification under test, a method, an acceptance criterion, a result, and a pass/fail decision recorded under document control.
- Verification is not validation. Verification tests outputs against inputs. Validation tests the finished device against user needs and intended use. A device can pass verification and still fail validation.
- Verification records live in Annex II Section 6 of Regulation (EU) 2017/745 and must trace bidirectionally to the design inputs they test and to the GSPR and risk controls they support.
- When a design change is made after verification, clause 7.3.9 requires the change to be evaluated for its impact and the affected verification activities to be repeated or re-assessed.
Why design verification decides what lands in Section 6
A founder with a shipping device and a weak Section 6 walks into a Notified Body audit carrying a pile of test reports. The reports look thorough. The auditor picks one, reads it, and asks a single question: which design input did this test verify? If the answer is a confident pointer to a numbered requirement in the design inputs document, the conversation keeps moving. If the answer is "that was a general performance test we ran," the conversation stops. A general test is not a verification activity. Verification is a planned check that a specific output meets a specific input. Everything that is not that is something else. Useful, perhaps, but not clause 7.3.6 evidence.
The same move shows up from the other direction. An auditor picks a design input, say "the device shall measure within plus or minus five percent across the stated operating range", and asks for the verification record. A team that ran clause 7.3.6 properly points to a controlled test report with the specification cited, the method described, the acceptance criterion stated, the result captured, and the pass/fail decision signed off. A team that did not runs around looking for test data that was probably produced but never linked back to the input. The verification may have happened. The evidence that it happened in response to that specific input does not exist in a form the auditor can accept.
Design verification is the half of design controls where the inputs you wrote down in clause 7.3.3 are confronted by the outputs you produced in clause 7.3.4. It is a closing of the loop. If the loop is not closed, the rest of the file cannot rest on it.
What verification means versus what validation means
The two words are not interchangeable in clause 7.3, even though they are used loosely in everyday engineering conversation. Clause 7.3.6 (verification) and clause 7.3.7 (validation) are separate sub-clauses with different inputs, different activities, and different evidence.
Verification checks that design and development outputs meet the design and development inputs. The inputs are the specifications the team wrote down. The performance numbers, the tolerances, the functional requirements, the regulatory requirements the device must satisfy. The outputs are what the team built: the drawings, the code, the hardware, the labelling. Verification asks whether the outputs match the inputs that generated them. The answer is measured against pass/fail acceptance criteria defined before the test. Electrical safety testing against a defined limit. Software unit tests against a functional requirement. Accuracy testing against a tolerance band. These are verification activities.
Validation checks that the finished device, in the hands of its intended users, in its intended use environment, meets the user needs and intended use. Validation asks whether the device actually does what it is supposed to do for the people who will use it. Summative usability evaluation, clinical validation, simulated-use testing. These are validation activities, and they are governed by clause 7.3.7.
A device can pass every verification test and fail validation. The specifications can be correct against themselves and wrong against the world. That is why both clauses exist and why the file must distinguish their records. A Section 6 that lumps verification and validation into one folder of "test reports" fails the basic requirement to show each activity was performed against the right reference. The verification and validation evidence post (210) walks through how both sit inside Annex II Section 6; this post stays on the verification half.
What clause 7.3.6 actually requires
Clause 7.3.6 of EN ISO 13485:2016+A11:2021 is titled "Design and development verification." The clause requires the organisation to perform design and development verification in accordance with planned and documented arrangements to ensure that the design and development outputs meet the design and development inputs. It requires procedures for verification to be documented. It requires records of the results of verification, the criteria, the conclusions, and any necessary actions to be maintained.
The clause also recognises that if the intended use requires the device to be connected to or have an interface with other devices, verification must include confirmation that the design outputs meet design inputs when so connected or interfaced.
In plain startup language, that translates into a short list the project has to satisfy for every verification activity:
- Planned arrangements. Verification does not happen ad hoc. The activity was anticipated in the design and development plan under clause 7.3.2 and assigned to a stage. See the design and development planning post (290) for how the plan reserves verification activities by stage.
- A specific design input under test. The activity names the input from clause 7.3.3 that it is verifying. Not "performance in general". A specific, numbered requirement.
- A documented method. The procedure the verification used is written down so a reviewer or a later team member could run the same method and get a comparable result.
- An acceptance criterion defined before the test. The criterion says what counts as pass and what counts as fail. A result with no acceptance criterion is a data point, not a verification.
- A recorded result. The actual outcome is captured under document control with a stable identifier, a date, and an owner.
- A pass/fail conclusion. The record states explicitly whether the design output met the input or did not.
- Necessary actions where the result was a fail. A failed verification triggers corrective action. A change to the design, a change to the input, or an investigation. The action is recorded and closed.
- Interface confirmation where applicable. Where the device connects to or interoperates with other devices as part of its intended use, verification confirms the outputs meet the inputs in that configuration, not only in isolation.
Clause 7.3.6 is not a long clause. It is short on purpose. The weight is in the discipline of doing it properly rather than in the number of words the standard uses to describe it.
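As a minimal sketch of that discipline, the checklist above can be captured as a structured record. The field names below are hypothetical, not mandated by the standard or the Regulation; clause 7.3.6 cares about the content, not the schema:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class VerificationRecord:
    """One clause 7.3.6 verification activity as a structured record.

    Field names are illustrative only; the standard requires the
    content to exist under document control, not any particular schema.
    """
    record_id: str             # stable document-control identifier, e.g. "VER-014"
    input_id: str              # the specific design input under test, e.g. "DI-023"
    method: str                # inspection, analysis, demonstration, or test
    acceptance_criterion: str  # defined BEFORE the test is run
    result: str                # the actual outcome, captured verbatim
    conclusion: str            # explicit "pass" or "fail"
    owner: str                 # named reviewer who signed off
    performed_on: date
    actions: list[str] = field(default_factory=list)  # corrective actions on a fail
    interface_config: bool = False  # verified in the connected configuration?
```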
Planning verification activities
Verification activities are not added to the project when a test is about to happen. They are planned in clause 7.3.2 at the point the design and development plan is written, and they are refined as the design inputs and outputs become concrete.
For every stage of the plan where verification belongs, the plan names the verification activities that will happen at that stage, assigns owners, and allocates resources. When the inputs document is baselined under clause 7.3.3, each input is tagged with the verification method that will prove it. Inspection, analysis, demonstration, or test. When the outputs are produced under clause 7.3.4, each output is linked to the inputs it implements and to the planned verification activity that will confirm the link.
By the time the team runs a verification test, three things are already in place: the input being tested, the method selected to test it, and the acceptance criterion defined. The test run itself produces the result and the conclusion. The record is the document that ties all five together. Input, method, criterion, result, conclusion. Under document control, with a named reviewer.
The fast startups do this planning once, then maintain it as the project progresses. The slow startups improvise verification when the audit clock starts ticking and then try to back-document the planning. Back-documented planning is always visible to an auditor because the dates, the owners, or the rationales do not quite line up with the rest of the project history.
Traceability to inputs: the spine of clause 7.3.6
The single defining property of a verification activity is that it answers an input. Without the link from the verification record to the design input under test, the activity is not clause 7.3.6 verification. It is engineering testing that may or may not satisfy anything the Regulation requires.
The practical shape of this link is a traceability matrix maintained through the project. The matrix lists every design input down one axis and carries, for each input, a reference to the design output that implements it, the verification activity that tests it, the acceptance criterion, and the pass/fail result. The matrix is bidirectional. Every verification record also names the input it answered. Stable identifiers are used throughout. Page numbers, file paths, and screenshots of folders are not identifiers. Document IDs, requirement IDs, and test case IDs are.
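A minimal sketch of the bidirectional check, assuming the hypothetical VerificationRecord structure sketched earlier; what matters is that the walk runs in both directions, not the specific code:

```python
def trace_gaps(input_ids: set[str], records) -> tuple[set[str], list[str]]:
    """Walk the traceability matrix both ways and report broken links.

    records is a list of VerificationRecord objects from the earlier sketch.
    """
    verified = {r.input_id for r in records}
    # Top-down: every design input needs a verification activity answering it.
    unverified_inputs = input_ids - verified
    # Bottom-up: every verification record must name a real design input.
    orphaned_records = [r.record_id for r in records if r.input_id not in input_ids]
    return unverified_inputs, orphaned_records
```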
When an auditor walks into Section 6 top-down from the GSPR checklist in Annex II Section 4, or bottom-up from the risk controls in Annex II Section 5, the traceability matrix is what lets them find the verification evidence in two steps rather than twenty. A disciplined matrix turns a painful audit into a boring one, which is the outcome you want.
The matrix is not separate from the work. It is produced as the inputs, outputs, and verification activities are produced. Reconstructing a traceability matrix after the verification is done is always more work than maintaining it alongside the verification, and it is always less credible.
Recording verification evidence
A verification record is a controlled document. The record identifies the specification under test, states the method, states the acceptance criterion, captures the result, and states the pass/fail conclusion, approved by the named owner with a signature or its electronic equivalent.
For software, this means a controlled test report that references the software requirements specification, the test cases, the acceptance criteria, and the results. Continuous-integration pipeline output can be attached or referenced as raw evidence but cannot stand alone as the verification record. The verification and validation evidence post (210) covers the software V&V expectations under EN 62304 in more depth.
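As one hedged illustration of a test case that can feed such a controlled report: the requirement ID, the tolerance, and the measure() interface below are all invented for the example, and the CI run that executes this test is the raw evidence the report references, not the report itself.

```python
# Hypothetical example: DI-023, the ±5 % tolerance, and measure() are invented.

REQUIREMENT_ID = "DI-023"  # design input under test: accuracy within ±5 %
ACCEPTANCE = 0.05          # acceptance criterion, defined before the test runs

def measure(reference: float) -> float:
    """Stand-in for the device-under-test interface; replace with the real call."""
    return reference * 1.02  # placeholder reading so the example runs

def test_measurement_accuracy_di_023():
    reference = 100.0
    error = abs(measure(reference) - reference) / reference
    assert error <= ACCEPTANCE, (
        f"{REQUIREMENT_ID}: error {error:.3%} exceeds criterion {ACCEPTANCE:.0%}"
    )
```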
For hardware, the record is a test report against a standard or an internal specification, with the test setup, the equipment used, the calibration status of the measurement instruments where applicable, and the raw data retained or referenced.
For inspection-based or analysis-based verification, where "test" is the wrong word because no physical test is run, the record is still a controlled document with the same elements. A design review that inspects an output against an input produces a review record. An engineering analysis that proves an output meets an input produces an analysis report. Clause 7.3.6 accepts any method appropriate to the input being verified, as long as the method is documented, the criterion is defined, and the record is controlled.
Re-verification after design changes
Design changes are inevitable. Clause 7.3.9 of EN ISO 13485:2016+A11:2021 requires that design and development changes be identified, reviewed, verified and validated as appropriate, and approved before implementation. The review of the change determines the effect on constituent parts and product already delivered, and on inputs and outputs of the design and development process.
In practice, this means that when a design change is made (a component swap, a firmware update, a specification adjustment, a risk control change), the team has to assess which previously-completed verification activities are still valid and which need to be repeated. The rule is not that every change triggers every test again. The rule is that every change is analysed for its verification impact, and the analysis is recorded.
The analysis asks three questions. Does the change affect any design input? Does the change affect any design output that was previously verified? If either answer is yes, does the change invalidate the previous verification result or leave it still valid? Where the change invalidates a previous verification, the affected activity is repeated against the changed output and the record is updated. Where the change does not invalidate the previous verification, that conclusion is recorded with a rationale.
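A sketch of the first two questions as code, assuming the hypothetical VerificationRecord structure from earlier plus an invented mapping of record IDs to the design outputs they verified; the third question, whether a touched input or output truly invalidates the result, stays a recorded human judgment:

```python
def change_impact(changed_inputs: set[str], changed_outputs: set[str],
                  records, output_verified_by: dict[str, str]) -> dict[str, str]:
    """Flag prior verifications touched by a design change.

    output_verified_by maps record_id -> the design output that record
    verified. Every record whose input or output was changed is flagged
    for a documented re-verify-or-rationale decision; the rest stay valid.
    """
    decisions = {}
    for r in records:
        touched = (r.input_id in changed_inputs
                   or output_verified_by.get(r.record_id) in changed_outputs)
        decisions[r.record_id] = ("re-verify or record a rationale" if touched
                                  else "still valid: no affected input or output")
    return decisions
```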
A failure pattern here is assuming that small changes do not need re-verification. A firmware update that "just fixes a bug" may have changed behaviour in areas that were previously verified, and the previous verification result is no longer a guarantee. A disciplined change review catches this. An undisciplined one produces a device on the market whose verification record is already stale.
Common mistakes in design verification
- Running tests without a defined acceptance criterion. A result with no criterion is not a pass or a fail. It is a measurement.
- No link from the test report to a specific design input. The test happened, the report exists, but the input it was answering is not named. An auditor cannot close the loop.
- Treating verification and validation as one folder. Clause 7.3.6 and clause 7.3.7 are separate. Evidence that mixes them cannot be sampled cleanly against either clause.
- Verification plan invented after the work is done. Back-dated planning is detectable and is the single fastest credibility loss in a design audit.
- Failed tests with no corrective action recorded. Clause 7.3.6 requires records of any necessary actions. A failed verification that disappeared without a trail is a worse finding than a failure that was handled properly.
- Software verification reduced to CI screenshots. CI output is raw data; the controlled record is the verification evidence, and a controlled record is what EN 62304 expects.
- Interface verification skipped. When the intended use includes connection to other devices, verification in the connected configuration is explicitly required by clause 7.3.6. Teams that only verify the device in isolation fail this sub-requirement.
- No re-verification after design changes. Clause 7.3.9 is where this obligation lives. A verification record that predates the current design is not evidence of conformity for the current design.
The Subtract to Ship angle on verification
Applied to clause 7.3.6, Subtract to Ship (post 065) is a simple rule: every verification activity in the file must trace to a specific design input, and every design input that needs verification must have exactly one canonical verification activity tied to it. Extra tests that do not answer inputs come out. Duplicate verification activities that test the same input with overlapping methods are consolidated to one canonical record. Exploratory engineering testing that never reached an acceptance criterion is moved out of the clause 7.3.6 evidence chain and into an engineering file if it is worth keeping at all.
The move that produces the biggest return is writing the acceptance criterion before the test, not after. Teams that write the criterion first run fewer tests, get cleaner pass/fail answers, and never face the trap of a result that cannot be classified. Teams that run the test first and then try to decide what the criterion should have been spend twice the engineering time and produce half the audit credibility.
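One way to make the criterion-first rule mechanical, sketched under the assumption that results are single numeric measurements against an upper limit; the helper is hypothetical:

```python
def classify(result: float, criterion: float | None) -> str:
    """Pass/fail classification that enforces criterion-before-test.

    A result with no pre-defined criterion is a measurement, not a
    verification, so classification is refused rather than guessed.
    """
    if criterion is None:
        raise ValueError("no acceptance criterion was defined before the test ran")
    return "pass" if result <= criterion else "fail"
```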
Subtraction here is not about doing less verification. It is about doing exactly the verification the Regulation and the standard require, once, at the right stage, with the record built at the time of the work. What comes out is the work that was never required and was never serving the certification outcome.
Reality Check: where do you stand?
- For every design input in your inputs document, can you name the specific verification activity that will answer it, or did answer it, and the acceptance criterion defined before the test?
- Does your file distinguish verification records (against inputs) from validation records (against user needs and intended use) with separate identifiers, separate owners where appropriate, and no cross-contamination between folders?
- For every verification record, is the input under test named with a stable identifier, is the method documented, is the acceptance criterion stated before the result, and is the pass/fail conclusion explicit and signed?
- Where the device is intended to connect to or interoperate with other devices, have you performed verification in the connected configuration and recorded the result?
- When the last design change was made, did you run a documented analysis of which verification activities were affected, repeat the ones that needed repeating, and record the rationale for the ones you left alone?
- Could an auditor pick a random design input from your inputs document and follow a single chain (input, output, verification activity, result, conclusion) to a controlled record in under two minutes?
- Are your failed verification results kept in the record with corrective actions traced and closed, or have they quietly disappeared from the file?
- Has any verification activity in your file been run without an acceptance criterion that was defined before the test started?
Any "not yet" answer is a specific thing to fix before the next audit window.
Frequently Asked Questions
What is the difference between design verification and design validation under MDR? Design verification under EN ISO 13485:2016+A11:2021 clause 7.3.6 proves that design outputs meet design inputs: that what you built matches what you specified. Design validation under clause 7.3.7 proves that the finished device meets user needs and intended use in the intended use environment. Both are required. A device can pass verification and fail validation if the specifications do not capture what users actually need, and a device can pass validation only if it was built to specifications that were verifiable in the first place.
Does the MDR itself require design verification? The MDR text does not use the phrase "design verification" as a standalone obligation. MDR Article 10(9) requires a QMS covering product realisation, including design and development, proportionate to the risk class and type of device. The harmonised standard EN ISO 13485:2016+A11:2021 provides presumption of conformity with that obligation, and clause 7.3.6 is where verification is specified. The evidence produced under clause 7.3.6 is cited in Annex II Section 6 of Regulation (EU) 2017/745 as part of the pre-clinical data demonstrating conformity with the General Safety and Performance Requirements in Annex I.
How detailed does a verification record need to be? Detailed enough that a reviewer who was not present at the test can read the record and understand the input under test, the method used, the acceptance criterion, the result, and the conclusion, without having to ask a question. For a simple inspection-based verification this can be a short controlled document. For a complex software or electrical safety verification it can be a long test report. The standard is completeness against the five elements. Input, method, criterion, result, conclusion. Not a fixed page count.
Do I need to re-verify a design after every change? You need to analyse every change for its verification impact and re-verify wherever the change affects a previously-verified output. Clause 7.3.9 of EN ISO 13485:2016+A11:2021 governs this. The analysis itself is recorded. Small changes may not require re-running any verification activity; larger changes may require rerunning several. The rule is that the decision is documented and defensible, not that every change triggers every test.
Can test output from a continuous integration pipeline be the verification record? It can be part of the record but not the whole record. Clause 7.3.6 requires controlled documentation that identifies the input under test, the method, the acceptance criterion, the result, and the conclusion, with an owner. CI output is raw test data that can be referenced from the controlled record. The controlled record is what an auditor reads; the CI output is what the record points to as evidence.
Where in the technical documentation do verification records live? Annex II Section 6 of Regulation (EU) 2017/745 is the location. Section 6 holds the pre-clinical and clinical data, including design verification reports. The records are referenced from the GSPR checklist in Annex II Section 4 and from the risk management file in Annex II Section 5 through stable identifiers that form the traceability chain an auditor walks during sampling.
Related reading
- What Is a Quality Management System for Medical Devices? – the pillar post for the Quality Management Under MDR cluster and the home of MDR Article 10(9).
- MDR Design and Development Planning: Using ISO 13485 Section 7.3 to Comply – deep dive on clause 7.3.2, where verification activities are first anticipated and assigned to stages.
- Design Inputs Under MDR – deep dive on clause 7.3.3, the source of the specifications that verification activities test against.
- Design Outputs Under MDR – deep dive on clause 7.3.4 and how outputs are prepared so verification can confirm they meet the inputs.
- Design Reviews Under MDR – deep dive on clause 7.3.5 and on the review discipline that sits alongside verification.
- Design Validation Under MDR – deep dive on clause 7.3.7 and the validation half of the loop.
- Design Transfer Under MDR – deep dive on clause 7.3.8 and how verified outputs move to production.
- The Design History File (DHF): Documenting Your Development Story Under MDR – how the clause 7.3.10 design and development file collects verification records alongside the other design outputs.
- Verification and Validation Evidence in Technical Documentation – how verification and validation evidence together populate Annex II Section 6.
- The Subtract to Ship Framework for MDR Compliance – the methodology that disciplines the verification work described in this post.
Sources
- Regulation (EU) 2017/745 of the European Parliament and of the Council of 5 April 2017 on medical devices, Article 10(9) (quality management system obligation, including product realisation and design and development) and Annex II Section 6 (pre-clinical and clinical data, including verification). Official Journal L 117, 5.5.2017.
- EN ISO 13485:2016+A11:2021. Medical devices. Quality management systems. Requirements for regulatory purposes. Clause 7.3.6 (Design and development verification), with clauses 7.3.2 (planning), 7.3.3 (inputs), 7.3.4 (outputs), and 7.3.9 (change control) as the surrounding framework.
- EN ISO 14971:2019+A11:2021. Medical devices. Application of risk management to medical devices. The harmonised standard that ties risk controls to the verification activities that confirm their implementation.
This post is part of the Quality Management Under MDR cluster in the Subtract to Ship: MDR blog, linking up to the QMS pillar. Authored by Tibor Zechmeister and Felix Lenhard. The MDR is the North Star. EN ISO 13485:2016+A11:2021 clause 7.3.6 is the tool. Verification is where a device stops being a claim and starts being something a Notified Body can accept on the evidence.