Design validation under the MDR is the activity that proves the finished device — or a device representative of the finished device — actually meets the user needs and the intended use in the hands of the intended users, in the intended use environment. It is governed by clause 7.3.7 of EN ISO 13485:2016+A11:2021, traced to MDR Article 10(9) on product realisation, and its outputs live in Annex II Section 6 of Regulation (EU) 2017/745. Validation is not verification. Verification asks whether the device was built against its specifications. Validation asks whether the device, as built, solves the clinical problem for the real user. Usability validation under EN 62366-1:2015+A1:2020 is a mandatory thread inside that, and the clinical evaluation sits alongside it. A validation plan written against the real intended use in the first months of development is the fastest way to get this right; a validation study bolted on at the end is the slowest.
By Tibor Zechmeister and Felix Lenhard. Last updated 10 April 2026.
TL;DR
- Design validation is required by clause 7.3.7 of EN ISO 13485:2016+A11:2021 and traces legally to MDR Article 10(9) on product realisation and the Annex I General Safety and Performance Requirements.
- Validation is performed on the finished device or on a device representative of the finished device, under defined conditions, before release for routine production.
- Validation answers whether the device meets the user needs and the intended use — not whether it meets the internal specifications (that is verification, covered by clause 7.3.6 and in post 294).
- Usability validation under EN 62366-1:2015+A1:2020 is an integral part of design validation for any device with a user interface, and the summative evaluation feeds directly into the clause 7.3.7 record.
- The clinical evaluation under MDR Article 61 and Annex XIV is the clinical evidence leg of the validation story; it is separate from, but linked to, the clause 7.3.7 design validation file.
- Validation outputs live in Annex II Section 6 of the technical documentation, not in a parallel file. The auditor expects the chain: user need → validation activity → validation record → Annex II Section 6.
A rehearsal that nearly shipped
A four-person team had a Class IIa device ready for release on a Monday. Verification was complete, every specification ticked, every unit test green, every bench test report signed. On the Friday before, the CTO asked a physiotherapist — one of the actual intended users — to run through the device on a patient. Not a structured validation study. Just a rehearsal.
The physiotherapist picked the device up, held it the wrong way round, misread the screen because the room lighting washed out the display, and performed the measurement on the wrong anatomical site. The device did exactly what the specification said it should do. The user did something the team had never imagined. The Friday rehearsal was not a validation. But it was a warning that validation had not been thought through.
The team pushed release by six weeks, ran a proper clause 7.3.7 validation on production-representative units with three physiotherapists in a real clinic room, fixed the screen contrast, rewrote two labels, and added one training step to the instructions for use. Then they shipped. The auditor later opened the validation file, saw the three user sessions, the observation notes, the design changes traced back to the validation findings, and moved on within fifteen minutes.
This post is about getting to that file on purpose, not by luck.
Validation versus verification — the one distinction that matters
Verification and validation sound similar and get mixed up constantly in startup teams. Clause 7.3.6 (verification) and clause 7.3.7 (validation) of EN ISO 13485:2016+A11:2021 are two different activities with two different questions.
- Verification asks: does the device, as built, meet the design outputs that were derived from the design inputs? It is specification-versus-implementation. Bench tests, code reviews, integration tests, environmental tests, electrical tests — all verification. Post 294 covers this in depth.
- Validation asks: does the device, as built, meet the user needs and the intended use? It is intended-purpose-versus-real-world-performance. Usability summative evaluations, simulated use studies on production-representative units, and the clinical evaluation together make the validation case.
A device can pass every verification activity and still fail validation. The specification can be met perfectly while missing the real user need — because the inputs were incomplete, or the use environment was misunderstood, or the intended user turned out to be different from the person the specification was written for. Validation is the last honest check before release.
The MDR does not use the word "validation" in its article text as a free-standing term, but it requires that a device achieve its intended purpose under normal conditions of use (Annex I General Safety and Performance Requirement 1) and that the manufacturer have a QMS covering product realisation proportionate to the risk class (Article 10(9)). EN ISO 13485:2016+A11:2021 clause 7.3.7 is the harmonised standard mechanism for satisfying that expectation inside the QMS.
What clause 7.3.7 actually requires
Clause 7.3.7 (Design and development validation) of EN ISO 13485:2016+A11:2021 sets out a short but precise list of obligations. In the plain language of a startup running against it:
- Validation is performed in accordance with planned and documented arrangements. The validation plan is set in advance, not improvised. The plan is anchored in clause 7.3.2 design and development planning (post 290) where the validation activities at each stage were scheduled in the first place.
- Validation is performed on the finished device or representative product. The units used for validation must be either the production device or devices manufactured in a way that makes them representative of the finished product. Prototypes built differently from the production units cannot carry validation on their own.
- Validation confirms that the resulting product meets the requirements for the specified application or intended use. The reference standard is the user needs and the intended use, not the internal design specification. This is the hard pivot from verification.
- Validation is performed before release for routine production. A device released to market without validation complete is non-compliant with clause 7.3.7, and by extension with the QMS obligation under MDR Article 10(9).
- Validation records are maintained. Records of the validation activities, the results, any necessary actions, and the identity of the people performing and approving the work are all part of the design and development file under clause 7.3.10.
- Clause 7.3.7 also requires, as part of validation, clinical evaluation and/or performance evaluation of the device as required by applicable regulatory requirements. For MDR devices this links clause 7.3.7 directly to MDR Article 61 and Annex XIV clinical evaluation obligations. The validation case and the clinical evaluation are not independent files.
The clause is short. The work it anchors is substantial.
Validation in a representative use environment
The phrase "representative use environment" is the part of validation that most startup teams under-specify. Validation is not running the device on a lab bench under ideal conditions. It is running a production-representative device with intended users performing intended tasks under conditions that reflect the real use environment closely enough that the results are meaningful.
For a home-use device, the use environment includes things like typical domestic lighting, distraction, no clinical supervisor, first-time users reading the instructions for use, and the realistic range of user ability. For a hospital device, the use environment includes the clinical workflow around the device, the presence of other equipment, the real time pressure of the unit it is used in, and the interaction with hospital IT systems. A validation study run in a silent engineering room with the CTO as the user does not reflect either.
The validation plan defines what "representative" means for the specific device. It states the user profile (who the intended users are, what training they have), the use environment (where the device is used, what the conditions are), the use scenarios (what tasks the users perform), and the criteria by which success or failure is judged. These definitions come from the use specification that was set during the usability engineering process under EN 62366-1:2015+A1:2020 — and which should have fed the design inputs in the first place.
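Those four definitions can be held in one small structure and checked for completeness before any user session is scheduled. A minimal sketch, assuming a Python script in the team's own tooling; the class and field names are our illustration, not terms from the standard:

```python
# Sketch only: the four definitions a clause 7.3.7 validation plan pins down.
# Field names are illustrative, not terms from EN ISO 13485 or EN 62366-1.
from dataclasses import dataclass, field


@dataclass
class ValidationPlan:
    user_profile: str = ""      # who the intended users are, what training they have
    use_environment: str = ""   # where the device is used, under what conditions
    use_scenarios: list[str] = field(default_factory=list)    # tasks users perform
    success_criteria: list[str] = field(default_factory=list)  # pass/fail judgement

    def gaps(self) -> list[str]:
        """Names of definitions still missing before the study can run."""
        missing = []
        if not self.user_profile:
            missing.append("user_profile")
        if not self.use_environment:
            missing.append("use_environment")
        if not self.use_scenarios:
            missing.append("use_scenarios")
        if not self.success_criteria:
            missing.append("success_criteria")
        return missing


plan = ValidationPlan(user_profile="physiotherapists, no device-specific training")
print(plan.gaps())  # the plan is not ready: three definitions are still empty
```

The point of the check is timing: an empty field found in the planning phase costs a meeting; the same gap found during the study costs the study.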
Validation on the wrong environment is not validation. It is a bench test in disguise. Auditors read validation reports with exactly this suspicion and ask, early, what the representative use environment was and how it was selected.
Usability validation under EN 62366-1:2015+A1:2020
Every device with a user interface — which is almost every device — is subject to a usability engineering process under EN 62366-1:2015+A1:2020, the harmonised standard that supports MDR Annex I Sections 5 and 22 on usability and human factors. The standard structures the usability work across the whole product lifecycle: use specification, user interface specification, formative evaluations during development, and a summative evaluation at the end.
The summative usability evaluation is the part that directly feeds clause 7.3.7 design validation. It is an evaluation with intended users, performing intended tasks, on the finished or production-representative device, in a representative use environment, aimed at confirming that the usability of the device is adequate — that is, that use errors related to safety have been reduced to acceptable levels and that the identified critical tasks can be performed correctly.
A startup that runs a proper summative evaluation under EN 62366-1:2015+A1:2020 has done most of the usability leg of clause 7.3.7 validation at the same time. The summative evaluation report becomes part of the design validation record. The two activities are not parallel tracks. They are one activity viewed from two angles — usability engineering and design controls — and the sensible move is to plan them together from the start.
Posts 476 and 477 in this blog cover the usability engineering process and the summative evaluation in the depth they deserve. The short version is: if the usability file under EN 62366-1:2015+A1:2020 is done honestly, the usability leg of design validation is done honestly. If the usability file is invented at the end, the design validation inherits that weakness directly.
The clinical evaluation linkage
Clause 7.3.7 explicitly ties design validation to clinical evaluation and/or performance evaluation as required by regulation. For an MDR device, this means the design validation story cannot be separated from the MDR Article 61 clinical evaluation — and, for some devices, from clinical investigations run under MDR Articles 62 through 82 and Annex XV.
The clinical evaluation report (CER) answers whether there is sufficient clinical evidence that the device achieves the intended clinical benefit and that the benefit-risk profile is acceptable. The design validation file answers whether the finished device, in the hands of users in a representative environment, performs as needed. These are two different questions with overlapping answers. Both questions have to be answered, both have to be documented, and both outputs live in Annex II Section 6 of the technical documentation.
A lean startup that runs a tight clinical evaluation aligned to the same intended purpose and user population as the design validation avoids the common trap of having two different "devices" described — the clinical one and the engineering one — which drift apart. The alignment is set in clause 7.3.2 design planning, by defining the intended use, the intended users, and the use environment once, and making both the clinical evaluation and the design validation downstream of that single definition.
Where the documentation lives
Design validation records live in the design and development file under clause 7.3.10 of EN ISO 13485:2016+A11:2021, and the outputs feed the technical documentation at Annex II Section 6 of Regulation (EU) 2017/745 — the section on pre-clinical and clinical data. Post 210 walks through how verification and validation evidence is organised inside Annex II Section 6 so an auditor can trace from a user need or a GSPR item into the specific validation record that supports it.
The traceability chain runs: user need (captured in the use specification under EN 62366-1:2015+A1:2020) → design input → design validation activity → validation record → Annex II Section 6 reference. Every validation activity in the file connects back up the chain to a user need, and down to the record that shows the result. No orphans. That traceability is the single hardest test an auditor will apply to a validation file.
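The no-orphans rule lends itself to a mechanical check over the trace matrix. A minimal sketch, with invented record IDs (the UN-*, VAL-* and REC-* numbering is our illustration, not a prescribed scheme):

```python
# Sketch only: an orphan check over the traceability chain
# user need -> validation activity -> validation record.
# IDs and dict keys are invented for illustration.

def find_orphans(user_needs, activities, records):
    """Return (user needs with no validation activity,
               activities with no validation record)."""
    covered_needs = {a["user_need"] for a in activities}
    recorded_activities = {r["activity"] for r in records}
    uncovered = [n for n in user_needs if n not in covered_needs]
    unrecorded = [a["id"] for a in activities if a["id"] not in recorded_activities]
    return uncovered, unrecorded


user_needs = ["UN-01", "UN-02", "UN-03"]
activities = [
    {"id": "VAL-01", "user_need": "UN-01"},
    {"id": "VAL-02", "user_need": "UN-02"},
]
records = [{"activity": "VAL-01", "result": "pass"}]

uncovered, unrecorded = find_orphans(user_needs, activities, records)
print(uncovered)    # UN-03 has no validation activity addressing it
print(unrecorded)   # VAL-02 has no record showing its result
```

Running a check like this before the audit, rather than letting the auditor run it mentally during the audit, is the difference between a fifteen-minute file review and a finding.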
The Design History File (post 298), which is the plain-language name for the design and development file under clause 7.3.10, holds the complete validation record set — protocols, results, user session notes, deviations, change requests triggered by validation findings, and the final validation sign-off. Annex II Section 6 references those records; it does not duplicate them.
Common mistakes startups make
- Calling verification "validation" and leaving the real validation undone. Running bench tests against the spec and labelling the report "design validation" is the commonest and most audit-visible failure.
- Validating on prototype hardware or software that is not production-representative. Clause 7.3.7 is explicit that validation is on the finished device or a representative one. A prototype built differently from the production unit cannot carry the validation alone.
- Running validation in a lab room with engineers as the users. The use environment and the user profile must be representative of real use. Engineers in a silent room are not intended users of most medical devices.
- Skipping summative usability evaluation under EN 62366-1:2015+A1:2020. The summative is where most of the real-world validation signal comes from, and it is mandatory for any device with a user interface.
- Treating the clinical evaluation and the design validation as two disconnected files. They describe the same device to the same intended use. Drift between them is a red flag in review.
- Starting validation planning at the end of development. Clause 7.3.2 requires validation activities to be scheduled in the design plan. Validation that is not planned from the start always gets rushed and always misses things.
The Subtract to Ship angle on design validation
Subtract to Ship (post 065) applied to clause 7.3.7 is a discipline of running the validation activities that actually answer the user-need question, and cutting the ones that only answer the specification question — because those belong in verification (post 294). Most bloated validation files are bloated because they duplicate verification under a different cover page. Most thin validation files are thin because they skip the uncomfortable parts: real users, real environment, real tasks.
The move that pays back most reliably is writing the validation plan in the design planning phase, against the intended use and user profile defined in the use specification, and running a single summative usability evaluation that does double duty as the core of the design validation record. A small startup can do this well with three to six users, production-representative units, a structured protocol, observation notes, and a short validation report that links each finding back to a user need and forward to any design change triggered. That is the whole job. Anything added beyond that must earn its place against a real audit question.
Subtraction is not cutting validation. It is cutting everything in the validation file that is not validation.
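One way to sanity-check the three-to-six-user range above is the classic problem-discovery model from usability research: if any single user exhibits a given use error with probability p, then n users reveal it with probability 1 - (1 - p)^n. EN 62366-1:2015+A1:2020 does not prescribe this model or any sample size; the p values below are illustrative assumptions, not data.

```python
# Sketch only: problem-discovery estimate for summative sample sizes.
# Assumption (not from EN 62366-1): users independently exhibit a given
# use error with probability p. All p values here are illustrative.

def discovery_probability(p: float, n: int) -> float:
    """Chance that at least one of n users exhibits a use error
    that any single user exhibits with probability p."""
    return 1 - (1 - p) ** n


def users_needed(p: float, target: float) -> int:
    """Smallest n at which the discovery probability reaches target."""
    n = 1
    while discovery_probability(p, n) < target:
        n += 1
    return n


if __name__ == "__main__":
    # A frequent use error (p = 0.31) is likely caught by a handful of users...
    print(f"5 users, p=0.31: {discovery_probability(0.31, 5):.2f}")  # ~0.84
    # ...but a rare one (p = 0.05) needs far more sessions to surface reliably.
    print(f"Users for 90% discovery at p=0.05: {users_needed(0.05, 0.90)}")  # 45
```

The model cuts both ways: it supports a small study for frequent, obvious use errors, and it is exactly why the rationale for the chosen number has to be documented rather than assumed, since rare but safety-related errors may not appear at all in five sessions.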
Reality Check — Where do you stand?
- Can you state, in one sentence, the difference between your device's verification activities and its validation activities — and point to separate records for each?
- Does your validation plan define the intended users, the intended use environment, and the use scenarios in terms that match the use specification from your usability file under EN 62366-1:2015+A1:2020?
- Are your validation activities run on the finished device or on units that are truly representative of the finished device — not on engineering prototypes built differently?
- Has a summative usability evaluation been performed with intended users, on production-representative units, in a representative environment?
- Does your clinical evaluation under MDR Article 61 describe the same intended use, the same user population, and the same clinical claims as your design validation file?
- Can an auditor trace from a user need to the validation activity that addresses it and to the validation record that documents the result?
- Was the design validation complete and signed off before release for routine production, as clause 7.3.7 requires?
- If your validation file was opened today by an experienced notified body auditor, would they see a credible answer to "does this device, as built, meet the user needs?" or would they see a bench test report in a validation wrapper?
Any "not yet" answer is a place to fix before the next audit window.
Frequently Asked Questions
Is design validation the same as clinical evaluation under the MDR? No. Design validation under clause 7.3.7 of EN ISO 13485:2016+A11:2021 confirms that the finished device meets the user needs and the intended use in a representative use environment. Clinical evaluation under MDR Article 61 and Annex XIV establishes sufficient clinical evidence that the device achieves its intended clinical benefit with an acceptable benefit-risk profile. They are separate files with linked content; clause 7.3.7 explicitly requires that clinical or performance evaluation forms part of the validation activity where applicable.
Can I validate on prototype units or do I need production units? Clause 7.3.7 requires validation on the finished device or on a device that is representative of the finished device. Prototype units built with different processes, different components, or different software than the production version cannot carry design validation on their own. The validation units must be manufactured such that the validation results transfer to what customers receive.
How many users do I need for the summative usability evaluation? EN 62366-1:2015+A1:2020 does not prescribe a fixed number. The number depends on the risk of use errors, the complexity of the user interface, and the diversity of the intended user population. Common practice for medium-risk devices is a handful of users per distinct user group, enough to credibly reveal the use errors related to safety, with more sessions where user groups are diverse or critical tasks are numerous. Whatever number is chosen, the rationale for it is documented in the summative evaluation plan.
Where does the design validation record live in the technical documentation? The validation record itself lives in the design and development file under clause 7.3.10 of EN ISO 13485:2016+A11:2021 (the Design History File in plain language). Its outputs are referenced in Annex II Section 6 of Regulation (EU) 2017/745, which is the pre-clinical and clinical data section of the technical documentation. Post 210 covers how the evidence is organised for audit traceability.
Does clause 7.3.7 apply to a lean Class I device? Yes. Clause 7.3.7 applies to any design project within the scope of EN ISO 13485:2016+A11:2021. The depth of the validation work is proportionate to the risk class and type of device under MDR Article 10(9), but the obligation to validate the finished device against user needs and intended use does not disappear for a low-risk device.
What happens if a validation finding forces a design change? The change is processed through clause 7.3.9 design and development changes, with full impact assessment on inputs, outputs, verification results, and the risk file under EN ISO 14971:2019+A11:2021. Validation is then repeated on the affected areas, on production-representative units, and the design plan under clause 7.3.2 is updated accordingly. Validation findings that drive change are a sign validation is working, not failing.
Related reading
- The Subtract to Ship Framework for MDR Compliance — the methodology applied to every clause 7.3 obligation, including 7.3.7.
- Verification and Validation Evidence in Technical Documentation — how V&V outputs are organised inside MDR Annex II Section 6 for auditor traceability.
- MDR Design and Development Planning: Using ISO 13485 Section 7.3 to Comply — clause 7.3.2, where the validation activities are first scheduled.
- MDR Design Verification: Using ISO 13485 to Prove the Device Was Built Right — clause 7.3.6, the sister activity to validation.
- MDR Design Transfer: Moving From Development to Production Under ISO 13485 — clause 7.3.8, the step that comes after validation is complete.
- The Design History File (DHF): Documenting Your Development Story Under MDR — where the clause 7.3.7 validation records physically live.
- Usability Engineering Under MDR: Running EN 62366-1:2015+A1:2020 in a Startup — the process that feeds the usability leg of design validation.
- Summative Usability Evaluation: The Study That Feeds Design Validation — the specific evaluation inside EN 62366-1:2015+A1:2020 that anchors clause 7.3.7 for devices with a user interface.
Sources
- Regulation (EU) 2017/745 of the European Parliament and of the Council of 5 April 2017 on medical devices, Article 10(9) (quality management system obligation including product realisation), Annex I (General Safety and Performance Requirements, including the requirement that devices achieve their intended purpose under normal conditions of use), and Annex II Section 6 (pre-clinical and clinical data). Official Journal L 117, 5.5.2017.
- EN ISO 13485:2016+A11:2021 — Medical devices — Quality management systems — Requirements for regulatory purposes. Clause 7.3.7 (Design and development validation), with supporting clauses 7.3.2 (planning), 7.3.6 (verification), 7.3.9 (changes), and 7.3.10 (design and development file). The harmonised standard providing presumption of conformity with the design and development portion of MDR Article 10(9).
- EN 62366-1:2015+A1:2020 — Medical devices — Part 1: Application of usability engineering to medical devices. The harmonised standard that governs the usability engineering process, including the summative evaluation that feeds the usability leg of clause 7.3.7 design validation.
- EN ISO 14971:2019+A11:2021 — Medical devices — Application of risk management to medical devices. The harmonised standard for the risk management process that validation findings feed back into when design changes are triggered.
This post is part of the Quality Management Under MDR cluster in the Subtract to Ship: MDR blog. Authored by Tibor Zechmeister and Felix Lenhard. The MDR is the North Star. EN ISO 13485:2016+A11:2021 clause 7.3.7 is the tool. Validation is where a startup finds out whether the device it built is the device the user needed — and the honest ones find out on purpose, not by accident.