Verification and validation evidence in MDR technical documentation is the body of test reports, engineering studies, software V&V records, biocompatibility data, usability evaluations, and clinical evaluation results that together prove a device meets both its own specifications and its intended purpose. It lives in Annex II section 6 of Regulation (EU) 2017/745. Verification answers "did we build the device right, against the specifications?" Validation answers "did we build the right device, against the intended purpose and user needs?" Both are required. EN ISO 13485:2016+A11:2021 clauses 7.3.6 and 7.3.7 define the design verification and validation activities inside the QMS. EN ISO 14971:2019+A11:2021 ties the V&V back to risk controls. EN 62304:2006+A1:2015 defines the software V&V lifecycle for medical device software. Section 6 holds the evidence; the GSPR checklist in section 4 and the risk file in section 5 are how an auditor walks from a requirement or a risk into the specific V&V record that supports it.

By Tibor Zechmeister and Felix Lenhard. Last updated 10 April 2026.


TL;DR

  • Annex II section 6 of Regulation (EU) 2017/745 is the location inside the technical documentation where pre-clinical and clinical data live — including verification test reports, validation studies, software V&V, biocompatibility, and the clinical evaluation report.
  • Verification and validation are not synonyms. Verification checks the device against its specifications. Validation checks the device against the intended purpose and user needs. MDR Annex I demands both, and EN ISO 13485:2016+A11:2021 clauses 7.3.6 and 7.3.7 define how the QMS runs each one.
  • Every V&V record must trace forward to at least one GSPR item in section 4 of Annex II and at least one risk control in the risk management file under EN ISO 14971:2019+A11:2021. Records that do not trace are shelfware.
  • For software-containing devices, software V&V under EN 62304:2006+A1:2015 produces a specific set of records that belong in section 6 and map to the software safety class of the item under test.
  • Auditors sample section 6 by picking a requirement or a risk and walking the chain down into the evidence. Broken traceability is the single most common finding here and the easiest to avoid if the links are built as the work is done, not reconstructed at the end.

Verification versus validation — the distinction that decides the file

The two words get used interchangeably in conversation. They are not interchangeable in the file. Auditors who open section 6 expect to see both, labelled correctly, with different inputs and different acceptance criteria.

Verification is the check that the device, as built, meets the specifications the design defined. Did the output match the input? The input is the design specification — the numbers, the tolerances, the performance claims the team wrote down. The output is the device as manufactured. Verification is done against those specifications with test methods that produce pass or fail answers. Electrical safety testing against a defined limit. Mechanical fatigue testing against a cycle count. Accuracy testing against a measurement range. These are verification activities.

Validation is the check that the device, in the hands of its intended users, in its intended use environment, meets the intended purpose and user needs. Did we build the right thing? Usability validation with representative users performing representative tasks. Clinical validation that the device produces the clinical benefit the intended purpose claims. Performance validation in simulated or real use conditions. Validation is done against user needs and intended use — not against internal specifications.

The distinction matters because a device can pass every verification test and still fail validation. A glucose meter can meet its measurement accuracy specification in the lab (verification pass) and still fail validation because the intended users — elderly patients with dexterity limitations — cannot operate it correctly. Verification covered the number. Validation covered the use. Both are required. Neither substitutes for the other.

EN ISO 13485:2016+A11:2021 makes the QMS-level distinction explicit. Clause 7.3.6 covers design and development verification, which shall be performed to ensure that design and development outputs meet the design and development inputs. Clause 7.3.7 covers design and development validation, which shall be performed to ensure that the resulting product is capable of meeting the requirements for the specified application or intended use. Both clauses require records of the results and any necessary actions. A section 6 that merges verification and validation into one undifferentiated pile of reports is a section 6 that has not read these clauses.

What belongs in Annex II section 6

Annex II section 6 of Regulation (EU) 2017/745 holds the pre-clinical and clinical data that demonstrate the device is safe and performs as intended. The practical contents vary by device but the typical categories are the same across most files.

  • Design verification test reports. The evidence that the design outputs meet the design inputs. Electrical safety testing, mechanical testing, performance testing, interoperability testing, stability and shelf life testing where applicable, sterility testing where applicable. Each report identifies the specification under test, the test method, the acceptance criterion, the result, and the pass/fail decision.
  • Design validation records. The evidence that the finished device meets user needs and intended use. Summative usability evaluation, simulated use testing, validation in the intended use environment, and the clinical evaluation report under MDR Article 61 and Annex XIV.
  • Biological evaluation. Where the device has direct or indirect patient contact, the biological evaluation within a risk management process. The evaluation can cite prior testing, material characterisation, and new tests where needed.
  • Software verification and validation records. For devices containing software, the software V&V lifecycle records built under EN 62304:2006+A1:2015. Unit verification, integration verification, system verification, and software validation against software requirements.
  • Electrical safety and EMC reports. For medical electrical equipment, the test reports against the applicable parts of the EN 60601 series, including the collateral EMC standard.
  • Usability engineering file. The formative and summative usability evaluations and the usability validation record, typically structured under the applicable usability engineering standard.
  • Clinical evaluation report. The output of the clinical evaluation process under MDR Article 61 and Annex XIV. It is a distinct deliverable and a dense one, but it lives under section 6 because it is the principal validation evidence for the clinical claims in the intended purpose.

Each record in section 6 has a controlled identifier, a version, an owner, a date, and a status under the document control procedure required by EN ISO 13485:2016+A11:2021. A report that is not under document control is a report the auditor cannot trust, because its provenance cannot be established.

The volume of section 6 varies enormously. A Class I software utility can have a short section 6. A Class IIb active implantable can have thousands of pages. The test is not volume but completeness against the specifications, the intended use, the risk file, and the GSPR checklist.

Traceability from requirements to tests — how the chain is supposed to work

The single most audited property of section 6 is traceability. Every test report in section 6 should answer a specific question that was asked somewhere else in the file. Two ends of the chain matter most.

One end is the GSPR checklist in Annex II section 4. Each row of the checklist names a method of demonstration and an evidence pointer. When the method is "in-house testing" or "conformity with a harmonised standard via testing," the evidence pointer lands in section 6 on a specific test report. The auditor picks a GSPR row — say, the electrical safety requirement — and follows the pointer into section 6 to read the report. The report has to match the claim the row makes, at the version cited.

The other end is the risk management file in section 5, built under EN ISO 14971:2019+A11:2021. Risk controls are not just sentences in a risk register. Each control is implemented somewhere in the device and verified somewhere in section 6. The auditor picks a hazard, follows its risk control, and asks where the control was verified. The answer is a V&V record in section 6, identified by a stable reference. If the risk file names the control but no V&V record confirms it was implemented and tested, the residual risk claim collapses.

A working traceability scheme has three properties. It uses stable identifiers — document IDs, requirement IDs, risk IDs — not page numbers or file paths. It is bidirectional, so a test report in section 6 knows which GSPR rows and which risk controls depend on it, and a GSPR row knows which test reports support it. And it is maintained through document control, so every update to a test report triggers a review of the dependent rows and risks in the same change. The GSPR Annex I checklist post covers the checklist side of this discipline in detail, and the MDR Annex II structure post places section 6 inside the full file.
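The three properties above — stable identifiers, bidirectionality, maintenance under document control — can be rehearsed as a small consistency check. The sketch below is a hypothetical illustration, not a prescribed tool: all record IDs, field names, and structures are invented, and a real file would pull this data from the document control system.

```python
# Hypothetical sketch of a bidirectional traceability index.
# All IDs and structures are invented for illustration.

# Forward links: each GSPR row and risk control names the section 6
# records that support it, by stable document ID.
gspr_rows = {
    "GSPR-18.1": {"evidence": ["TR-ELEC-007"]},
    "GSPR-17.2": {"evidence": ["TR-SW-012"]},
}
risk_controls = {
    "RC-014": {"verified_by": ["TR-SW-012"]},
    "RC-021": {"verified_by": []},  # gap: control with no verification record
}

# Backward links: each report knows which rows and controls depend on it.
reports = {
    "TR-ELEC-007": {"version": "2.0", "supports": ["GSPR-18.1"]},
    "TR-SW-012": {"version": "1.3", "supports": ["GSPR-17.2", "RC-014"]},
}

def check_traceability():
    """Flag risk controls with no evidence and any one-way links."""
    findings = []
    for rc_id, rc in risk_controls.items():
        if not rc["verified_by"]:
            findings.append(f"{rc_id}: risk control has no V&V record")
        for rep in rc["verified_by"]:
            if rc_id not in reports.get(rep, {}).get("supports", []):
                findings.append(f"{rc_id} -> {rep}: backward link missing")
    for row_id, row in gspr_rows.items():
        for rep in row["evidence"]:
            if row_id not in reports.get(rep, {}).get("supports", []):
                findings.append(f"{row_id} -> {rep}: backward link missing")
    return findings

print(check_traceability())
```

Run against this toy data, the check flags RC-021 as a risk control with no verification evidence — exactly the kind of gap an auditor finds by sampling. The design choice worth copying is that every link is expressed twice, once from each end, so a broken backward link is detectable mechanically instead of during the audit.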

The failure mode to avoid is a section 6 organised by test category — one folder for electrical, one for mechanical, one for software — with no links back into the GSPR checklist or the risk file. The reports exist. The linkage does not. The auditor can find the reports and still conclude the file is incomplete, because the reports do not demonstrate anything about the conformity argument until they are tied to the requirements they satisfy.

Software V&V references — what EN 62304 expects in section 6

For devices containing software as a component or as the device itself, EN 62304:2006+A1:2015 governs the software development lifecycle. The standard is referenced by MDR Annex I for software lifecycle processes and it defines the software V&V activities that produce records for section 6.

The standard classifies each software item into one of three safety classes. Class A is software where no injury or damage to health is possible. Class B is software where non-serious injury is possible. Class C is software where death or serious injury is possible. The required V&V activities scale with the class. A Class A item requires less evidence than a Class C item, but even Class A requires a documented development plan, a software requirements specification, software unit implementation, and release records. Documented software architecture, integration testing, and system testing come in at Class B and above, and documented unit verification at Class C.

Software V&V records that typically land in section 6 include the software requirements specification, the software architecture document (Class B and C), the software unit verification records (Class C explicitly requires documented unit verification), integration test records, system test records, and the software validation record that shows the finished software meets the software requirements. Anomaly management records — the list of known issues, the analysis of their risk, and the rationale for their acceptance or resolution — also belong in this cluster.
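How the record set grows with the safety class can be sketched as a simple mapping with a gap check. This is an informal shorthand following the summary above, not the standard's clause titles; exact applicability must be taken from EN 62304:2006+A1:2015 itself, and the label strings are invented here.

```python
# Informal sketch: which section 6 software record types scale with the
# EN 62304 safety class. Labels are invented shorthand, not clause titles;
# consult the standard's applicability table for the authoritative mapping.
BASE = {"software requirements spec", "software validation record",
        "anomaly management records"}

REQUIRED_RECORDS = {
    "A": BASE,
    "B": BASE | {"software architecture", "integration test records",
                 "system test records"},
}
REQUIRED_RECORDS["C"] = REQUIRED_RECORDS["B"] | {"unit verification records"}

def missing_records(safety_class: str, on_file: set) -> set:
    """Record types the class calls for that the file does not yet hold."""
    return REQUIRED_RECORDS[safety_class] - on_file

# For a Class C item with only two record types on file, the remaining
# gaps fall out of a set difference.
gaps = missing_records("C", {"software requirements spec",
                             "system test records"})
```

The point of the sketch is that the gap list is computed, not remembered: reclassifying an item from B to C immediately surfaces the unit verification records it now owes.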

The common failure in startup software V&V files is treating the test output from a continuous integration pipeline as the V&V record. Automated test output is useful raw data. It is not a V&V record by itself. The record is the controlled document that references the specification under test, the test cases designed to exercise it, the acceptance criteria, the results, and the pass/fail conclusion under the owner's signature or electronic equivalent. CI output goes into the controlled record as an attachment or a referenced artifact, not instead of it.
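The shape of such a controlled record can be sketched as a data structure. Every field name below is invented for illustration — this is one plausible layout under the assumptions stated, not a template from the standard.

```python
# Hypothetical structure of a controlled software V&V record that
# references CI output as raw evidence rather than standing in for it.
# All field names and IDs are invented for illustration.
from dataclasses import dataclass, field

@dataclass
class ControlledVVRecord:
    record_id: str                  # stable ID under document control
    version: str
    requirement_ids: list           # SRS requirements under test
    test_case_ids: list             # the test cases designed to exercise them
    acceptance_criteria: str
    result: str                     # explicit pass/fail conclusion
    approved_by: str                # owner's signature or electronic equivalent
    ci_artifacts: list = field(default_factory=list)  # referenced raw data

record = ControlledVVRecord(
    record_id="VVR-SW-031",
    version="1.0",
    requirement_ids=["SRS-042", "SRS-043"],
    test_case_ids=["TC-101", "TC-102"],
    acceptance_criteria="All alarm test cases pass at limits per SRS-042/043",
    result="pass",
    approved_by="J. Doe (e-signature)",
    ci_artifacts=["ci-run-8841/junit.xml"],  # attachment, not the record
)
```

Note where the CI output sits: in `ci_artifacts`, as a referenced attachment. The auditor reads the record; the record points at the pipeline output as raw evidence.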

The software V&V under EN 62304 post walks through the class-by-class expectations in more depth.

Common V&V gaps in startup technical files

Across section 6 chapters reviewed in startup audits, the same gaps repeat.

  • Verification and validation treated as synonyms. The file has tests but no distinction between which tests prove the device meets its specifications and which prove the device meets the intended use. The auditor asks for design validation evidence separately from design verification and the team points back at the same folder.
  • Test reports without acceptance criteria. The report shows the test was performed and the results were recorded, but nothing in the report says what the result was compared against. A result without an acceptance criterion is not a pass or a fail — it is a data point.
  • No link from the report to a specification. The report tested something but does not identify which requirement from the design inputs or from the GSPR checklist it was testing. The reader has to guess.
  • Software V&V reduced to CI screenshots. The controlled record is missing. The CI output is treated as sufficient evidence. Under EN 62304:2006+A1:2015 it is not.
  • Usability validation skipped. Summative usability evaluation under the usability engineering process is deferred or replaced with informal feedback. A section 6 with no summative usability record for a device that requires one is a section 6 with a structural hole.
  • Clinical evaluation report disconnected from the intended purpose in section 1. The clinical claims in the intended purpose drift between submissions and the clinical evaluation no longer evaluates what the device description now claims. Scope drift inside the file.
  • No version of the V&V records themselves. Reports exist without a version number, a status, or a date. The auditor asks which version of the report is current and the team cannot answer.
  • Risk controls not verified. The risk file names a control. Nothing in section 6 verifies the control was implemented and is effective. Residual risk claim is unsupported.
  • Reports for superseded designs left in place. An earlier prototype's test report is still in section 6 next to the production design's report, with no indication which one applies. The auditor assumes the wrong one is current.

Every one of these is preventable at the moment the record is produced. The fix is not heroic. It is simply doing the controlled record properly at the time of the work rather than trying to reconstruct it the week before audit.

How auditors sample section 6

The auditor does not read section 6 end to end. Nobody reads section 6 end to end. The auditor samples. Understanding how they sample lets you build the section to survive the sample.

Three sampling approaches show up consistently. The first is top-down from the GSPR checklist. The auditor picks three or five rows from section 4, ideally covering different chapters of Annex I, and follows each pointer into section 6. They read the referenced report, check that the version matches, check that the acceptance criterion is present, check that the result is explicit, and check that the report corresponds to the specification the row claims it covers. If the chain holds on the samples, confidence in the whole file rises. If it breaks on one sample, the auditor widens the sample.

The second approach is bottom-up from the risk file. The auditor picks a hazard and asks how the risk control was verified. They follow the risk control reference into section 6, read the V&V record, and check that the record actually covers what the control claims. A risk control that says "software alarm implemented" and a V&V record that does not exercise the alarm under the conditions the hazard describes is a broken chain.

The third approach is sideways from a user-facing claim. The auditor reads the intended purpose in section 1 or a performance claim on the label or in the IFU in section 2 and asks for the validation evidence. They follow into section 6 looking for a validation record — not a verification record — that supports the claim under realistic use conditions. Claims that were verified in the lab but never validated in use are common and get caught here.

Building section 6 so it survives all three samples means building three sets of working links: from the GSPR checklist in section 4, from the risk file in section 5, and from the intended purpose and claims in sections 1 and 2. The links are the product. The test reports are the raw material.
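All three sampling walks can be rehearsed internally before the auditor arrives. The sketch below, with invented IDs and an assumed one-hop link structure, follows a starting point — a GSPR row, a risk control, or an intended-purpose claim — down to its section 6 record and counts the documents opened along the way.

```python
# Hypothetical rehearsal of the three auditor walks: follow a starting
# point to its section 6 evidence and count documents opened.
# All IDs are invented; real data would come from the traceability index.
links = {
    "GSPR-18.1": "TR-ELEC-007",       # checklist row -> verification report
    "RC-014": "VVR-SW-031",           # risk control -> V&V record
    "CLAIM-ACCURACY": "VAL-USE-002",  # intended-purpose claim -> validation
}
controlled = {"TR-ELEC-007": "2.0", "VVR-SW-031": "1.0",
              "VAL-USE-002": "1.2"}   # current versions under document control

def walk(start: str):
    """Return (report_id, documents_opened) for one sampling walk."""
    opened = 1                        # the starting document itself
    report = links.get(start)
    if report is None:
        raise KeyError(f"{start}: no evidence pointer -- broken chain")
    opened += 1                       # the referenced section 6 record
    if report not in controlled:
        raise KeyError(f"{report}: cited but not under document control")
    return report, opened

# Rehearse all three walks; each should stay within three documents.
for start in ("GSPR-18.1", "RC-014", "CLAIM-ACCURACY"):
    report, n = walk(start)
    assert n <= 3, f"{start}: walk exceeds three documents"
```

The design choice mirrors the audit itself: a walk either ends on a controlled record in a known number of hops, or it raises — there is no silent "probably fine" outcome.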

The Subtract to Ship angle

Section 6 attracts padding. Every engineer wants to add the edge-case tests they ran, the exploratory studies that did not reach a conclusion, the early prototype reports that are no longer relevant, and the parallel test series that were run before the design froze. The accumulated file is heavy, internally contradictory, and harder to audit than a leaner file would be.

The Subtract to Ship move for section 6 is to apply the same rule that the Subtract to Ship framework for MDR applies everywhere else. Every record in section 6 has to trace to a specific design input, a GSPR row, a risk control, or an intended-purpose claim. If it does not, it comes out. Records that describe superseded designs come out — or are explicitly marked as historical and moved out of the live evidence chain. Duplicate test series come out, with one canonical report retained. Exploratory data that did not reach an acceptance criterion comes out of section 6 and goes into the engineering file if it is worth keeping at all.
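The subtraction rule lends itself to mechanical screening: any record whose forward-trace list is empty is a candidate for removal or for explicit relegation to historical status. A minimal sketch, with invented IDs and an assumed record structure:

```python
# Hypothetical subtraction pass: flag section 6 records that trace to
# no design input, GSPR row, risk control, or intended-purpose claim.
# IDs and fields are invented for illustration.
records = {
    "TR-ELEC-007": {"traces_to": ["GSPR-18.1"], "status": "current"},
    "TR-PROTO-001": {"traces_to": [], "status": "superseded"},
    "TR-EXPL-009": {"traces_to": [], "status": "current"},  # exploratory run
}

def subtraction_candidates(recs: dict) -> list:
    """Records with no forward trace: remove them from the live evidence
    chain, or mark them historical and move them to the engineering file."""
    return sorted(rid for rid, r in recs.items() if not r["traces_to"])

print(subtraction_candidates(records))
```

On this toy data the pass flags the superseded prototype report and the inconclusive exploratory run — the two categories the text above says come out — while the traced electrical report stays. The mechanical flag is a candidate list, not a verdict; a human still decides between deletion and historical status.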

A disciplined subtraction pass on a typical startup section 6 removes 20 to 40 percent of the volume without touching a single required piece of evidence. What remains is faster to update, easier to audit, and internally consistent in a way the bloated version never was. Structure and discipline beat volume here, the same way they beat volume in every other section of the file.

Reality Check — Where do you stand?

  1. For every test report in your section 6, can you name the specific design input or GSPR row it was testing and the acceptance criterion it was tested against?
  2. Does your file distinguish verification records (against specifications) from validation records (against intended use and user needs), explicitly, with different owners if necessary?
  3. For every risk control in your risk management file, is there a specific V&V record in section 6 that verifies the control was implemented and is effective?
  4. If the auditor picks an intended-purpose claim tomorrow, can you show a validation record that supports it under realistic use conditions — not just a verification test in a lab?
  5. Are your software V&V records controlled documents that reference the software requirements specification, the test cases, and the acceptance criteria, or are they CI pipeline outputs standing alone?
  6. Does every test report in section 6 carry a stable identifier, a version, a date, and a status under your document control procedure?
  7. Have you deliberately removed superseded-design reports from the live evidence chain, or are they still sitting next to the current ones with no indication which is which?
  8. Could you walk an auditor from a GSPR row, from a risk control, and from an intended-purpose claim all the way down into section 6 without opening more than three documents in any of the three walks?

Frequently Asked Questions

What is the difference between verification and validation under MDR? Verification is the check that the device meets its design specifications — the inputs the design team wrote down. Validation is the check that the finished device meets the user needs and intended use. Both are required by MDR Annex I via the QMS, and EN ISO 13485:2016+A11:2021 distinguishes them explicitly in clauses 7.3.6 (verification) and 7.3.7 (validation). A device can pass verification and fail validation if the specifications do not fully capture what the users need.

Where in the technical documentation does V&V evidence live? Annex II section 6 of Regulation (EU) 2017/745 is the location. Section 6 holds the pre-clinical and clinical data, including design verification reports, design validation records, software V&V, biocompatibility evidence, usability engineering file, electrical and mechanical testing, and the clinical evaluation report.

Does software V&V under EN 62304:2006+A1:2015 replace design verification and validation under ISO 13485? No. It is one component of it. EN 62304:2006+A1:2015 defines the software development lifecycle including the V&V activities specific to software, with expectations scaling by software safety class (A, B, or C). The QMS obligations under EN ISO 13485:2016+A11:2021 clauses 7.3.6 and 7.3.7 still apply to the device as a whole. For a software-only device, the EN 62304 records are a large part of the V&V evidence but the QMS framing around them still comes from EN ISO 13485:2016+A11:2021.

Do I need to run new validation if my device is based on well-established technology? The requirement to validate against intended use does not disappear because the technology is well-established. What can change is the evidence mix. Established technology may allow validation to draw heavily on existing literature or prior evidence for the specific aspects the technology covers. The device as a whole — including how it puts the established technology to use in its intended application — still has to be validated against its own intended purpose. The Subtract to Ship framework for MDR describes how to identify the cheapest legitimate evidence pathway without cutting required work.

How are V&V records linked to the risk management file? Under EN ISO 14971:2019+A11:2021 the risk management process identifies hazards, estimates and evaluates risks, and implements risk controls. Every risk control has to be verified — the verification establishes that the control was implemented and is effective. That verification evidence sits in section 6 and is linked back to the risk file by stable identifiers so the chain is traceable in both directions. A risk file without verification links into section 6 is a risk file that cannot defend its residual risk claims.

Can continuous integration test results serve as the software V&V record? They can be part of the record but not the whole record. EN 62304:2006+A1:2015 expects controlled documentation that identifies the requirements under test, the test design, the acceptance criteria, and the results, with appropriate authorisation. CI output is raw test data that can be referenced from the controlled record or attached to it. The controlled record is what an auditor reads. The CI output is what the record points to as raw evidence.

How much V&V is enough for a Class I device? Enough to demonstrate that the device meets every applicable GSPR in Annex I and that every risk control in the risk file has been verified. For a simple Class I device, this can be a compact section 6. The Regulation does not set a volume target; it sets a completeness and traceability target. A compact section 6 that covers every applicable requirement and every risk control is sufficient. A voluminous section 6 with gaps is not.

Sources

  1. Regulation (EU) 2017/745 of the European Parliament and of the Council of 5 April 2017 on medical devices, Article 10(4) (manufacturer obligation to draw up and keep up to date the technical documentation), Annex I (General Safety and Performance Requirements), Annex II section 6 (pre-clinical and clinical data, including verification and validation). Official Journal L 117, 5.5.2017.
  2. EN ISO 13485:2016 + A11:2021 — Medical devices — Quality management systems — Requirements for regulatory purposes, clauses 7.3.6 (design and development verification) and 7.3.7 (design and development validation).
  3. EN ISO 14971:2019 + A11:2021 — Medical devices — Application of risk management to medical devices.
  4. EN 62304:2006 + A1:2015 — Medical device software — Software life-cycle processes.

This post is part of the Technical Documentation & Labeling series in the Subtract to Ship: MDR blog. Authored by Felix Lenhard and Tibor Zechmeister. Tibor has audited section 6 on both sides of the Notified Body table — as the lead auditor sampling test reports and as a founder assembling the V&V evidence for his own devices. The discipline described here is the discipline that turns section 6 from the heaviest part of the file into a section an auditor can walk through without friction.