Design reviews under EN ISO 13485:2016+A11:2021 clause 7.3.5 are the planned, documented, multi-disciplinary evaluations that check whether the design is meeting its inputs, whether problems are being identified and resolved, and whether the project is ready to proceed to the next stage. The clause requires reviews at suitable stages defined in the design plan, the participation of representatives of the functions concerned with the stage being reviewed and other specialist personnel as needed, at least one independent reviewer who is not directly responsible for the stage under review, and records of the review including the results, actions, participants, and the date. The legal hook is MDR Article 10(9), which requires a QMS covering design and development proportionate to the risk class and type of device. Clause 7.3.5 is the specific mechanism that turns "we looked at the design" into a defendable governance artefact. A one-hour honest review with a written record beats a four-hour theatrical meeting with no decisions, every time.
By Tibor Zechmeister and Felix Lenhard. Last updated 10 April 2026.
TL;DR
- Design reviews are governed by clause 7.3.5 of EN ISO 13485:2016+A11:2021, at the stages planned under clause 7.3.2, traceable to MDR Article 10(9).
- Clause 7.3.5 requires planned reviews with multi-disciplinary participation, at least one reviewer independent of the stage being reviewed, and records of the results and any actions.
- A design review is not a status meeting. It is a decision gate where the design is evaluated against its inputs and a pass/hold/rework decision is made and recorded.
- Independence is a functional requirement, not a headcount requirement. In a three-person team, an independent reviewer can be an external advisor or a team member from a different functional area.
- Records matter more than meetings. The auditor will ask to see the review record, the list of participants, the issues raised, the actions assigned, and the closure evidence for those actions.
- The fastest startups run short, honest reviews at the stages they planned. The slowest ones either skip reviews or run theatrical reviews where nothing is decided and nothing is written down.
A review that worked and a review that did not
A four-person team building a Class IIa patient monitoring device scheduled four design reviews across an eighteen-month project, at the stages their clause 7.3.2 plan had pre-committed to: inputs review, architecture review, pre-verification review, and pre-release review. Each review ran for ninety minutes. Each had the three internal engineers plus one external clinical advisor acting as the independent reviewer. Each produced a one-page record listing the documents reviewed, the issues raised, the actions assigned with owners and due dates, and the gate decision (proceed, proceed with actions, hold). When the Notified Body auditor opened the design file, the four review records were the first thing the auditor looked at. Seven minutes later the auditor moved on. The reviews were credible because they had actually happened.
A different startup held what they called "weekly design reviews" — the whole team sat in a room, talked about whatever felt important, and left. No documents reviewed against. No agenda. No independent reviewer. No record beyond a shared note that read "discussed progress, no blockers." When the auditor asked which review had approved the transition from concept to detailed design, the team could not point to one. Technically there had been dozens of meetings. Formally there had been zero design reviews. The finding written that day cost the company three months.
Same clause. Same standard. Two completely different relationships with it.
What clause 7.3.5 actually requires
Clause 7.3.5 of EN ISO 13485:2016+A11:2021 sits inside the design and development section of the standard, after planning (7.3.2), inputs (7.3.3), and outputs (7.3.4). It says, in plain terms, that at suitable stages the organisation shall conduct systematic reviews of the design and development in accordance with planned arrangements to evaluate the ability of the results of the design to meet the requirements, and to identify and propose necessary actions.
The clause then defines the mandatory participants, the independence requirement, and the record obligation:
- Multi-disciplinary participation. Participants shall include representatives of the functions concerned with the stage being reviewed and other specialist personnel as needed. This is the anti-silo rule. A hardware review is not only attended by hardware engineers. A software review is not only attended by software engineers. The stage being reviewed determines who must be in the room, and anyone whose work feeds or depends on that stage is a candidate.
- Independence. Participants shall include at least one person who is not directly responsible for the stage of design and development being reviewed. This is the single most commonly misunderstood requirement in the clause, and the one small teams worry about the most. It is a functional requirement, not a headcount requirement. The independent person is someone whose work is not being evaluated in the review. In a small company the independent reviewer is often an external advisor, a clinical user, a QMS lead from a sister company, or a team member from a functional area outside the stage under review.
- Records. Records of the review and any necessary actions shall be maintained. The record is not optional, and it is not a slide deck. It is a dated, signed document listing the participants, the documents reviewed, the issues raised, the actions assigned with owners and due dates, and the gate decision.
The clause does not prescribe a format. It does not require a minimum duration. It does not require a specific number of attendees beyond "representatives of the functions concerned" plus "at least one independent reviewer." What it does require is that the review is planned, that it happens at a stage the plan identified, that it evaluates the design against its inputs, that it produces actions where problems are found, and that it is recorded.
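The participation and independence rules above are mechanical enough to sketch as a self-check. The following Python sketch is illustrative only — the class and function names are ours, not from the standard — but it models the two rules directly: every function concerned with the stage must be represented, and at least one participant must not be directly responsible for the stage under review.

```python
from dataclasses import dataclass

@dataclass
class Participant:
    name: str
    function: str                  # e.g. "hardware", "software", "clinical"
    responsible_for_stage: bool    # directly responsible for the stage under review?

def check_participation(participants: list[Participant],
                        functions_concerned: set[str]) -> list[str]:
    """Return a list of problems; an empty list means both rules hold."""
    problems = []
    # Rule 1: every function concerned with the stage is represented.
    present = {p.function for p in participants}
    missing = functions_concerned - present
    if missing:
        problems.append(f"functions not represented: {sorted(missing)}")
    # Rule 2: at least one participant is independent of the stage.
    if not any(not p.responsible_for_stage for p in participants):
        problems.append("no independent reviewer (all participants own the stage)")
    return problems
```

For the first team in the story above, an architecture review concerning hardware and software with three responsible engineers plus one external clinical advisor passes both rules; drop the advisor and rule 2 fails.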
Step 1 — Define the purpose of each review in the plan
Under clause 7.3.2, the design and development plan lists the stages of the project and the review activities at each stage. Clause 7.3.5 then runs against those planned reviews. The purpose of each review is not improvised on the day. It is pre-committed in the plan.
Typical review purposes, in the order they usually appear in a lean project:
- Inputs review. The purpose is to confirm that the design input set is complete, unambiguous, verifiable, and not in conflict with itself, before detailed design starts. This is the clause 7.3.3 review that post 291 walks through.
- Architecture review. The purpose is to confirm that the proposed architecture can satisfy the inputs, that the interfaces between sub-systems are defined, and that the risk controls from EN ISO 14971:2019+A11:2021 are reflected in the architecture.
- Pre-verification review. The purpose is to confirm the design outputs are ready to go into formal verification — the outputs exist, they trace to inputs, and the verification protocols exist.
- Pre-validation review. The purpose is to confirm verification is complete, the design is frozen for validation, and the residual risk profile has been reviewed.
- Pre-release review. The purpose is to confirm validation is complete, transfer is ready, and the design file is in the state required for the technical documentation under MDR Annex II.
Each review has one job. A review that tries to cover the whole project at every stage is either theatre or exhausting, usually both.
Step 2 — Decide when to hold each review
The timing is set in the plan under clause 7.3.2, not on the day. Reviews happen at the transition between stages — after the work of the stage is finished and before the next stage starts. A review that is scheduled in the middle of a stage, because the team "wants early feedback," is a working meeting, not a clause 7.3.5 review. Both can be useful; only one of them counts for the standard.
A good rule for a lean startup: schedule the review at the point where a gate decision is actually needed. If the team is about to commit budget or time to the next stage, that is a review. If the team is still discovering the work of the current stage, it is not.
Step 3 — Pick the right participants
Clause 7.3.5 requires representatives of the functions concerned with the stage being reviewed. In a three-person startup this is usually the whole team plus at least one independent reviewer. In a larger company, it is the design leads of each affected function.
The independent reviewer is the lever. Options for a small team:
- An external advisor engaged for the review (a clinical user, a regulatory consultant not otherwise involved in the project, a medical advisory board member).
- A team member from a clearly different functional area whose work is not being evaluated at this stage (e.g., the clinical lead serving as the independent reviewer for an electronics-focused architecture review).
- A reviewer from a partner company or a sister project under a written engagement.
- The founder, in their capacity as management representative, but only if their day-to-day work is not the subject of the review. That is rarely the case in a three-person team, so this option usually does not qualify.
The independent reviewer should see the documents before the meeting, not during it. Reviews where the independent reviewer is reading the architecture for the first time while the team presents are not reviews. They are briefings.
Step 4 — Run the meeting against an agenda
A clause 7.3.5 review is short. Ninety minutes is plenty for most reviews in a lean project. The agenda runs against the design inputs and outputs of the stage, in this order:
- Purpose of the review and the gate decision being sought.
- List of documents being reviewed (with version numbers).
- Walk-through of the design against the inputs, led by the design owner.
- Issues raised by participants, captured on the spot.
- Actions proposed, with owners and due dates.
- Gate decision — proceed, proceed with actions, or hold.
- Record sign-off.
The chair is the person who owns the stage being reviewed. The scribe is someone other than the chair — in a small team, usually the independent reviewer, who is in the best position to hear everything without defending the work. The meeting ends with a decision, not with "we'll figure it out later."
Step 5 — Produce the outputs and records
Clause 7.3.5 requires records of the review and any necessary actions. The record is a dated document that contains, at minimum:
- Date, stage, and purpose of the review.
- List of participants with their roles, clearly marking the independent reviewer.
- List of documents reviewed, with version numbers.
- Issues raised during the review.
- Actions assigned, with named owners and due dates.
- Gate decision.
- Sign-off by the chair and the independent reviewer.
A one-page record is enough for a small project. The discipline is that every item above is present and every action is tracked to closure. Actions that close without evidence are not closed. The closure evidence lives alongside the original record in the design and development file under clause 7.3.10 (the subject of post 298 on the Design History File under MDR).
When an auditor asks about a review, the auditor wants to see the record, then the closure evidence for the actions the record generated, then the next review record showing the actions were verified. This is the full loop. Missing any part of it is a finding.
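That loop can be made concrete as a pre-audit self-check. The sketch below is a hypothetical illustration — the field names are ours, not prescribed by clause 7.3.5 — but it models the one-page record from Step 5 and flags exactly the omissions the loop above would expose: a missing independent reviewer, unowned actions, and actions without closure evidence.

```python
from dataclasses import dataclass

@dataclass
class Action:
    description: str
    owner: str
    due: str                       # ISO date, e.g. "2026-03-15"
    closure_evidence: str = ""     # reference to the closure document, once closed

@dataclass
class ReviewRecord:
    date: str
    stage: str
    participants: list[str]
    independent_reviewer: str
    documents_reviewed: list[str]  # with version numbers
    issues: list[str]
    actions: list[Action]
    gate_decision: str             # "proceed" | "proceed with actions" | "hold"

def audit_gaps(record: ReviewRecord) -> list[str]:
    """Flag the omissions an auditor would flag in a clause 7.3.5 record."""
    gaps = []
    if not record.independent_reviewer:
        gaps.append("no independent reviewer marked")
    if not record.documents_reviewed:
        gaps.append("no documents listed")
    if record.gate_decision not in {"proceed", "proceed with actions", "hold"}:
        gaps.append(f"gate decision missing or unclear: {record.gate_decision!r}")
    for a in record.actions:
        if not a.owner:
            gaps.append(f"action without owner: {a.description}")
        if not a.closure_evidence:
            gaps.append(f"action open, no closure evidence: {a.description}")
    return gaps
```

A record that passes this check is not automatically audit-proof, but a record that fails it will certainly fail the auditor's reconstruction of the loop.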
Step 6 — How small teams maintain independence
The single most common question from small teams is "how can we have an independent reviewer when there are only three of us?" The answer is that independence is about the function, not the org chart. Options that work in practice:
- Engage an external reviewer for the planned review dates only. A clinical advisor, a regulatory consultant, or a senior engineer from another company can be engaged by the hour for the four or five reviews a lean project needs. The cost is small. The audit value is large.
- Use advisory board members. If the company has a scientific or clinical advisory board, those members can serve as independent reviewers for the reviews relevant to their expertise.
- Rotate independence across functional areas. If the team has a hardware lead and a software lead, each can act as the independent reviewer for the other's stage, because neither is directly responsible for the stage being reviewed.
- Contract a fractional QMS lead who attends reviews. A fractional quality lead who is not in the design work itself is a credible independent reviewer for most stages.
What does not work: the founder acting as the independent reviewer for a stage they were hands-on in. The spirit of the clause is that a second pair of eyes, not invested in the work being evaluated, looks at the design. A founder who wrote half the inputs cannot credibly be independent of the inputs review.
Step 7 — Avoid the common mistakes
- Calling status meetings "design reviews." A weekly stand-up is not a clause 7.3.5 review. It has no stage purpose, no gate decision, no independent reviewer, and no record that evaluates the design against its inputs.
- Skipping the record. A review that was held but not recorded is, for audit purposes, a review that did not happen. The standard is explicit: records shall be maintained.
- Inviting the independent reviewer to the meeting only. The independent reviewer needs the documents in advance. A reviewer reading architecture for the first time in the room is not reviewing, they are being briefed.
- Assigning actions without owners or due dates. An action without an owner closes by itself after three weeks of nobody doing anything. The auditor will ask who owns each open action, and the answer must be a name.
- Holding reviews in the middle of a stage. Reviews are gate events. The gate is at the end of a stage, not halfway through.
- Letting the gate decision drift. A review that ends with "we'll decide next week" is not a review. It is a postponed decision. The gate decision is either made in the room or the review is rescheduled and the reason for the postponement is recorded.
- Confusing design reviews with verification. Verification (clause 7.3.6) is the testing of the design against the inputs. The design review evaluates whether the design and its verification evidence are ready to pass the gate. They are different activities with different records.
The Subtract to Ship angle on design reviews
Subtract to Ship (post 065) applied to clause 7.3.5 is direct: every review must have a gate purpose traceable to the clause 7.3.2 plan, and every element in the review record must serve that purpose. Reviews that exist only because "reviews are what mature companies do" are cut. Reviews that exist because the plan requires a gate decision at that stage are kept and run tightly.
The move the first team above got right is the move that works: fewer reviews, each one short, each one honest, each one with a written record and a real independent reviewer. Four ninety-minute reviews across eighteen months, with closure tracking, is more audit-credible than weekly theatrical reviews that produce no records. The subtraction is in the theatre, not in the governance.
The same rule applies inside the review itself. The agenda has seven items, not twenty. The record is one page, not ten. The actions are specific, not aspirational. Every subtraction frees attention for the decision the review exists to make.
Reality Check — Where do you stand?
- Does your design and development plan list the specific stages at which clause 7.3.5 reviews will be held, and is each review linked to a gate decision?
- For every review that has already happened, can you produce a dated record with participants, documents reviewed, issues raised, actions with owners and due dates, and the gate decision?
- Does every review record show at least one participant clearly marked as the independent reviewer, who was not directly responsible for the stage under review?
- Did the independent reviewer receive the documents in advance, or were they reading them for the first time in the room?
- Are the actions from every past review tracked to closure, with closure evidence stored alongside the original record in the design file?
- When the auditor asks "who approved the transition from inputs to detailed design?", can you answer with a name, a date, and a document?
- Are you running reviews at stages your plan pre-committed to, or are you calling whatever meetings happen to occur "design reviews" after the fact?
- If clause 7.3.5 required you to produce the last three review records right now, how many would survive an antagonistic read?
Any "not yet" is a place to do the work before the next audit window.
Frequently Asked Questions
Does EN ISO 13485 require a specific number of design reviews? No. Clause 7.3.5 of EN ISO 13485:2016+A11:2021 requires reviews at suitable stages of design and development in accordance with planned arrangements. The number and timing of reviews are set in the clause 7.3.2 design and development plan and are proportionate to the project. A lean Class IIa project typically has four to six reviews. A Class III implantable project has many more. What the standard is strict about is that the planned reviews happen, are independent, and are recorded.
Who counts as an independent reviewer in a three-person team? Someone who is not directly responsible for the stage of design and development being reviewed. In a small team, this is usually an external advisor engaged for the review, a clinical user, a fractional QMS lead, an advisory board member, or — if the team is split across functional areas — a team member whose work is not part of the stage under review. The founder who wrote half the design cannot credibly be independent of a review of that design.
Is a weekly team stand-up a design review? No. A stand-up is a working meeting. A clause 7.3.5 design review is a planned, documented, gate-level evaluation of the design against its inputs, with multi-disciplinary participation, an independent reviewer, and a formal record. Both can be valuable, but only the second one satisfies clause 7.3.5.
What must the review record contain? Date, stage, and purpose; list of participants with roles, clearly marking the independent reviewer; list of documents reviewed with version numbers; issues raised; actions assigned with owners and due dates; gate decision; and sign-off by the chair and the independent reviewer. A one-page record is usually enough for a lean project. The test is whether an auditor could reconstruct the review from the record alone.
How does a design review differ from design verification? Design verification under clause 7.3.6 is the testing of the design outputs against the design inputs, with the test results captured as verification records. A design review under clause 7.3.5 evaluates the design, and where appropriate the verification evidence, against the stage's purpose and decides whether the project is ready to pass to the next stage. Verification produces test data. The review uses that data, among other inputs, to make a gate decision.
Can one meeting serve as both a design review and a verification activity? No. They are different activities with different records under different sub-clauses. A verification activity is run against a protocol and produces a test record. A design review is run against an agenda and produces a review record. They can happen on the same day, and the review can use the verification results as an input, but the records stay separate.
Does the MDR itself mention design reviews? The MDR does not use the phrase "design reviews" in its articles. MDR Article 10(9) requires a quality management system covering design and development, proportionate to the risk class and type of device. EN ISO 13485:2016+A11:2021 provides presumption of conformity with that obligation, and clause 7.3.5 of that standard is where the design review requirement lives. In practice, manufacturers certifying under MDR run clause 7.3.5 reviews.
Related reading
- MDR Design and Development Planning: Using ISO 13485 Section 7.3 to Comply — post 290, the planning step that pre-commits the stages at which clause 7.3.5 reviews are held.
- Design Input Requirements: How to Define and Document What You Are Building — post 291, the inputs that design reviews evaluate the design against.
- Design Outputs Under MDR — post 292, the outputs that a pre-verification design review checks for readiness.
- Design Verification Under MDR — post 294, the verification activity whose results feed later design reviews.
- Design Validation Under MDR — post 295, the validation work that a pre-release design review evaluates.
- The Design History File Under MDR — post 298, where design review records live inside the design and development file.
- Design Review Templates and Checklists for Startups — post 311, the practical templates a small team can adopt directly.
- The Subtract to Ship Framework for MDR Compliance — post 065, the methodology behind the review discipline in this post.
Sources
- Regulation (EU) 2017/745 of the European Parliament and of the Council of 5 April 2017 on medical devices, Article 10(9) (quality management system obligation covering product realisation including design and development, proportionate to the risk class and type of device). Official Journal L 117, 5.5.2017.
- EN ISO 13485:2016+A11:2021 — Medical devices — Quality management systems — Requirements for regulatory purposes. Clause 7.3.2 (Design and development planning) and clause 7.3.5 (Design and development review). The harmonised standard providing presumption of conformity with the design and development portion of MDR Article 10(9).
- EN ISO 14971:2019+A11:2021 — Medical devices — Application of risk management to medical devices. The harmonised standard for the risk management activities that feed the design reviews at each planned stage.
This post is part of the Quality Management Under MDR cluster in the Subtract to Ship: MDR blog, linking up to the QMS pillar at post 276. Authored by Tibor Zechmeister and Felix Lenhard. The MDR is the North Star. EN ISO 13485:2016+A11:2021 is the tool. Clause 7.3.5 is where a startup decides whether its design reviews are governance or theatre — and the auditor can tell the difference in under ten minutes.