MedDeviceGuide

Medical Device Design Verification Test Protocol: How to Write, Execute, and Document Protocols That Pass FDA and EU MDR Review

How to write medical device design verification test protocols covering scope, acceptance criteria, test methods, sample size justification, pass/fail criteria, and result documentation — aligned with FDA design control requirements, EU MDR technical documentation expectations, ISO 13485 Clause 7.3.7, and the FDA recommended content format for non-clinical bench testing reports.

Ran Chen
Global MedTech Expert | 10× MedTech Global Access
2026-05-1430 min read

Why Test Protocol Quality Determines Submission Success

A design verification test protocol is one of the most consequential documents in your Design History File. It is the mechanism by which you demonstrate, with documented objective evidence, that your device design does what your design inputs say it must do. When the FDA reviewer opens your 510(k) submission, or a Notified Body assessor examines your EU MDR technical file, the verification test protocols and their corresponding reports are among the first documents they scrutinize. Weak protocols produce weak submissions. Weak submissions produce Additional Information letters from FDA, non-conformities from Notified Bodies, and months of costly delay.

The stakes are concrete. FDA data consistently shows that design control deficiencies rank among the top three most-cited observation categories on Form 483. Within those deficiencies, inadequate design verification is a recurring theme: acceptance criteria defined after testing, missing sample size justification, test methods that do not map to any design input, deviations that were never assessed for impact. These are not theoretical risks. They are the reasons submissions get held up, audit findings get written, and product launches get pushed.

This guide covers how to write, execute, and document design verification test protocols that satisfy the requirements of the FDA, EU MDR, and ISO 13485. It is written for design engineers, quality engineers, and regulatory affairs professionals who need to produce protocols that will hold up under regulatory scrutiny on the first pass.

The Regulatory Basis

FDA Design Controls (21 CFR 820.30 and the QMSR)

Under the FDA's Quality System Regulation, design verification is required by 21 CFR 820.30(f): "Each manufacturer shall establish and maintain procedures for verifying the device design. Design verification shall confirm that the design output meets the design input requirements. The results of the design verification, including identification of the design, method(s), the date, and the individual(s) performing the verification, shall be documented in the DHF."

Since February 2, 2026, the QMSR incorporates ISO 13485:2016 by reference. Under this framework, the design verification requirement maps to ISO 13485 Clause 7.3.7, but the substance is unchanged: you must confirm that design outputs meet design inputs, and you must document the results.

The FDA also published the guidance document "Recommended Content and Format of Non-Clinical Bench Testing Reports for Substantial Equivalence Device Modifications" (and related guidance on non-clinical bench testing), which specifies what test protocols and reports should include. While this guidance was originally written for certain submission types, its recommended content format has become the de facto industry standard for all design verification bench testing documentation. The guidance recommends that test protocols include:

  • Test objective
  • Test article description and identification
  • Test methods and procedures
  • Study endpoints
  • Predefined acceptance criteria and pass/fail criteria

The corresponding test report recommendations cover:

  • Test title and identification
  • Device description and sample size
  • Test method description with applicable standards
  • Acceptance criteria (prospectively defined)
  • Documentation of unexpected results and deviations
  • Results presented as pass/fail, or as min/max/average values where appropriate

ISO 13485:2016 Clause 7.3.7

ISO 13485:2016 Clause 7.3.7 addresses design and development verification. It requires that verification be performed in accordance with planned and documented arrangements to ensure that design and development outputs have met the design and development input requirements. The results of verification, including any changes, must be documented. Importantly, ISO 13485 also requires that the verification plan describe the statistical techniques used and the rationale for sample sizes (Clause 7.3.6, which carries forward into verification planning). This is a point that many organizations overlook and that auditors frequently probe.

EU MDR Technical Documentation Requirements

The EU MDR (Regulation 2017/745) does not include a standalone "design verification" clause. Instead, verification evidence is embedded in the technical documentation requirements. Specifically:

  • Annex II, Section 6.1 requires documentation of verification and validation testing, including bench testing, performance testing, and evidence that the device conforms to the general safety and performance requirements (GSPRs)
  • Annex I defines the GSPRs that design verification must demonstrate conformity with
  • Article 10.9 requires the manufacturer to maintain a QMS that covers design and development, including verification activities

In practice, Notified Bodies assess verification protocols and reports during conformity assessment. They expect to see prospectively defined acceptance criteria, traceability from GSPRs to test evidence, sample size justification, and complete deviation documentation. The expectations are consistent with what FDA requires, even if the regulatory language differs.

Anatomy of a Verification Test Protocol: The Essential Sections

A well-structured verification test protocol contains the following sections. Each serves a specific regulatory purpose.

Protocol Header and Administrative Information

Every protocol should begin with a clear header containing:

  • Protocol number (following your document control numbering scheme)
  • Revision number and date
  • Device name and model/catalog number
  • Design input(s) being verified (with traceability IDs)
  • Protocol author, reviewer, and approver signatures with dates
  • Associated design project or DHF reference

Objective and Scope

The objective section states, in precise terms, what the protocol is verifying. A strong objective references the specific design input requirement by its document and section number. A weak objective states something generic like "to test the device." Compare:

Weak objective: "To test the mechanical strength of the catheter."

Strong objective: "To verify that the tensile strength of the distal shaft of the XYZ Catheter (Model A-100) meets the design input requirement of not less than 15 N, as specified in Design Input Specification DIS-042 Section 3.2.1."

The scope section defines what is in scope and what is outside the scope of the protocol, including which device configurations, components, or subsystems are covered.

Test Articles

This section describes the test samples in detail:

  • Device name, model, and part number
  • Lot or batch numbers of test articles
  • Number of samples (with reference to the sample size justification section)
  • Manufacturing process used to produce the test articles (they must be representative of the final production process unless a justification is documented)
  • Storage conditions and handling instructions prior to testing
  • Any pre-conditioning requirements (e.g., accelerated aging, sterilization cycling)

Test Methods and Procedures

This is the core of the protocol. For each test, document:

  • The test method, including step-by-step procedures detailed enough for a qualified person to reproduce the test
  • Applicable standards (e.g., ISO, ASTM, IEC test method standards)
  • Custom fixturing or test apparatus, including descriptions, part numbers, and setup instructions
  • Environmental conditions (temperature, humidity, atmospheric pressure) and the required tolerances
  • Equipment list with calibration status and calibration due dates
  • Sequence of operations and any dependencies between test steps
  • Data collection methodology: what measurements are taken, at what intervals, with what instruments
  • Identification of any software used for data acquisition, including version numbers

The level of detail must enable test repeatability. If two different test engineers, working from the same protocol at different times, cannot produce comparable results, the protocol is insufficient.

Acceptance Criteria

Acceptance criteria must be prospectively defined before testing begins. This is a non-negotiable regulatory requirement. Each criterion must:

  • Be linked to a specific design input requirement
  • Include a quantitative threshold wherever possible (e.g., "force shall be 15 N minimum" rather than "force shall be adequate")
  • Include the clinical, scientific, or engineering justification for the chosen threshold
  • State the pass/fail determination logic (e.g., "all samples must pass" versus "x of y samples must pass")

Sample Size Justification

ISO 13485 requires that the verification plan describe the statistical techniques and the rationale for sample sizes. A sample size justification should address:

  • The confidence level and reliability target (for attribute data) or the desired margin of error (for variable data)
  • The statistical method used (e.g., binomial probability, tolerance interval, normal distribution assumption)
  • Any historical data or assumptions about the expected failure rate
  • The relationship between sample size and risk (higher-risk design inputs may warrant larger sample sizes)

For example, a common approach for attribute testing is to use a binomial sampling plan: to demonstrate 95% reliability with 95% confidence (R95/C95), 59 samples with zero failures are required. For variable data, a tolerance interval approach based on the normal distribution is typical.

Test Setup Documentation

Document all aspects of the test setup that affect repeatability:

  • Custom fixtures: description, engineering drawings or photographs, mounting instructions
  • Equipment configuration: settings, ranges, modes of operation
  • Environmental chamber parameters if environmental conditioning is required
  • Calibration records for all measurement equipment (calibration must be current at the time of testing)
  • Software versions for any automated test systems or data acquisition tools

Deviation Handling Procedure

The protocol should include a section describing how deviations from the protocol will be handled. This is not an afterthought; it is a regulatory expectation. The procedure should address:

  • Who must be notified when a deviation occurs
  • How the deviation is documented (deviation report form, reference number)
  • The requirement for a source investigation
  • The requirement for an impact assessment on data integrity and the validity of test results
  • Who has the authority to approve continuing testing under a deviation
  • Whether re-testing is required and under what conditions

Writing Testable Design Inputs: The Foundation

A verification test protocol is only as strong as the design input it verifies. If the design input is vague, unmeasurable, or missing, the protocol cannot succeed. A testable design input has the following characteristics:

  • Specific: It states a precise requirement, not a general aspiration. "The device must be robust" is not testable. "The device housing shall withstand a 1-meter drop onto a concrete surface without functional impairment" is testable.
  • Measurable: It includes a quantity, threshold, or binary criterion that can be objectively assessed.
  • Traceable: It has a unique identifier that links it to a user need and to the verification protocol(s) that will confirm it.
  • Justified: The threshold value is supported by clinical data, industry standards, risk analysis, or engineering analysis.

Common design input failures that undermine verification:

  • Inputs that are actually user needs in disguise (e.g., "the device must be easy to use" — this is a user need, not a design input)
  • Inputs that describe a design solution rather than a requirement (e.g., "the device shall use a lithium-ion battery" — this prescribes the solution; the input should specify the performance requirement)
  • Inputs that lack acceptance criteria (e.g., "the device shall be biocompatible" without specifying which ISO 10993 tests and which endpoints must pass)

Invest time writing rigorous design inputs. Every hour spent refining inputs saves ten hours of protocol revision, re-testing, and regulatory back-and-forth.

Defining Acceptance Criteria with Clinical, Scientific, or Engineering Justification

Acceptance criteria are the pass/fail thresholds that determine whether a design output meets its corresponding design input. These must be defined prospectively — before any testing occurs — and they must be justified.

What "Prospectively Defined" Means

Prospectively defined means the acceptance criteria are documented in the approved protocol before the first test sample is evaluated. Retrofitting acceptance criteria to fit test results is one of the most serious regulatory deficiencies a manufacturer can commit. FDA investigators and Notified Body auditors specifically look for evidence that acceptance criteria were established after testing by checking protocol approval dates against test execution dates.

Sources of Justification

The justification for each acceptance criterion should draw from one or more of the following:

  • Clinical data: Published clinical literature, predicate device performance data, or clinical input from medical advisors. Example: "The peak pull-off force of 5 N minimum is based on published biomechanical data showing that forces of up to 3.5 N act on the catheter during routine patient movement (Smith et al., J. Med. Devices, 2023), providing a safety margin of approximately 40%."
  • Recognized standards: Industry consensus standards that specify performance thresholds. Example: "The minimum burst pressure of 30 bar is specified by ISO 10555-1 for intravascular catheters."
  • Risk analysis: Outputs from ISO 14971 risk management showing that the threshold adequately mitigates an identified hazardous situation.
  • Engineering analysis: Finite element analysis, tolerance stack-up calculations, or other analytical methods that predict the required performance level.
  • Predicate device comparison: For 510(k) devices, performance benchmarks established by testing the predicate device under identical conditions.

Pass/Fail Logic

Define whether the acceptance criterion applies at the individual sample level, the sample set level, or both. Common approaches:

  • All units must pass: Every individual test article must meet the criterion (used for critical safety requirements)
  • Statistical acceptance: The sample set must meet a statistical criterion (e.g., the mean minus three standard deviations must exceed the minimum threshold)
  • X of Y units must pass: A specified number of units within the sample set must meet the criterion (common for attribute testing)
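The three approaches above can be sketched in Python. The function names, the data set, and the 15 N threshold are illustrative only, not drawn from any real protocol:

```python
import statistics

def all_units_pass(values, minimum):
    """Every individual test article must meet the criterion."""
    return all(v >= minimum for v in values)

def statistical_acceptance(values, minimum):
    """Sample mean minus three standard deviations must exceed the minimum."""
    return statistics.mean(values) - 3 * statistics.stdev(values) >= minimum

def x_of_y_pass(values, minimum, required_passes):
    """At least required_passes units in the sample set must meet the criterion."""
    return sum(v >= minimum for v in values) >= required_passes

# Five hypothetical tensile results in Newtons against a 15 N minimum:
forces = [16.2, 15.8, 17.1, 15.1, 16.5]
print(all_units_pass(forces, 15.0))          # → True
print(statistical_acceptance(forces, 15.0))  # → False
```

With this data set, every unit individually clears the 15 N minimum, yet the mean-minus-three-sigma criterion fails because of the spread in the data. This illustrates why the protocol must state prospectively which determination logic applies.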

Selecting Test Methods: Testing vs. Inspection vs. Analysis vs. Demonstration

FDA and ISO 13485 recognize four categories of verification methods. Understanding when to use each — and when to combine them — is essential.

Testing

Physical or functional testing of the device under controlled conditions. This is the most common verification method and produces the most direct objective evidence. Examples include tensile testing, leak testing, electrical safety testing, dimensional measurement, and functional performance testing.

Inspection

Visual or dimensional examination of the device or its components. Inspection is appropriate for verifying design outputs that can be confirmed by direct observation or measurement. Examples include visual inspection of surface finish, dimensional inspection against drawings, and review of material certifications.

Analysis

Evaluation using mathematical or computational methods without physically testing the device. Analysis is appropriate when physical testing is impractical, prohibitively expensive, or when analytical methods provide equivalent or superior confidence. Examples include finite element analysis (FEA) for structural integrity, computational fluid dynamics (CFD) for flow characteristics, thermal analysis, and tolerance analysis.

Demonstration

Showing that the device performs its intended function under defined conditions, typically through observation rather than measurement against a quantitative threshold. Demonstration is the weakest form of verification and should be used only when other methods are not feasible, or as a supplement to other methods. Example: demonstrating that a user interface displays the correct sequence of screens during a defined workflow.

Using Multiple Verification Methods

A widely adopted best practice, though not an explicit regulatory requirement, is to use at least two verification methods for each design input where practical. For example, a structural requirement might be verified by both finite element analysis (analysis) and physical bench testing (testing). A dimensional requirement might be verified by both inspection (CMM measurement) and functional testing (fit-check with mating components). Using multiple methods provides greater confidence in the verification result and reduces the risk that a deficiency in one method goes undetected.


Sample Size Justification and Statistical Rationale

The question "how many samples do I need to test?" is one of the most frequently asked in design verification planning. The answer depends on the type of data, the risk level of the design input, and the statistical confidence you need to demonstrate.

Attribute Data (Pass/Fail)

For attribute data, where each sample is classified as pass or fail, the sample size is typically determined using a reliability and confidence framework. The binomial distribution provides the statistical basis.

To demonstrate a reliability of R at a confidence level of C with zero failures allowed in the sample, the required sample size is:

n = ln(1 - C) / ln(R)

For commonly used reliability/confidence combinations:

Reliability | Confidence | Required Sample Size (zero failures)
90%         | 90%        | 22
90%         | 95%        | 29
95%         | 90%        | 45
95%         | 95%        | 59
99%         | 95%        | 299
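The zero-failure formula is straightforward to implement, rounding up to the next whole test article. The function name is ours, not from any standard:

```python
import math

def zero_failure_sample_size(reliability, confidence):
    """n = ln(1 - C) / ln(R), rounded up to the next whole sample."""
    return math.ceil(math.log(1 - confidence) / math.log(reliability))

print(zero_failure_sample_size(0.95, 0.95))  # the common R95/C95 plan → 59
```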

Higher-risk design inputs warrant higher reliability and confidence levels, which increase the required sample size. The justification for the chosen reliability and confidence levels should be documented and should reference the risk analysis.

Variable Data (Continuous Measurements)

For variable data, where each sample yields a continuous measurement (e.g., force in Newtons, pressure in bar, dimension in millimeters), sample sizes are generally smaller than for attribute data because each measurement carries more information. Common approaches include:

  • Tolerance interval: Determine the sample size needed to establish that a specified proportion of the population falls within the acceptance limits with a given confidence level
  • Hypothesis testing: Determine the sample size needed to detect a meaningful difference between the sample mean and the acceptance limit with a specified power
  • Process capability analysis: If historical manufacturing data is available, use capability indices (Cp, Cpk) to estimate the required sample size
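As one hedged illustration of the hypothesis-testing approach, the normal-approximation formula n = ((z_alpha + z_beta) * sigma / delta)^2 estimates the sample size needed to detect a shortfall of delta relative to the acceptance limit. The z-values below are standard-normal quantiles; sigma and delta are assumptions the protocol author must justify, and the helper name is invented:

```python
import math

Z_ALPHA = 1.645  # one-sided standard normal quantile for 95% confidence
Z_BETA = 0.842   # quantile corresponding to 80% statistical power

def variables_sample_size(sigma, delta, z_alpha=Z_ALPHA, z_beta=Z_BETA):
    """Normal-approximation sample size, rounded up to a whole unit."""
    return math.ceil(((z_alpha + z_beta) * sigma / delta) ** 2)

# Assumed process standard deviation of 1.2 N; we want to detect a
# 1.0 N shortfall relative to the acceptance limit:
print(variables_sample_size(1.2, 1.0))  # → 9
```

Note how much smaller this is than the 59 units an R95/C95 attribute plan demands, which is why converting a pass/fail criterion into a continuous measurement, where feasible, can substantially reduce the test burden.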

Factors That Influence Sample Size

Several factors should be considered when determining sample size:

  • Risk level of the design input: Higher-risk requirements (those related to patient safety or device effectiveness) justify larger sample sizes
  • Expected variability: If the manufacturing process or the measurement method has high variability, larger sample sizes are needed
  • Cost and practicality: While not a statistical factor, cost constraints may influence the choice of reliability/confidence levels. Document the rationale for any trade-offs
  • Regulatory expectations: Some device-specific guidance documents or recognized standards specify minimum sample sizes for certain tests

Documenting Test Setup, Equipment, and Environmental Conditions

Repeatability is a fundamental requirement for any verification test. If the test cannot be repeated with the same results, the validity of the verification is questionable. The protocol must document everything that affects the test outcome.

Equipment and Instrumentation

List every piece of equipment used in the test, including:

  • Equipment name, manufacturer, model number, and serial number or unique identifier
  • Calibration status: date of last calibration, calibration due date, and calibration certificate reference
  • Equipment settings and ranges used during the test
  • For custom-built test equipment: detailed description, photographs, and engineering drawings

Custom Fixturing

Custom fixtures are common in device testing and must be thoroughly documented:

  • Description of the fixture's purpose and design
  • Engineering drawings with dimensions and tolerances
  • Photographs showing the fixture in use
  • Setup instructions: how the test article is mounted, aligned, and secured
  • Any preload or preload force requirements

If a fixture is critical to the test outcome and another organization or test lab needs to reproduce the test, the fixture documentation must be sufficient for them to fabricate an equivalent fixture.

Environmental Conditions

Specify and record the environmental conditions throughout testing:

  • Temperature and acceptable range (e.g., 23 ± 2 °C)
  • Relative humidity and acceptable range (e.g., 50 ± 10% RH)
  • Atmospheric pressure, if relevant
  • Any other environmental parameters specified by the test method standard

Environmental monitoring records should be maintained as part of the test data package.

Protocol Deviations and Non-Conformances: Handling Failures Properly

Things go wrong during testing. Equipment malfunctions. Samples are damaged during setup. An environmental excursion occurs. A test result fails to meet the acceptance criterion. How you document and handle these events determines whether your verification evidence remains credible.

Protocol Deviations

A deviation is any departure from the approved test protocol. Deviations must be documented with the following elements:

  • Description of the deviation: What happened, when, and during which test step
  • Reason for the deviation: Root cause or apparent cause, if known
  • Source investigation: A brief investigation into why the deviation occurred
  • Impact assessment: An evaluation of whether the deviation affects the validity of the test data, the integrity of the test results, or the ability to draw a conclusion about the design input
  • Disposition: Whether the test data from the affected samples is usable or must be discarded
  • Approvals: The deviation must be reviewed and approved by the protocol author, a quality representative, and any other relevant subject matter experts

Deviations are not inherently disqualifying. A well-documented deviation with a thorough impact assessment that concludes the data remains valid is acceptable to regulators. An undocumented deviation, or a deviation that is discovered during an audit rather than proactively reported, is a significant compliance risk.

Non-Conformances (Test Failures)

A non-conformance occurs when a test result fails to meet the predefined acceptance criterion. Non-conformances require a more rigorous response:

  1. Stop and document: Record the failure, including all raw data, photographs, and observations. Do not discard the failed sample.
  2. Root cause analysis: Perform a formal root cause analysis using an appropriate methodology (e.g., 5 Whys, fishbone diagram, fault tree analysis). The goal is to determine whether the failure is due to a design deficiency, a manufacturing defect in the test article, a test setup error, or another cause.
  3. Impact assessment: Evaluate whether the failure affects the overall conclusion of the verification and whether other design inputs may be affected.
  4. Corrective action: If the root cause is a design deficiency, the design must be corrected and the verification repeated. If the root cause is a test article defect that is not representative of the production process, document the justification and proceed with replacement testing.
  5. Re-testing: Re-testing after a non-conformance must be performed under the same approved protocol (or an approved protocol revision). The original failed test results must be retained in the DHF.

Do not re-test without performing root cause analysis. Re-testing into compliance — running tests repeatedly until a passing result is obtained without understanding why previous attempts failed — is a serious regulatory violation.


Writing the Test Report: What FDA and EU Reviewers Expect to See

The test report is the document that reviewers will actually examine. It must stand on its own, meaning a reviewer should be able to understand what was tested, how it was tested, whether it passed, and why the result is credible without needing to reference the raw data or ask the test engineer for clarification.

Essential Report Sections

Following the FDA's recommended content format for non-clinical bench testing reports:

  • Test title and identification: Clear, descriptive title, protocol number, report number, revision, and date
  • Device description: Device name, model, configuration tested, and a brief description of the test articles
  • Sample size: Number of units tested, lot/batch information, and reference to the sample size justification
  • Test method description: Summary of the test method, including applicable standards, equipment used, and environmental conditions
  • Acceptance criteria: Restatement of the prospectively defined acceptance criteria with the justification summary
  • Deviations: Description of any protocol deviations that occurred, with reference to deviation reports and impact assessments
  • Results: Presentation of test results in a clear, organized format. For variable data, report individual values, minimum, maximum, mean, and standard deviation as appropriate. For attribute data, report the number of passes and failures. State the pass/fail conclusion for each acceptance criterion.
  • Conclusion: A clear statement of whether the design input requirement has been verified, based on the test results
  • Signatures: Test performer, test reviewer, and test approver, with dates

Results Presentation Best Practices

  • Use tables to present raw data, especially for multi-sample tests. Column headers should identify the sample, the measured value, and whether it meets the acceptance criterion.
  • Use graphs where they add clarity, such as force-displacement curves, time-series data, or distribution plots.
  • Include photographs of the test setup, test articles before and after testing (especially for mechanical tests), and any observed failure modes.
  • If statistical analysis was performed, include the analysis method, assumptions, and results.
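The per-criterion statistics called for above can be produced with a short helper. The field names and the 15 N example are placeholders, not a required report format:

```python
import statistics

def summarize_results(values, minimum):
    """Individual stats plus pass/fail counts for one acceptance criterion."""
    passes = sum(v >= minimum for v in values)
    return {
        "n": len(values),
        "min": min(values),
        "max": max(values),
        "mean": round(statistics.mean(values), 2),
        "std_dev": round(statistics.stdev(values), 2),
        "passes": passes,
        "failures": len(values) - passes,
        "conclusion": "PASS" if passes == len(values) else "FAIL",
    }

print(summarize_results([16.2, 15.8, 17.1, 15.1, 16.5], 15.0))
```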

What Reviewers Flag

Common reasons FDA and EU reviewers question test reports:

  • Acceptance criteria that appear to have been adjusted to fit the data
  • Missing or insufficient sample size justification
  • Deviations that were not assessed for impact on data validity
  • Results that are inconsistent with the acceptance criteria (e.g., reporting a "pass" when some samples failed)
  • Test articles that are not representative of the production device
  • Missing calibration records for test equipment
  • Environmental conditions not documented or outside specified ranges

Traceability: Linking Design Inputs to Verification Evidence

A traceability matrix is the connective tissue between your design inputs and your verification evidence. It is a document (typically a table or a database) that maps every design input requirement to the verification protocol(s) and report(s) that address it, along with the result.

Why Traceability Matters

Without a traceability matrix, you cannot demonstrate that every design input has been verified. This is a fundamental gap that FDA investigators and Notified Body auditors will identify quickly. The traceability matrix:

  • Prevents missed design inputs: If a design input has no corresponding verification protocol, it is immediately visible
  • Prevents unnecessary testing: If a verification protocol does not map to any design input, it may be extraneous
  • Demonstrates completeness: The matrix provides a single view showing that all inputs are verified, all protocols trace to inputs, and all results support the overall design verification conclusion
  • Facilitates change impact assessment: When a design input changes, the matrix identifies which protocols and reports are affected

Structure of a Traceability Matrix

A basic traceability matrix includes the following columns:

Column | Content
Design Input ID | Unique identifier from the design input specification
Design Input Requirement | Brief description or the full text of the requirement
Risk Level | Low / Medium / High, from the risk analysis
Verification Method(s) | Testing / Inspection / Analysis / Demonstration
Protocol Number(s) | The protocol(s) that verify this input
Report Number(s) | The completed report(s) with results
Result | Pass / Fail / In Progress / Not Started
Notes | Any relevant comments, deviations, or cross-references

The traceability matrix should be maintained as a living document throughout the design project and should be reviewed at each design review milestone.
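Once the matrix is machine-readable, the two gap checks described in this section are easy to automate. The row schema and the IDs below are invented for illustration:

```python
def unverified_inputs(matrix):
    """Design inputs with no protocol, or without a passing result."""
    return [row["input_id"] for row in matrix
            if not row["protocols"] or row["result"] != "Pass"]

def orphan_protocols(matrix, all_protocols):
    """Protocols that do not trace back to any design input."""
    traced = {p for row in matrix for p in row["protocols"]}
    return sorted(set(all_protocols) - traced)

matrix = [
    {"input_id": "DI-001", "protocols": ["VP-010"], "result": "Pass"},
    {"input_id": "DI-002", "protocols": [], "result": "Not Started"},
    {"input_id": "DI-003", "protocols": ["VP-011"], "result": "In Progress"},
]
print(unverified_inputs(matrix))                                 # → ['DI-002', 'DI-003']
print(orphan_protocols(matrix, ["VP-010", "VP-011", "VP-099"]))  # → ['VP-099']
```

Running checks like these before each design review milestone surfaces unverified inputs and extraneous protocols long before an auditor does.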

Common Mistakes That Trigger FDA Additional Information Requests

The following errors are drawn from patterns observed in FDA Additional Information (AI) letters and Form 483 observations related to design verification.

Defining Acceptance Criteria After Testing

This is the single most common and most serious deficiency. FDA reviewers will compare the protocol approval date to the test execution dates. If testing began before the protocol was approved, or if there is evidence that acceptance criteria were modified after results were known, the entire verification may be questioned. The fix is simple: approve the protocol, with all acceptance criteria, before testing begins. No exceptions.

Missing or Inadequate Sample Size Justification

ISO 13485 requires that the verification plan describe the statistical techniques and the rationale for sample sizes. Stating "10 samples were tested" without explaining why 10 is sufficient is inadequate. The justification must include the statistical method, the confidence level and reliability target (or equivalent parameters), and the rationale for the chosen parameters.

Test Methods That Do Not Map to Design Inputs

Every test in every protocol must trace to a specific design input. If a protocol tests something that is not specified in the design input specification, it is not design verification — it may be exploratory testing or characterization testing, which is valuable but does not fulfill the verification requirement. Conversely, every design input must be addressed by at least one verification protocol.

Inadequate Deviation Documentation

When something goes wrong during testing, documenting it thoroughly is not optional. A deviation report must include the description, the cause, the impact assessment, and the disposition. Reviewers will specifically look for deviations and will assess whether the impact analysis was adequate.
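The four elements reviewers look for map naturally onto a structured record, which makes it easy to block closure of a deviation until every element is documented. A minimal sketch; the field names and closure rule are illustrative assumptions, not a regulatory requirement:

```python
from dataclasses import dataclass

@dataclass
class DeviationRecord:
    """Illustrative deviation record; field names are assumptions, not a standard."""
    deviation_id: str
    description: str
    cause: str = ""
    impact_assessment: str = ""
    disposition: str = ""  # e.g. "no impact, continue", "re-test", "amend protocol"

    def is_closable(self) -> bool:
        # A deviation closes only when every element a reviewer expects is filled in.
        return all([self.description, self.cause,
                    self.impact_assessment, self.disposition])

dev = DeviationRecord("DEV-001", "Chamber reached 41.3 C vs. 40 +/- 1 C setpoint")
print(dev.is_closable())  # False: cause, impact, and disposition still open
```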

Using Non-Representative Test Articles

Design verification must be performed on devices that are representative of the final production design. Prototype devices manufactured by processes that differ significantly from the intended production process may not be suitable. If non-representative test articles are used (e.g., during early development), the justification must be documented and the verification may need to be repeated on production-representative units.

Failure to Address Non-Conformances

When a test fails, simply re-testing until it passes is not acceptable. Root cause analysis is required. The original failure must be documented, investigated, and resolved before re-testing proceeds. The corrective action must address the root cause, not just the symptom.

Submitting Incomplete Test Reports to FDA via eSTAR

Since October 2023, the FDA's eSTAR template has been mandatory for 510(k) submissions. The eSTAR template includes specific sections for bench testing data, with fields for test protocol summaries and complete test reports. Submitting incomplete reports — missing sections, unsigned documents, or reports without raw data — will trigger an AI request. Before submission, verify that every test report referenced in the eSTAR template is complete, signed, and contains all required sections.
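That final completeness check can itself be made systematic. The sketch below sweeps a set of report records for missing sections and missing signatures; the section names and report contents are hypothetical examples, not the actual eSTAR field list:

```python
# Illustrative pre-submission sweep over test reports referenced in a
# submission. Section names and report contents are hypothetical examples.

REQUIRED_SECTIONS = {"objective", "test articles", "acceptance criteria",
                     "methods", "results", "conclusion"}

reports = {
    "TR-010": {"sections": set(REQUIRED_SECTIONS), "signed": True},
    "TR-011": {"sections": REQUIRED_SECTIONS - {"results"}, "signed": False},
}

incomplete = {}
for report_id, report in reports.items():
    gaps = sorted(REQUIRED_SECTIONS - report["sections"])
    if not report["signed"]:
        gaps.append("signature")
    if gaps:
        incomplete[report_id] = gaps

print(incomplete)  # {'TR-011': ['results', 'signature']}
```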


Protocol Section Checklist with Regulatory References

The following table provides a checklist for each protocol section, with the regulatory basis for each element.

| Protocol Section | Key Elements | Regulatory Reference |
| --- | --- | --- |
| Protocol header | Protocol number, revision, device identification, design input references | 21 CFR 820.30(j); ISO 13485 7.3.10 |
| Objective and scope | Specific design input being verified, scope boundaries | 21 CFR 820.30(f); ISO 13485 7.3.7 |
| Test articles | Device description, lot/batch, sample size, manufacturing process representativeness | FDA bench testing guidance; ISO 13485 7.3.7 |
| Acceptance criteria | Prospective, quantitative, justified, pass/fail logic | 21 CFR 820.30(f); FDA bench testing guidance; ISO 13485 7.3.7 |
| Sample size justification | Statistical method, confidence/reliability, risk-based rationale | ISO 13485 7.3.6; 7.3.7 |
| Test methods and procedures | Step-by-step procedures, applicable standards, data collection | FDA bench testing guidance; ISO 13485 7.3.7 |
| Test setup and equipment | Fixtures, equipment list, calibration records, environmental conditions | FDA bench testing guidance; ISO 13485 7.3.7 |
| Deviation handling | Documentation, cause investigation, impact assessment, disposition | 21 CFR 820.30(f); ISO 13485 7.3.7 |
| Results and conclusions | Raw data, statistical analysis, pass/fail determination, conclusion | FDA bench testing guidance; ISO 13485 7.3.7 |
| Signatures and approvals | Test performer, reviewer, approver with dates | 21 CFR 820.30(f); ISO 13485 4.2.4 |
| Traceability | Design input to protocol to report mapping | 21 CFR 820.30(f), (j); EU MDR Annex II 6.1 |

Common Verification Methods by Device Type

Different device types lend themselves to different verification approaches. The following table summarizes common verification methods organized by device category.

Electromechanical Devices (e.g., infusion pumps, surgical instruments, powered wheelchairs)

| Design Input Category | Typical Verification Methods | Common Standards |
| --- | --- | --- |
| Mechanical strength / structural integrity | Tensile testing, compression testing, drop testing, fatigue testing | ISO 10555, ASTM D638, IEC 60601-1 |
| Electrical safety | Dielectric strength, leakage current, grounding continuity | IEC 60601-1, IEC 62368-1 |
| Electromagnetic compatibility | Radiated/conducted emissions, immunity testing | IEC 60601-1-2 |
| Functional performance | Flow accuracy, pressure accuracy, speed measurement | Device-specific standards |
| Environmental resistance | Temperature cycling, humidity exposure, vibration | IEC 60601-1-11, ISTA 3A |
| Usability | Simulated-use testing, task analysis | IEC 62366-1 |

Software as a Medical Device / Software-Containing Devices

| Design Input Category | Typical Verification Methods | Common Standards |
| --- | --- | --- |
| Functional requirements | Unit testing, integration testing, system testing | IEC 62304, IEEE 829 |
| Input/output handling | Boundary value analysis, equivalence partitioning | IEC 62304 |
| Algorithm accuracy | Comparison to reference datasets, sensitivity analysis | IEC 62304, FDA software guidance |
| Security requirements | Penetration testing, vulnerability scanning, access control verification | IEC 62443, AAMI TIR57 |
| Performance under load | Stress testing, scalability testing | IEC 62304 |
| User interface | Code review, static analysis, traceability to requirements | IEC 62304, IEC 62366-1 |
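To make the boundary value analysis entry concrete: suppose a software design input specifies an accepted flow-rate range of 0.1 to 999.9 mL/h (a hypothetical specification). The verification tests exercise each boundary plus the nearest invalid value on either side:

```python
# Hypothetical design input: programmable flow rate accepted in the
# range 0.1-999.9 mL/h; out-of-range values must be rejected.

FLOW_MIN, FLOW_MAX = 0.1, 999.9

def accepts_flow_rate(value: float) -> bool:
    """Input-validation routine under test (illustrative)."""
    return FLOW_MIN <= value <= FLOW_MAX

# Boundary value analysis: each limit, plus the first invalid value beyond it.
cases = {
    0.0: False,     # just below lower boundary -> reject
    0.1: True,      # lower boundary -> accept
    999.9: True,    # upper boundary -> accept
    1000.0: False,  # just above upper boundary -> reject
}

for value, expected in cases.items():
    assert accepts_flow_rate(value) == expected
print("all boundary cases pass")
```

In a real protocol, each case would trace back to the design input and carry its own documented pass/fail result.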

Implantable Devices (e.g., orthopedic implants, cardiac stents, pacemakers)

| Design Input Category | Typical Verification Methods | Common Standards |
| --- | --- | --- |
| Mechanical performance | Fatigue testing, static strength testing, wear testing | ISO 7206, ASTM F1717, ISO 5832 |
| Biocompatibility | Cytotoxicity, sensitization, irritation, systemic toxicity | ISO 10993 series |
| Corrosion resistance | Pitting corrosion, galvanic corrosion, fretting corrosion | ASTM F2129, ISO 16429 |
| Dimensional accuracy | CMM measurement, optical measurement, micro-CT | Device-specific |
| Sterility and packaging | Sterility assurance, seal strength, accelerated aging | ISO 11607, ISO 11135/11137 |
| MRI compatibility | MRI safety testing, artifact evaluation, force/torque measurement | ASTM F2052, ASTM F2119, ASTM F2182 |

In Vitro Diagnostic Devices (IVDs)

| Design Input Category | Typical Verification Methods | Common Standards |
| --- | --- | --- |
| Analytical sensitivity | Limit of detection, limit of quantification | CLSI EP17, EP05 |
| Analytical specificity | Cross-reactivity, interfering substances | CLSI EP07, EP25 |
| Precision / reproducibility | Repeatability, intermediate precision, reproducibility | CLSI EP05 |
| Accuracy | Comparison to reference method, recovery studies | CLSI EP09 |
| Linearity | Dilution linearity, hook effect | CLSI EP06 |
| Stability | Real-time stability, accelerated stability, in-use stability | CLSI EP25, ICH Q1A |
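As an illustration of the analytical sensitivity row, the classical parametric approach estimates the limit of blank (LoB) from replicate blank measurements and the limit of detection (LoD) from a low-level sample, assuming normally distributed results. The measurement values below are synthetic illustration data, not from any real assay:

```python
import statistics

# Classical parametric LoB/LoD estimate (assumes normally distributed
# results). Measurement values are synthetic illustration data.

blank_results = [0.8, 1.1, 0.9, 1.2, 1.0, 0.7, 1.1, 0.9]  # blank sample replicates
low_results = [2.1, 2.6, 2.3, 2.8, 2.4, 2.2, 2.7, 2.5]    # low-level sample replicates

z = 1.645  # one-sided 95th percentile of the standard normal distribution

lob = statistics.mean(blank_results) + z * statistics.stdev(blank_results)
lod = lob + z * statistics.stdev(low_results)

print(f"LoB = {lob:.2f}, LoD = {lod:.2f}")
```

A real protocol would prespecify the number of replicates, runs, and reagent lots, and the acceptance criterion (e.g., the claimed LoD must meet or beat the estimate) before any testing begins.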

Putting It All Together: A Practical Workflow

Design verification is not an isolated activity. It is embedded in the design control process and connected to risk management, design reviews, and design transfer. Here is a practical workflow for planning and executing verification:

  1. During design planning: Identify all design inputs that will require verification. Create the initial traceability matrix. Assign verification methods to each input.
  2. During design input development: Ensure every design input is testable, with quantitative acceptance criteria and justification. Review acceptance criteria with clinical and engineering subject matter experts.
  3. Before testing begins: Write and approve the verification protocols. Confirm sample size justifications are documented. Verify that test equipment is calibrated. Confirm test articles are available and representative.
  4. During test execution: Follow the protocol. Document everything. Record all raw data, observations, and environmental conditions. If a deviation occurs, stop and document it before proceeding.
  5. After testing: Write the test reports. Review results against acceptance criteria. Document any non-conformances and perform root cause analysis. Update the traceability matrix with results.
  6. At design review: Present the verification status, including the updated traceability matrix, completed reports, open non-conformances, and any remaining verification activities.
  7. Before submission: Verify that the eSTAR template (for 510(k)) or technical file (for EU MDR) includes complete test protocol summaries and test reports for all verification activities.

Final Considerations

Design verification test protocols are not paperwork exercises. They are the primary mechanism for demonstrating that your device design meets the requirements you defined — and those requirements exist to ensure the device is safe and effective for patients. A well-written protocol, executed according to plan, with results documented in a complete and transparent test report, serves three audiences simultaneously: the engineer who needs confidence in the design, the regulator who needs evidence of compliance, and ultimately the patient who needs a device that works as intended.

The principles in this guide apply regardless of device type, risk classification, or target market. The specific standards and test methods will vary. The regulatory citations may differ between the US, the EU, and other jurisdictions. But the core expectations — prospective acceptance criteria, justified sample sizes, documented methods, thorough deviation handling, complete traceability — are universal. Get these right, and your verification documentation will stand up to any regulatory review.
