MedDeviceGuide

Design Controls for Medical Devices: FDA Requirements, Process, and Implementation Guide

The complete guide to medical device design controls — FDA 21 CFR 820.30, ISO 13485 clause 7.3, the design control process, DHF requirements, traceability, and practical implementation.

Ran Chen
2026-03-25 · 73 min read

What Are Design Controls?

Design controls are a set of interrelated practices and procedures embedded into the design and development process of a medical device. Their purpose is straightforward: ensure that the device you are building is safe, effective, and meets the needs of the people who will use it — before you manufacture and sell it.

The FDA's Design Control Guidance for Medical Device Manufacturers (1997) defines them as "an interrelated set of practices and procedures that are incorporated into the design and development process, i.e., a system of checks and balances." That definition is worth internalizing because it highlights the core idea: design controls are not a single document or a single review meeting. They are a system — a continuous, documented process that runs from the moment you define what a device needs to do until you transfer that design into production.

Every regulatory framework for medical devices in the world requires some form of design controls. In the United States, the FDA mandates them under 21 CFR 820.30. Internationally, ISO 13485:2016 addresses the same territory under clause 7.3 (Design and Development). In the European Union, the EU MDR (Regulation 2017/745) does not spell out a standalone design control requirement, but compliance with its Annex II technical documentation requirements and the expectation of an ISO 13485-compliant QMS means you are performing design controls in practice.

Design controls exist because the medical device industry learned, painfully, what happens without them. Before the Safe Medical Devices Act of 1990 introduced design control requirements in the US, there was no regulatory mechanism to ensure that manufacturers systematically verified and validated their designs before releasing products. The consequences — device failures, patient injuries, and costly recalls — made the case clear. The FDA formalized design control requirements in the 1996 Quality System Regulation (QSR), drawing on ISO 9001 principles and adapting them for the unique risk profile of medical devices.

The inclusion of design controls as a regulatory requirement represented a pivotal philosophical shift: quality cannot be inspected into a finished product — it must be designed in from the beginning. Rather than relying on end-of-line testing to catch defects, design controls require that quality, safety, and effectiveness are systematically built into every stage of the design process.

Why Design Controls Matter

Design controls matter for three interconnected reasons:

Patient safety. Medical devices interact with patients in ways that can cause serious harm if the design is flawed. A design input that fails to account for electromagnetic interference in an operating room environment, a software algorithm that was never validated with real clinical data, a sterilization process that was never verified against the device's material compatibility — any of these gaps can lead to adverse events. Design controls are the systematic mechanism for catching these gaps before devices reach patients.

Regulatory compliance. Without adequate design controls, your device cannot legally be marketed in the US, the EU, or most other regulated markets. The FDA will cite you on a Form 483 and potentially issue a warning letter. A Notified Body will flag nonconformities during your ISO 13485 audit. Design control deficiencies have been among the top three most-cited categories in FDA inspections for over a decade.

Business efficiency. There is a persistent myth that design controls slow down development. The opposite is true when they are implemented well. Design controls force teams to define requirements clearly before committing to a design, catch errors early through structured reviews and verification testing, and prevent the late-stage rework that actually derails timelines.

The financial data is compelling: late-stage design changes cost 10-100x more than modifications made during the requirements phase. Poor design controls can add 6-18 months to development timelines. Recall costs range from approximately $3-5 million for routine component fixes to $600 million or more for systemic design flaws. More than half of all FDA device recalls — and the majority of Class I recalls — trace to design control gaps that better risk analysis, verification, and validation would have caught. A properly implemented design control process is not an overhead cost; it is an investment with a typical 3-10x return.

When Are Design Controls Required?

In the US, design controls under 21 CFR 820.30 are required for:

  • All Class III medical devices (PMA pathway)
  • All Class II medical devices (510(k) pathway)
  • Certain Class I devices that are listed in 21 CFR 820.30(a)(2), including any Class I device automated with computer software and specific device types such as surgeon's gloves and tracheobronchial suction catheters

The common misconception is that Class I devices are exempt from design controls. Most are — but not all. If your Class I device involves software or is specifically listed in the regulation, design controls apply.

Under ISO 13485:2016, design and development controls apply to any organization that performs design and development activities for medical devices. If you are a contract manufacturer that only follows someone else's specifications, you may exclude clause 7.3, but that exclusion must be justified and documented.

Legal and Regulatory Basis

FDA 21 CFR 820.30 — Design Controls

Section 820.30 of the FDA's Quality System Regulation is the primary US regulation governing design controls. It contains ten subsections, each addressing a specific element of the design control process:

Subsection Element Core Requirement
820.30(a) General Applicability — Class II, Class III, and listed Class I devices
820.30(b) Design and development planning Establish and maintain plans describing design activities and responsibilities
820.30(c) Design input Document requirements relating to the device, including intended use, user needs, and patient needs
820.30(d) Design output Define and document outputs that meet input requirements and include acceptance criteria
820.30(e) Design review Conduct formal documented reviews at appropriate stages with cross-functional participation
820.30(f) Design verification Confirm design outputs meet design input requirements
820.30(g) Design validation Ensure the device conforms to user needs and intended uses under actual or simulated conditions
820.30(h) Design transfer Ensure the design is correctly translated into production specifications
820.30(i) Design changes Identify, document, validate or verify, review, and approve changes before implementation
820.30(j) Design history file Maintain a DHF containing records that demonstrate compliance with the design plan

Each subsection is concise but specific. The regulation tells you what you must do (document design inputs, perform verification, maintain a DHF) without dictating how to do it. That flexibility is intentional — the FDA expects manufacturers to develop procedures appropriate for their device complexity and risk level.

ISO 13485:2016 Clause 7.3 — Design and Development

ISO 13485 addresses design controls under clause 7.3, using the terminology "design and development" rather than "design controls." The structure closely parallels 21 CFR 820.30:

ISO 13485 Clause Element
7.3.1 General
7.3.2 Design and development planning
7.3.3 Design and development inputs
7.3.4 Design and development outputs
7.3.5 Design and development review
7.3.6 Design and development verification
7.3.7 Design and development validation
7.3.8 Design and development transfer
7.3.9 Control of design and development changes
7.3.10 Design and development files

ISO 13485 adds several emphases that 21 CFR 820.30 addresses less explicitly:

  • Risk management integration — Clause 7.3.3 explicitly requires risk management outputs as inputs to the design process, referencing ISO 14971
  • Usability — Design inputs must address usability requirements
  • Interconnected devices — Verification and validation must address scenarios where the device connects to or interfaces with other medical devices
  • Statistical techniques — Verification and validation plans must describe statistical techniques used and the rationale for sample sizes

QMSR — What Changed for Design Controls

The FDA's Quality Management System Regulation (QMSR), which became effective on February 2, 2026, represents the most significant change to the US medical device quality system framework in three decades. The QMSR amends 21 CFR Part 820 by incorporating ISO 13485:2016 by reference, replacing the legacy QSR text.

For design controls specifically, the practical impact is:

What stays the same:

  • Design controls are still required for Class II, Class III, and listed Class I devices
  • The core elements (planning, inputs, outputs, reviews, verification, validation, transfer, changes, DHF) remain
  • The record-keeping expectations behind the DHF, DMR, and DHR carry forward, though the QMSR largely maps them onto ISO 13485's medical device file and records requirements rather than retaining them as separately defined terms

What changes:

  • ISO 13485 clause 7.3 is now the authoritative text for design and development requirements in the US
  • Risk management integration (aligned with ISO 14971) becomes more explicit and pervasive
  • Terminology shifts — "design and development" replaces "design controls" in the standard itself, though the FDA continues to use both terms
  • Documentation expectations increase around risk-based design inputs and outputs
  • Manufacturers who already maintain ISO 13485 certification and comply with 21 CFR 820.30 should find the transition manageable, since the requirements are substantively aligned

Practical impact: If your QMS already complies with both ISO 13485 and the old 21 CFR 820, the QMSR should not require major changes to your design control procedures. The primary work is updating procedure references, training staff on terminology changes, and ensuring your risk management integration meets the enhanced expectations. See our full guide: QSR to QMSR Transition.

EU MDR (Regulation 2017/745)

The EU MDR does not contain a dedicated "design controls" section equivalent to 21 CFR 820.30 or ISO 13485 clause 7.3. Instead, design control expectations are embedded across multiple provisions:

  • Article 10.9 requires manufacturers to establish, document, implement, and maintain a QMS that addresses product realization, including planning, design, development, production, and service provision
  • Annex II (Technical Documentation) requires documentation of design and manufacturing information, including the complete design process from user needs through final design specifications, verification, and validation
  • Annex I (General Safety and Performance Requirements) defines the safety and performance outcomes that design controls must ultimately achieve

In practice, compliance with ISO 13485 clause 7.3 satisfies the EU MDR's design control expectations. This is why the EN ISO 13485:2016 harmonized standard is the primary tool for demonstrating conformity with the MDR's QMS requirements.

Global Regulatory Landscape — Beyond FDA and EU

Design controls are not limited to the US and EU. Every major regulatory market requires some form of design and development controls, and international harmonization efforts have increasingly aligned these expectations:

  • Canada (Health Canada) — Formerly required ISO 13485 certification through the Canadian Medical Devices Conformity Assessment System (CMDCAS); since January 1, 2019, certification is obtained through the Medical Device Single Audit Program (MDSAP), which replaced CMDCAS. MDSAP audits cover design controls as part of the ISO 13485 assessment and are accepted by multiple regulatory authorities simultaneously.
  • Australia (TGA) — The Therapeutic Goods Administration requires compliance with Australian regulatory requirements, which are aligned with ISO 13485. MDSAP audit results are accepted by the TGA.
  • Japan (PMDA) — Japan participates in MDSAP and requires QMS compliance aligned with ISO 13485 for medical device manufacturers.
  • Brazil (ANVISA) — Also a MDSAP participant, requiring ISO 13485 compliance for design and development processes.

The practical implication is clear: ISO 13485 clause 7.3 is the global lingua franca for design controls. If your QMS is built on ISO 13485 and your design control procedures satisfy clause 7.3, you have the foundation for regulatory compliance in virtually every major market. The FDA's QMSR transition, which incorporates ISO 13485 by reference, further reinforces this convergence. Companies pursuing a global regulatory strategy should build their design control procedures to satisfy ISO 13485 clause 7.3 as the baseline, with FDA-specific additions (DHF, DMR, DHR, complaint file requirements, UDI) layered on top.

Product Development vs. Design Controls

A common source of confusion — especially for engineers and product managers new to the medical device industry — is the relationship between the product development process and design controls. Are they the same thing?

They are related but distinct. Product development is the broader activity encompassing everything required to bring a device from concept to market: business planning, budgeting, timeline management, marketing strategy, clinical strategy, regulatory strategy, manufacturing scale-up, and more. Design controls are the regulatory-required subset of product development that documents and verifies the design process to demonstrate the device is safe, effective, and meets user needs.

Design controls fit within the product development process. A well-designed product development process incorporates design control activities at the appropriate stages, but it also includes business and operational activities that fall outside the scope of 21 CFR 820.30 or ISO 13485 clause 7.3.

The key point: do not confuse your product development project plan with your design control plan. The design control plan (Design and Development Plan) is a regulatory document that describes the design control activities, responsibilities, and deliverables. The product development project plan is a broader business document. Both are needed, and the design control plan should be referenced within or aligned with the overall project plan.

The Design Control Process — Phase by Phase

User Needs — The Starting Point

Before any formal design activity begins, you must understand and document the needs of the people who will use the device and the patients it will serve. User needs are the foundation upon which the entire design control process is built — they are the starting point of the waterfall diagram and the benchmark against which design validation ultimately confirms success.

User needs capture what the device must accomplish from the perspective of its stakeholders — clinicians, patients, caregivers, technicians, and other users. They are stated in the user's language, not in engineering specifications. "The surgeon needs to visualize the surgical site clearly during minimally invasive procedures" is a user need. "The display shall provide 1920x1080 resolution at 60 Hz" is a design input derived from that user need.

Methods for gathering user needs:

  • Clinical observations — Shadowing users in the clinical environment to understand workflows, pain points, and unmet needs
  • User interviews and focus groups — Structured conversations with representative users to elicit needs and preferences
  • Literature review — Published clinical studies, guidelines, and standards that define clinical needs and performance expectations
  • Competitive analysis — Evaluating existing devices on the market to understand what works, what does not, and what gaps remain
  • Voice of the customer (VOC) surveys — Systematic surveys of target users and key opinion leaders
  • Post-market feedback — For next-generation devices, complaint data, field experience, and post-market surveillance from existing products

User needs should be documented in a User Needs Document (UND) or equivalent, reviewed with clinical and cross-functional stakeholders, and formally approved before being translated into design inputs. The traceability matrix begins here — each user need receives a unique identifier that will be traced forward through design inputs, outputs, verification, and validation.

Tip: User needs gathering often consumes a significant portion of the early project timeline — in some cases up to a third of total project time for complex devices. This is time well spent. Inadequate user needs definition is a root cause of design validation failures, because the design team builds a device that meets engineering specifications but fails to address actual clinical workflows or user expectations.

Design Planning

Design planning is the foundation of the entire design control process. Before any design work begins, you must establish a plan that describes:

  • The design and development activities to be performed
  • Responsibilities and authorities for each activity
  • The interfaces between different groups or functions involved in the design
  • The review, verification, and validation activities planned at each stage
  • Resource requirements (personnel, equipment, facilities)
  • The timeline and milestones

The plan is a living document. As the design progresses, the plan must be updated to reflect changes in scope, timeline, resources, or approach. Both 21 CFR 820.30(b) and ISO 13485 clause 7.3.2 require that the plan be reviewed and updated as the design evolves.

A common mistake is treating the design plan as a formality — a document created at the start of the project and never revisited. This is a recipe for 483 observations. The plan should be the project's single source of truth for how design controls will be executed, and it should be actively managed throughout development.

Tip: Your design plan does not need to be a single monolithic document. Many companies use a Design and Development Plan (DDP) as the top-level document and reference subordinate plans (verification plan, validation plan, risk management plan) for detail. What matters is that the full scope of design control activities is planned, documented, and traceable.

Design Input

Design inputs are the documented requirements that define what the device must do. They are derived from user needs, patient needs, regulatory requirements, applicable standards, risk management outputs, and business requirements.

The regulatory requirement is specific: inputs must be documented, reviewed, and approved. They must address the intended use of the device. And there must be a mechanism for resolving incomplete, ambiguous, or conflicting requirements (21 CFR 820.30(c)).

Design inputs typically include:

Category Examples
Functional requirements Operating parameters, measurement accuracy, throughput
Performance requirements Reliability targets, operating life, response time
Safety requirements Biocompatibility, electrical safety, mechanical strength
Usability requirements User interface, ergonomics, training requirements
Regulatory requirements Applicable standards (IEC 60601, IEC 62304), essential performance
Environmental requirements Operating temperature range, humidity, altitude, EMC immunity
Packaging and labeling Sterile barrier, labeling content, shelf life
Risk control measures Requirements derived from the risk analysis (ISO 14971)

Each design input must be specific enough to be verifiable. "The device must be easy to use" is not a valid design input. "The device must be operable by a clinician wearing standard surgical gloves, with task completion within 30 seconds for the primary use case" is.

Common pitfall: Failing to include risk control measures as design inputs. When your risk analysis identifies that a specific hazard must be mitigated through a design feature (e.g., an alarm threshold, a mechanical interlock, a software timeout), that mitigation becomes a design input requirement. If you do not formally capture it as a design input, it cannot be systematically verified and validated.

Example: Translating User Needs to Design Inputs

Consider a portable pulse oximeter. A user need might be stated as: "The clinician needs to quickly assess a patient's blood oxygen saturation in an emergency department setting."

From this single user need, multiple design inputs would be derived:

User Need Element Design Input Requirement
Quick assessment SpO2 reading displayed within 10 seconds of sensor placement
Emergency department setting Device must function in ambient temperatures of 15-40 °C and humidity of 15-90% RH
Emergency department setting Display readable at viewing angles up to 45 degrees under ambient light of 100-2000 lux
Clinician use Operable with one hand; total weight not to exceed 350 grams including batteries
Blood oxygen saturation SpO2 accuracy of +/- 2% over the range of 70-100% SpO2 (per ISO 80601-2-61)
Patient safety (risk control) Audible and visual alarm when SpO2 falls below configurable threshold (default 90%)

Each of these design inputs is specific, measurable, and verifiable — which is exactly what distinguishes a proper design input from a vague user need statement.
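The risk-control input in the table above (an alarm when SpO2 falls below a configurable threshold) maps directly to a testable acceptance criterion. The sketch below is purely illustrative — the function name and default value are assumptions drawn from the hypothetical table, and real device software would be developed and verified under IEC 62304:

```python
DEFAULT_SPO2_ALARM_THRESHOLD = 90.0  # percent; default from the hypothetical design input above


def spo2_alarm_active(spo2_reading: float,
                      threshold: float = DEFAULT_SPO2_ALARM_THRESHOLD) -> bool:
    """Return True when the audible/visual alarm condition is met.

    The design input specifies an alarm when SpO2 falls *below* the
    configurable threshold, so the comparison is strict.
    """
    if not 0.0 <= spo2_reading <= 100.0:
        raise ValueError("SpO2 must be a percentage between 0 and 100")
    return spo2_reading < threshold
```

The point is not the code itself but the property it demonstrates: because the input is specific and measurable, a verification protocol can exercise it directly (88% triggers the alarm, 95% does not).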

Design Output

Design outputs are the results of the design process. They are the tangible deliverables that respond to and fulfill the design input requirements. Design outputs must be documented in terms that allow evaluation against the design inputs, and they must include or reference acceptance criteria.

Typical design outputs include:

  • Device specifications and drawings
  • Software requirements specifications, architecture documents, and source code
  • Bill of materials (BOM)
  • Manufacturing process specifications
  • Packaging specifications
  • Labeling content and artwork
  • Risk management file outputs (updated FMEA, risk/benefit analysis)
  • Test protocols (verification and validation)

The regulation requires that design outputs be reviewed and approved before release (21 CFR 820.30(d)). ISO 13485 clause 7.3.4 adds that outputs must provide information necessary for purchasing, production, and service activities and must specify product characteristics essential for safe and proper use.

The relationship between design inputs and design outputs is the backbone of the design control traceability. Every input must have a corresponding output, and every output must trace back to one or more inputs.
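That input-to-output mapping is usually maintained as a traceability matrix. A minimal sketch, using invented requirement IDs, of the kind of completeness check a team might run before a design review:

```python
# Hypothetical traceability matrix: design input IDs -> design output IDs.
# All IDs below are invented for illustration.
input_to_outputs = {
    "DI-001": ["DO-010"],            # accuracy requirement -> sensor module spec
    "DI-002": ["DO-011", "DO-012"],  # timing requirement -> firmware + display specs
    "DI-003": [],                    # gap: no output yet addresses this input
}


def untraced_inputs(matrix):
    """Return design input IDs with no corresponding design output."""
    return [di for di, outputs in matrix.items() if not outputs]


print(untraced_inputs(input_to_outputs))  # ['DI-003']
```

In practice this lives in a requirements-management tool rather than a script, but the invariant is the same: every input traces forward to at least one output, and every output traces back to at least one input.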

Design Review

Design reviews are formal, documented evaluations conducted at planned stages of the design process. They serve as decision gates — checkpoints where the design team and independent reviewers assess whether the design is progressing adequately and whether identified issues have been resolved.

Key requirements for design reviews:

  • Planned stages: Reviews must occur at pre-defined milestones, not just at the end of the project. Common review points include concept review, preliminary design review (PDR), critical/detailed design review (CDR), and pre-production review.
  • Cross-functional participation: Reviews must include representatives of all functions concerned with the design stage being reviewed, plus at least one individual who does not have direct responsibility for that stage (21 CFR 820.30(e)). This ensures independent perspective.
  • Documented results: The review results, including the identification of the design, the date, and the participants, must be documented in the DHF.
  • Action items: Any issues identified must be tracked to resolution with documented follow-up.

Design reviews are not the same as casual team meetings or status updates. They are formal evaluations with defined agendas, documented outcomes, and decision authority. The distinction matters during FDA inspections — investigators will specifically ask to see design review records and will assess whether they meet the regulatory definition of a formal review.

A well-documented design review record should include:

Record Element What to Document
Design identification Device name, project number, design revision/version under review
Review type Concept review, PDR, CDR, pre-validation review, etc.
Date and location When and where the review was held
Attendees Names, titles, and functions of all participants; identification of the independent reviewer
Agenda Topics and documents reviewed
Decisions Go/no-go decision for proceeding to the next phase
Action items Issues identified, assigned owners, target completion dates
Risk status Summary of current risk analysis status and any new hazards identified
Signatures Approval signatures of review participants

Tip: The "independent reviewer" requirement is one of the most commonly misunderstood provisions. 21 CFR 820.30(e) requires that design reviews include "an individual(s) who does not have direct responsibility for the design stage being reviewed." This does not mean an external consultant. It means someone from your organization — perhaps a quality engineer, a regulatory specialist, or an engineer from a different project — who is not part of the core design team for the stage under review. Their role is to provide objective assessment.

Design Verification

Design verification confirms that the design outputs meet the design input requirements. The fundamental question is: did we build the device correctly?

Verification methods include:

  • Testing — Physical tests, software tests, electrical tests, mechanical tests
  • Inspection — Visual examination, dimensional measurement
  • Analysis — Engineering calculations, finite element analysis, simulation
  • Demonstration — Showing that the device performs a function as specified

Verification is performed throughout the design process, not just at the end. As design outputs are produced — a PCB layout, a software module, a mechanical component — they are verified against their corresponding design input specifications.

The results must be documented in the DHF, including the design identification, the verification method, the date, the individual(s) performing the verification, and the results (21 CFR 820.30(f)).

ISO 13485 clause 7.3.6 adds two important requirements that go beyond the FDA's text:

  1. Verification plans must define acceptance criteria and statistical techniques, including the rationale for sample sizes
  2. If the device is intended to connect or interface with other medical devices, verification must confirm that outputs meet inputs under those interconnected conditions
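On the sample-size rationale point: one widely used (though not mandated) approach for zero-failure attribute testing is the success-run formula, n = ln(1 - C) / ln(R), where C is the desired confidence and R the reliability to be demonstrated. A minimal sketch:

```python
import math


def success_run_sample_size(confidence: float, reliability: float) -> int:
    """Zero-failure attribute sample size: n = ceil(ln(1 - C) / ln(R))."""
    if not (0.0 < confidence < 1.0 and 0.0 < reliability < 1.0):
        raise ValueError("confidence and reliability must be in (0, 1)")
    return math.ceil(math.log(1.0 - confidence) / math.log(reliability))


print(success_run_sample_size(0.95, 0.95))  # 59  (the common "95/95" plan)
print(success_run_sample_size(0.95, 0.99))  # 299 (the "95/99" plan)
```

Whatever technique you choose, the clause 7.3.6 expectation is that the rationale — formula, assumptions, and chosen confidence/reliability levels — is documented in the verification plan, not decided ad hoc at the bench.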

For a deeper exploration of verification methods, documentation requirements, and common mistakes, see our full guide: Design Verification vs. Design Validation.

Design Validation

Design validation confirms that the final device meets user needs and intended uses under actual or simulated conditions of use. The fundamental question is: did we build the right device?

Key regulatory requirements for design validation:

  • Production-equivalent units: Validation must be performed on initial production units, lots, or batches, or their equivalents (21 CFR 820.30(g)). Testing prototypes built by hand in the lab generally does not satisfy this requirement. The units must be representative of what will actually be manufactured and sold.
  • Actual or simulated use conditions: Validation testing must reflect real-world conditions. A bench test in a controlled lab environment is verification, not validation. Validation requires conditions that represent how the device will actually be used — by the intended users, in the intended environment, for the intended purpose.
  • Software validation: When the device includes software, software validation must be included as part of design validation.
  • Risk analysis: Design validation must include risk analysis where appropriate, confirming that the risk controls identified during risk management are effective in the finished device.
  • Completion before transfer: ISO 13485 clause 7.3.7 explicitly requires that validation be completed prior to delivery or implementation of the medical device.

Validation activities typically include usability testing (formative and summative), simulated-use studies, clinical evaluations (where required), biocompatibility testing, sterility validation, EMC testing, and software system-level validation.

Common Validation Activities by Device Type

Device Type Typical Validation Activities
Implantable device Biocompatibility testing (ISO 10993), mechanical fatigue testing, simulated-use studies with clinicians, clinical evaluation/investigation
Electromechanical device EMC testing (IEC 60601-1-2), electrical safety testing (IEC 60601-1), simulated-use usability study, performance testing under environmental extremes
IVD device Analytical performance validation (precision, accuracy, linearity), clinical performance study, usability testing with intended operators
Software/SaMD System-level software validation, usability study (IEC 62366-1), cybersecurity testing, clinical validation of algorithm performance
Sterile device Sterility validation, package integrity testing, shelf life/stability studies, biocompatibility of sterilized materials

Key distinction: Verification tests individual specifications against design inputs. Validation tests the complete device against user needs. Verification can happen on components and subsystems throughout development. Validation happens on the finished, production-representative device. Both are required. Neither replaces the other. For more detail, see Design Verification vs. Design Validation.

Design Transfer

Design transfer is the process of translating the finalized, validated design into production specifications. It ensures that the device can be consistently and reliably manufactured at scale, maintaining all the safety and performance characteristics that were verified and validated during development.

21 CFR 820.30(h) states simply: "Each manufacturer shall establish and maintain procedures to ensure that the device design is correctly translated into production specifications."

ISO 13485 clause 7.3.8 provides more detail, requiring that:

  • Transfer procedures verify that design outputs are suitable for manufacturing before being finalized as production specifications
  • Production capability is confirmed — the manufacturing process can consistently produce devices that meet the product requirements
  • Results and conclusions of the transfer process are recorded

Design transfer deliverables typically include:

  • Device Master Record (DMR) — the complete set of manufacturing instructions, drawings, specifications, and procedures
  • Process validation records (IQ, OQ, PQ for manufacturing processes)
  • Incoming inspection specifications for components and materials
  • Production test procedures and acceptance criteria
  • Packaging and labeling specifications
  • Training records for production personnel

A "design freeze" typically precedes or coincides with design transfer: the point at which the design is considered complete and further changes require formal change control.

Design Freeze — When and How

The concept of a "design freeze" (sometimes called "design lock") marks the formal transition point where the design is considered complete and any further modifications require formal change control. While not a regulatory term defined in 21 CFR 820.30 or ISO 13485, it is a near-universal industry practice.

A typical design freeze process includes:

  1. Pre-freeze assessment — Confirm all verification testing is complete, all design review action items are closed, and the risk management file is current
  2. Freeze approval — Formal sign-off by design engineering, quality, regulatory, and manufacturing representatives
  3. Baseline establishment — All design output documents are baselined (revision-locked) in the document control system
  4. Change control activation — From this point forward, all changes to the design require a formal Engineering Change Order (ECO) with documented impact assessment, review, and approval
  5. Validation initiation — Design validation begins on the frozen design using production-equivalent units

Some companies implement multiple freeze points — a "soft freeze" that restricts major architecture changes while allowing minor refinements, followed by a "hard freeze" where no changes are permitted without full change control.

Tip: Design transfer is not a single event. It is a process that should be planned from the beginning of the project. Companies that treat design transfer as an afterthought consistently run into manufacturing problems — processes that cannot hold tolerances, materials that are not available in production quantities, inspection methods that do not scale. Planning for manufacturability should start during design input definition.

Common anti-pattern — "throwing it over the wall": One of the most frequent design transfer failures occurs when engineering completes the design in isolation and then "throws it over the wall" to manufacturing without adequate collaboration. Manufacturing receives specifications they had no input on, discovers issues with producibility, and the result is delays, rework, and quality problems. The remedy is to involve manufacturing engineering early in the design process — during design input definition and throughout design reviews — so that design-for-manufacturability concerns are addressed before the design is frozen.

Design Changes

Any modification to a design after it has been established — whether during development or after the product is on the market — must be controlled through a formal design change process. This applies to changes in specifications, drawings, materials, software, manufacturing processes, or any other element of the design.

21 CFR 820.30(i) requires that design changes be identified, documented, validated (or verified, where appropriate), reviewed, and approved before implementation.

ISO 13485 clause 7.3.9 adds that the review of design changes must include an evaluation of:

  • The effect of the change on constituent parts, the product in process, and products already delivered
  • The impact on risk management inputs/outputs
  • The effect on product realization processes

Design changes are one of the most frequently cited areas in FDA inspections. Out of 178 design control citations on Form 483s in FY2020, 26 concerned design changes — approximately 15% of all design control observations. Common failures include making changes without documenting them, implementing changes before completing the required review and approval, and failing to assess the impact of changes on previously completed verification and validation.

Regulatory trigger: If a design change to a previously cleared device significantly affects safety, effectiveness, or intended use, this may trigger the need for a new 510(k) submission. The FDA's guidance "Deciding When to Submit a 510(k) for a Change to an Existing Device" provides a decision framework, but the assessment must be documented as part of the design change record. Failure to identify changes that require a new submission is both a design control violation and a regulatory marketing violation.
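
The impact assessment a design change record must document is, at bottom, a walk over the traceability links between design elements. The sketch below illustrates that traversal in Python; the element IDs and links are invented for illustration, and a real implementation would pull them from your requirements management tool or traceability matrix.

```python
# Hypothetical trace links: each design element maps to the elements
# directly downstream of it (input -> outputs -> verifications -> validations).
trace_links = {
    "DI-003": ["DO-012", "DO-013"],
    "DO-012": ["VP-008"],
    "DO-013": ["VP-008"],
    "VP-008": ["VL-003"],
}

def affected_by(changed_id: str) -> set[str]:
    """Collect every element downstream of a changed design element."""
    impacted, frontier = set(), [changed_id]
    while frontier:
        current = frontier.pop()
        for downstream in trace_links.get(current, []):
            if downstream not in impacted:
                impacted.add(downstream)
                frontier.append(downstream)
    return impacted

# Changing design input DI-003 flags two outputs, a verification
# protocol, and a validation report for re-assessment.
print(sorted(affected_by("DI-003")))  # -> ['DO-012', 'DO-013', 'VL-003', 'VP-008']
```

Every flagged element then needs a documented decision: re-verify, re-validate, or justify why the change does not affect it.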

The Waterfall Diagram and V-Model

The FDA Waterfall Diagram

The most recognizable visual in medical device design controls is the "waterfall diagram" from the FDA's 1997 Design Control Guidance. Originally adapted from Health Canada and ISO 9001:1994 concepts, this diagram shows the design process flowing sequentially from user needs through design input, design process (output), design verification, design validation, and ultimately to medical device production — with feedback loops connecting each stage back to earlier stages.

The waterfall model illustrates several critical concepts:

  • User needs drive the process — Everything starts with understanding what the device must accomplish for its users
  • Inputs flow to outputs — Design inputs are translated into design outputs through the design process
  • Verification links outputs to inputs — Confirming that outputs satisfy the input specifications
  • Validation links the finished device to user needs — Confirming that the device fulfills its intended purpose
  • Feedback loops exist — The arrows flowing backward show that issues discovered at any stage can trigger revisions to earlier stages
  • Design review spans the entire process — An overarching bar across the top indicates that reviews occur throughout, not just at a single point

However, the FDA's own guidance explicitly cautions against taking the waterfall model too literally. The guidance states: "Although the Waterfall Model is a useful tool for introducing design controls, its usefulness in practice is limited. The model does apply to the development of some simpler devices. However, for more complex devices, a concurrent engineering model is more representative of the design processes in use in the industry."

The waterfall diagram is pedagogical. It teaches the relationships between design control elements. It does not prescribe how your development process must be organized.

Important misconception: Many professionals mistakenly believe that the FDA requires a waterfall development process. This is incorrect. The FDA requires that design control elements be performed and documented. The waterfall diagram is an explanatory tool, not a mandated methodology. Using agile, spiral, concurrent engineering, or any other development approach is fully acceptable as long as design control activities are completed and documented.

The V-Model

The V-model is an alternative visualization that many companies find more practical. It arranges the design process in a V shape:

  • The left side of the V represents the definition phases (user needs, design inputs, detailed design)
  • The bottom of the V represents implementation
  • The right side of the V represents the testing phases (verification, validation)

The key insight of the V-model is the explicit mapping between left-side and right-side activities:

  • User needs ↔ Design validation
  • Design inputs (system requirements) ↔ Design verification (system-level)
  • Subsystem/component specifications ↔ Subsystem/component testing
  • Detailed design ↔ Unit testing

Each definition stage on the left has a corresponding testing stage on the right. This mapping reinforces traceability — the test plans on the right side are developed based on the requirements defined on the left side. It also encourages teams to plan their testing strategy early, not as an afterthought.

The V-model is particularly useful for software-intensive devices because it aligns well with the software development lifecycle defined in IEC 62304. The software architecture and detailed design on the left map to integration testing and system testing on the right, creating a clear path from requirements to evidence.

Comparison of Development Models

  • Waterfall — Structure: linear, sequential phases. Testing planned: after the design is complete. Flexibility for changes: low (changes require going back to earlier phases). Best suited for: simple devices with well-understood requirements. Regulatory fit: straightforward mapping to 820.30.
  • V-Model — Structure: V-shaped, with parallel definition and testing branches. Testing planned: during requirements definition. Flexibility for changes: moderate (changes affect both sides of the V). Best suited for: software-intensive devices and systems engineering. Regulatory fit: natural traceability structure.
  • Agile/Iterative — Structure: short iterations (sprints) with continuous delivery. Testing planned: continuously, each sprint. Flexibility for changes: high (changes incorporated each sprint). Best suited for: software/SaMD development and rapid prototyping. Regulatory fit: requires careful documentation discipline.
  • Hybrid (V-Model + Agile) — Structure: V-model gates with agile execution within phases. Testing planned: at the requirements stage, executed iteratively. Flexibility for changes: moderate to high. Best suited for: most modern medical device development. Regulatory fit: balances agility with regulatory evidence.

Which Model Should You Use?

Neither the FDA nor ISO 13485 prescribes a specific development model. The regulations define what must be accomplished (planning, inputs, outputs, reviews, verification, validation, transfer, change control), not how the development process must be structured.

In practice, most modern medical device development uses an iterative or hybrid approach — elements of the V-model combined with agile-inspired iteration within each phase. The critical requirement is that regardless of your development methodology, all design control elements are executed and documented. Your design plan should describe the approach you are using and map the design control activities to your project's phases and milestones.

Practical advice: If you are developing a software-intensive device or SaMD, the V-model combined with agile sprints is typically the most effective approach. Use the V-model structure for your overall project architecture and regulatory milestones, and use agile practices (sprints, daily standups, continuous integration) for execution within each phase. Your design plan should document this hybrid approach and explain how design control activities map to your sprint cadence and release cycles.

The Design History File (DHF)

What Is a DHF?

The Design History File is a compilation of records that documents the entire design and development history of a medical device. It demonstrates that the device was designed and developed in accordance with the approved design plan and regulatory requirements.

21 CFR 820.30(j) requires: "Each manufacturer shall establish and maintain a DHF for each type of device. The DHF shall contain or reference the records necessary to demonstrate that the design was developed in accordance with the approved design plan and the requirements of this part."

ISO 13485 clause 7.3.10 uses the term "design and development file" and states that organizations must maintain a file for each medical device type or medical device family that includes or references records generated to demonstrate conformity with the requirements for design and development.

What Goes in the DHF?

The DHF is not a single document — it is a collection of (or an organized reference to) all design control records. A complete DHF typically includes:

  • Planning — Design and development plan (DDP), project charter, resource plans
  • User needs — User needs document, clinical user research, stakeholder requirements
  • Design inputs — Design input requirements (system requirements specification, URS), standards applicability matrix
  • Design outputs — Device specifications, drawings, BOM, software requirements/architecture, labeling
  • Risk management — Risk management plan, hazard analysis, FMEA, risk/benefit analysis, risk management report
  • Design reviews — Design review meeting minutes, action items, attendance records, decisions
  • Verification — Verification plan, test protocols, test reports, analysis reports
  • Validation — Validation plan, test protocols, test reports, usability study reports, clinical data
  • Design transfer — Transfer plan, process validation records, manufacturing readiness assessment
  • Design changes — Change requests, change orders, impact assessments, re-verification/re-validation records
  • Regulatory — Standards compliance matrix, predicate comparison (for 510(k)), essential requirements checklist

DHF vs. DMR vs. DHR

Three documentation concepts are frequently confused. Understanding their distinct purposes is essential:

  • Design History File (DHF) — Purpose: documents the development process. Contains: design plans, inputs, outputs, reviews, V&V records, risk management, changes. Created: during design and development.
  • Device Master Record (DMR) — Purpose: defines how to manufacture the device. Contains: device specifications, production process specs, QA procedures, packaging/labeling specs. Created: at design transfer, as the output of design controls.
  • Device History Record (DHR) — Purpose: documents what was actually manufactured. Contains: production records, QC test results, acceptance activities, labeling, quantities. Created: during manufacturing of each batch/lot.

The relationship is sequential: the DHF documents how the design was developed, the DMR defines how to build it, and the DHR proves that each unit was built correctly.

Under QMSR: The DHF, DMR, and DHR concepts are retained as FDA-specific requirements even though ISO 13485 uses different terminology (design and development file, medical device file). The QMSR preserves these FDA-specific documentation requirements while incorporating ISO 13485 by reference for the underlying process requirements.

Design Controls and Risk Management (ISO 14971)

Risk management is not a separate, parallel activity. It is interwoven with design controls at every stage. Both FDA and ISO 13485 require risk management to be integrated into the design process, and the QMSR makes this expectation even more explicit.

How Risk Management Connects to Each Design Control Phase

  • Design planning — Establish the risk management plan; define risk acceptability criteria
  • Design input — Include risk control measures identified in the risk analysis as design input requirements
  • Design output — Document risk control measures implemented in the design; update the risk analysis
  • Design review — Review risk analysis status; confirm risk controls are adequate; assess residual risks
  • Design verification — Verify that risk control measures are implemented correctly (outputs meet risk-derived inputs)
  • Design validation — Validate that risk controls are effective under real-world conditions; confirm overall residual risk is acceptable
  • Design transfer — Ensure risk controls are maintained in manufacturing (process controls, inspection criteria)
  • Design changes — Assess the impact of changes on the risk analysis; update the hazard analysis; re-verify/re-validate affected risk controls

The linkage is bidirectional. Risk analysis outputs feed design inputs (hazards identified become requirements for risk controls). Design outputs feed back into risk analysis (design decisions may introduce new hazards or modify existing ones). This iterative relationship is why ISO 14971 describes risk management as a lifecycle process, not a one-time analysis.

Risk Analysis Tools

Several established tools support risk analysis within the design control process:

  • FMEA (Failure Modes and Effects Analysis) — Systematically identifies potential failure modes, their effects, and their causes, and assigns risk priority numbers. Best used for component-level and process-level risk analysis; it is the most widely used tool in medical device design.
  • FTA (Fault Tree Analysis) — Top-down analysis that starts from an undesired event and works backward to identify root causes using Boolean logic. Best used for system-level hazard analysis, particularly for complex, multi-component systems.
  • HACCP (Hazard Analysis and Critical Control Points) — Identifies critical control points in a process where hazards can be prevented, eliminated, or reduced. Best used for manufacturing process risk analysis, including sterile devices and IVD reagent production.
  • Preliminary Hazard Analysis (PHA) — Early-stage hazard identification before the detailed design is complete. Best used in the concept and feasibility phases and for establishing initial risk acceptability criteria.
  • Use-related risk analysis — Analysis of hazards arising from user interaction with the device, including use errors and foreseeable misuse. Best used for usability engineering integration (IEC 62366-1); essential for devices with user interfaces.

The choice of tool depends on the device type, complexity, and the specific design phase. Most companies use FMEA as their primary risk analysis tool and supplement it with FTA for system-level analysis and use-related risk analysis for human factors considerations. The risk analysis tools and their application should be defined in your risk management plan (per ISO 14971) and referenced in your design plan.
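
One concrete piece of an FMEA is the risk priority number (RPN), conventionally the product of the severity, occurrence, and detection ratings. A minimal sketch, assuming illustrative 1-10 scales and an invented action threshold (your risk management plan, per ISO 14971, defines the actual scales and acceptability criteria):

```python
def rpn(severity: int, occurrence: int, detection: int) -> int:
    """RPN = severity x occurrence x detection, each rated 1-10 here."""
    for name, value in (("severity", severity),
                        ("occurrence", occurrence),
                        ("detection", detection)):
        if not 1 <= value <= 10:
            raise ValueError(f"{name} must be 1-10, got {value}")
    return severity * occurrence * detection

ACTION_THRESHOLD = 100  # illustrative only; real criteria come from the plan

failure_modes = [
    # (description, severity, occurrence, detection)
    ("Unintended activation of cutting mechanism", 9, 3, 4),
    ("Battery depletion mid-procedure",            7, 2, 3),
]

for description, s, o, d in failure_modes:
    score = rpn(s, o, d)
    flag = "action required" if score >= ACTION_THRESHOLD else "acceptable"
    print(f"{description}: RPN={score} ({flag})")
```

Note that many teams now de-emphasize raw RPN thresholds in favor of severity-first criteria; whatever scheme you use, document it in the risk management plan.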

Practical Example: Risk Management Feeding Design Controls

Consider a powered surgical instrument. During the risk analysis (FMEA), the team identifies a hazard: unintended activation of the cutting mechanism during setup could injure the surgical team.

Here is how this hazard flows through design controls:

  1. Risk analysis identifies the hazard (unintended activation) and assigns a risk level based on severity and probability
  2. Risk control measure is defined: a two-step activation mechanism requiring simultaneous actuation of a safety switch and the trigger
  3. Design input is created: "The device shall require simultaneous actuation of the safety switch and trigger to activate the cutting mechanism. Activation of the trigger alone shall not energize the cutting element."
  4. Design output documents the mechanical design of the dual-actuation system, the electrical interlock circuit, and the software logic
  5. Design verification tests that the interlock functions correctly — trigger alone does not activate, safety switch alone does not activate, both together do activate
  6. Design validation tests the dual-actuation mechanism with representative surgical team members in a simulated OR environment to confirm it is intuitive, does not impede clinical workflow, and effectively prevents unintended activation
  7. Design transfer ensures the manufacturing process for the interlock mechanism is validated and incoming inspection criteria verify correct assembly

This end-to-end flow — from hazard to risk control to design input to output to V&V — is exactly what ISO 14971 and the QMSR expect. The traceability matrix documents these linkages.
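
The interlock logic at the core of this example (steps 3 and 5) is small enough to sketch directly. The function name and boolean interface below are illustrative, not real firmware; production code would also debounce the inputs and handle the fault states identified in the risk analysis.

```python
def cutting_element_energized(safety_switch: bool, trigger: bool) -> bool:
    """Design input: energize the cutting element only when BOTH the
    safety switch and the trigger are actuated simultaneously."""
    return safety_switch and trigger

# Verification cases mirroring step 5: either control alone must NOT activate.
assert cutting_element_energized(True, True) is True
assert cutting_element_energized(True, False) is False
assert cutting_element_energized(False, True) is False
assert cutting_element_energized(False, False) is False
```

The asserts map one-to-one onto the verification protocol's test cases, which is exactly the traceability the matrix records.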

For a comprehensive treatment of ISO 14971 implementation, see our ISO 14971 Risk Management Guide.

The Traceability Matrix

What Is a Design Control Traceability Matrix?

A traceability matrix is a document — typically a table or spreadsheet — that maps the relationships between user needs, design inputs, design outputs, verification activities, and validation activities. It is the single most powerful tool for demonstrating that your design controls are complete and that every requirement has been addressed.

The traceability matrix answers the auditor's fundamental question: "For every user need, can you show me the design input requirement that addresses it, the design output that implements it, the verification test that confirms the output meets the input, and the validation evidence that the finished device meets the original user need?"

Structure of a Traceability Matrix

A well-constructed traceability matrix includes these columns:

  • User Need ID — Reference to the documented user need or stakeholder requirement
  • Design Input ID — Reference to the specific, verifiable design input requirement
  • Risk Control — Reference to any associated risk control measure (from the ISO 14971 analysis)
  • Design Output ID — Reference to the specification, drawing, or document that implements the requirement
  • Verification ID — Reference to the verification protocol/report that confirms the output meets the input
  • Validation ID — Reference to the validation protocol/report that confirms the device meets the user need
  • Status — Current status (open, verified, validated, closed)

Why Traceability Matters

Traceability serves multiple purposes:

  • Completeness check — If a user need has no corresponding design input, you have a gap. If a design input has no verification test, you have a gap. The matrix makes these gaps visible.
  • Impact analysis for design changes — When a design input changes, the matrix immediately shows which outputs, verifications, and validations are affected and may need to be updated.
  • Audit readiness — FDA investigators and ISO auditors routinely request the traceability matrix. A complete, well-maintained matrix significantly reduces inspection risk.
  • Efficient testing — By mapping requirements to test activities, you can ensure you test everything that needs testing without unnecessary duplication.

Example Traceability Matrix Entry

Here is a simplified example for a blood glucose meter:

Each entry traces user need → design input → risk control → design output → verification → validation:

  • UN-001 (Clinician needs accurate glucose readings) → DI-003 (accuracy within +/- 15% for glucose > 100 mg/dL per ISO 15197) → RC-005 (calibration algorithm with lot-specific coefficients) → DO-012 (algorithm specification Rev C) and DO-013 (calibration procedure) → VP-008 (analytical accuracy study per ISO 15197, n=600) → VL-003 (clinical accuracy study with 200 subjects, production test strips). Status: Validated.
  • UN-002 (Results displayed quickly at point of care) → DI-007 (test result displayed within 5 seconds of sample application) → no associated risk control → DO-018 (electrochemical measurement timing specification) → VP-012 (timing verification, n=100 strips across 3 lots) → VL-003 (usability study confirms adequate speed for clinical workflow). Status: Validated.

This level of traceability demonstrates to an auditor that each user need is systematically addressed through the design control process — from requirement to evidence.

Tip: Build the traceability matrix from the beginning of the project, not at the end. Populating it retrospectively is painful, error-prone, and exactly the kind of after-the-fact documentation that auditors recognize and flag. The matrix should be updated continuously as requirements, outputs, and test results evolve.
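
Whether the matrix lives in a spreadsheet or a requirements management tool, the completeness check described above is mechanical and worth automating. A minimal sketch, with invented record fields and illustrative data:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class TraceRow:
    """One traceability matrix row; field names are illustrative."""
    user_need: str
    design_input: Optional[str] = None
    design_output: Optional[str] = None
    verification: Optional[str] = None
    validation: Optional[str] = None

def find_gaps(matrix: list[TraceRow]) -> list[str]:
    """Return a human-readable list of traceability gaps."""
    gaps = []
    for row in matrix:
        for field in ("design_input", "design_output",
                      "verification", "validation"):
            if getattr(row, field) is None:
                gaps.append(f"{row.user_need}: missing {field}")
    return gaps

matrix = [
    TraceRow("UN-001", "DI-003", "DO-012", "VP-008", "VL-003"),
    TraceRow("UN-002", "DI-007", "DO-018", "VP-012"),  # no validation yet
]
for gap in find_gaps(matrix):
    print(gap)  # -> UN-002: missing validation
```

The same data supports change impact analysis: when a design input changes, filter for the rows that reference it to find the outputs and tests that need re-examination.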

21 CFR 820.30 vs. ISO 13485 Clause 7.3 — Detailed Comparison

While the two frameworks are substantially aligned — and the QMSR has further harmonized them — differences remain in emphasis, terminology, and specificity.

  • Terminology — 820.30: "design controls." ISO 13485: "design and development."
  • Regulatory status — 820.30: US federal regulation (legally binding). ISO 13485: international standard (binding when referenced by regulation, e.g., the QMSR or EU MDR).
  • Risk management reference — 820.30: requires risk analysis "where appropriate" in design validation. ISO 13485: explicitly references ISO 14971 and requires risk management throughout design and development.
  • Usability — 820.30: not explicitly mentioned in the regulation text. ISO 13485: explicitly requires usability as a design input (clause 7.3.3).
  • Statistical techniques — 820.30: not specified. ISO 13485: requires documentation of statistical techniques and sample size rationale for V&V.
  • Interconnected devices — 820.30: not explicitly addressed. ISO 13485: requires verification and validation to address interfaced/connected device combinations.
  • Documentation file — 820.30: Design History File (DHF), a specific term and requirement (820.30(j)). ISO 13485: design and development file (7.3.10); the same concept, different terminology.
  • Design transfer detail — 820.30: brief requirement (820.30(h)). ISO 13485: more detailed; requires verification that outputs are suitable for manufacturing and that production capability meets requirements (7.3.8).
  • Design change scope — 820.30: changes must be validated or verified, reviewed, and approved before implementation. ISO 13485: adds a requirement to assess impact on constituent parts, in-process products, delivered products, risk management, and other realization processes.
  • Applicability — 820.30: Class II, Class III, and listed Class I devices. ISO 13485: any organization performing design and development of medical devices.

Under QMSR: Because the QMSR incorporates ISO 13485 by reference, the practical differences between the two frameworks have been significantly reduced for US manufacturers. The ISO 13485 text is now the authoritative source, supplemented by FDA-specific additions (DHF/DMR/DHR, complaint file requirements, UDI).

Design Controls vs. Design and Development — Terminology

A frequent source of confusion is the different terminology used by the FDA and ISO 13485 for what is essentially the same set of activities.

The FDA uses the term "design controls" — a term that originated in the 1996 QSR and is deeply embedded in US regulatory culture. ISO 13485 uses the term "design and development" — consistent with the broader ISO quality management system vocabulary.

Under the QMSR, this terminology gap has narrowed but not disappeared. The QMSR incorporates ISO 13485 by reference, so the authoritative requirements text uses ISO's "design and development" language. However, the FDA continues to use "design controls" in its guidance documents, inspection procedures, and industry communications.

For practical purposes, the terms are interchangeable. When an FDA investigator asks to review your "design controls," they are asking about the same activities that an ISO 13485 auditor would assess under "design and development." Your procedures can use either term — what matters is that the substance of the requirements is addressed.

Other terminology differences worth noting:

  • Design History File (DHF) ↔ design and development file: record of the design process
  • Device Master Record (DMR) ↔ medical device file (partial equivalent): manufacturing specifications
  • Device History Record (DHR) ↔ no direct equivalent term: production records for manufactured units
  • Quality System Regulation (QSR/QMSR) ↔ quality management system (QMS): overall quality framework
  • Design input ↔ design and development input: requirements driving the design
  • Design output ↔ design and development output: deliverables of the design process

Design Controls for Software (IEC 62304 Intersection)

Medical devices increasingly contain software — and many devices are software (Software as a Medical Device, or SaMD). When your device includes software, the general design control framework of 21 CFR 820.30 and ISO 13485 clause 7.3 still applies, but it must be supplemented with the software-specific lifecycle requirements of IEC 62304.

How IEC 62304 Maps to Design Controls

  • Design planning → Software development planning (clause 5.1)
  • Design input → Software requirements analysis (clause 5.2)
  • Design output → Software architectural design (clause 5.3) and software detailed design (clause 5.4)
  • Implementation → Software unit implementation (clause 5.5)
  • Design verification → Software unit verification (5.5.5), integration testing (clause 5.6), system testing (clause 5.7)
  • Design validation → Software system testing at the system level; usability validation
  • Design changes → Software change management and problem resolution (clauses 6, 9)
  • Risk management → Software items contributing to hazardous situations, mapped to ISO 14971

Key Considerations for Software Design Controls

Software safety classification — IEC 62304 classifies software into three safety classes (A, B, C) based on the severity of potential harm. The safety class determines the rigor of the development process. Class C (could result in death or serious injury) requires the most rigorous documentation, verification, and testing. Class A (no contribution to a hazardous situation) allows the least. Your design controls should scale accordingly.

SOUP management — Software of Unknown Provenance (SOUP) includes open-source libraries, third-party components, and off-the-shelf software. IEC 62304 requires that SOUP be identified, evaluated for risk, and managed throughout the software lifecycle. SOUP components become design inputs (functional and performance requirements) and require verification that they perform as expected in your specific use context.

Software validation under FDA — 21 CFR 820.30(g) explicitly states that design validation "shall include software validation and risk analysis, where appropriate." The FDA expects documented software validation for any software that is part of a medical device or is used in the quality system to support production and testing.

Agile development — IEC 62304 is methodology-neutral, and the FDA has confirmed that agile and iterative development approaches are acceptable as long as all required lifecycle activities are documented. The design control framework does not require waterfall-style sequential development. What it requires is that planning, inputs, outputs, reviews, verification, validation, and change control are all performed and documented — regardless of whether they happen in sprints, phases, or iterations.

Software Safety Classification and Design Control Rigor

IEC 62304 defines three software safety classes that directly affect the depth of design control documentation required:

  • Class A — Software cannot contribute to a hazardous situation. Design control impact: minimal documentation; software requirements and system-level testing are sufficient, with no mandatory detailed design or unit-level testing.
  • Class B — Software can contribute to a hazardous situation that does not result in serious injury. Design control impact: moderate documentation; software architecture, integration testing, and verification testing are required.
  • Class C — Software can contribute to a hazardous situation that results in death or serious injury. Design control impact: full documentation; detailed design, unit-level implementation verification, integration testing, system testing, and comprehensive traceability are required.

The software safety classification is determined by the risk analysis (ISO 14971) and documented in the software development plan. A common mistake is assuming that the software safety class is determined solely by the device's overall classification (Class I, II, or III). In reality, software safety classification is a separate analysis based on the contribution of the software to specific hazardous situations.
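
The classification decision above can be condensed into a short sketch. This deliberately simplifies the standard's decision process (which, among other things, allows risk controls external to the software to reduce the class); the function and its inputs are illustrative only.

```python
def software_safety_class(contributes_to_hazardous_situation: bool,
                          serious_injury_or_death_possible: bool) -> str:
    """Simplified IEC 62304 safety classification (illustrative sketch)."""
    if not contributes_to_hazardous_situation:
        return "A"  # no contribution to a hazardous situation
    if not serious_injury_or_death_possible:
        return "B"  # hazard possible, but not serious injury
    return "C"      # death or serious injury possible

print(software_safety_class(False, False))  # -> A
print(software_safety_class(True, False))   # -> B
print(software_safety_class(True, True))    # -> C
```

The inputs come from the ISO 14971 risk analysis, which is why the classification cannot be read off the device's overall Class I/II/III designation.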

Regulatory Recognition of IEC 62304

The FDA recognizes IEC 62304 as a consensus standard. Declaring conformity with IEC 62304 in a premarket submission (510(k) or PMA) can significantly reduce the documentation burden by allowing you to reference the standard rather than re-explaining your software development process in detail. The FDA's 2023 final guidance on premarket submissions for software-inclusive medical devices explicitly references the IEC 62304 framework.

In the EU, IEC 62304 is a harmonized standard under both the MDR and IVDR, meaning conformity with IEC 62304 provides a presumption of conformity with the relevant essential requirements for software.

For a comprehensive guide to IEC 62304 implementation, see our IEC 62304 Software Lifecycle Guide.

Common FDA Design Control Deficiencies

Design control citations have consistently ranked among the top three most-cited categories on FDA Form 483 observations, alongside CAPA and complaint handling. The trend data underscores the point: design control deficiencies accounted for approximately 11% of device Form 483 observations in FY2020, rising to more than 33% in FY2021. Design control deficiencies appeared in approximately 40% of device warning letters in FY2022, ranking among the FDA's three most common QS citations. Understanding the most common deficiencies helps you avoid them.

Top Design Control 483 Findings

1. Design validation failures (most frequently cited)

Out of 178 design control citations in FY2020, 48 concerned design validation — making it the single most cited design control element. Common issues include:

  • No established procedures for design validation
  • Validation not performed on production-equivalent devices (using prototypes instead)
  • Validation conditions not representative of actual use
  • Missing software validation
  • No documented risk analysis as part of validation
  • Validation results not documented in the DHF

2. Design change control failures

Design changes accounted for 26 of the 178 design control citations in FY2020. Common issues include:

  • Design changes implemented before completing review and approval
  • Changes not evaluated for impact on completed verification/validation
  • No documented assessment of risk impact from changes
  • Changes to software not managed through formal change control

3. Design input deficiencies

  • Design inputs not documented or incomplete
  • No mechanism for resolving ambiguous or conflicting requirements
  • Risk control measures not captured as design inputs
  • Inputs not reviewed and approved by designated individuals

4. Design review deficiencies

  • Reviews not conducted at appropriate stages
  • Missing independent reviewer (no one outside the direct design team)
  • Review records lacking required elements (date, participants, decisions)
  • Issues identified but not tracked to resolution

5. Design verification deficiencies

  • Verification not performed for all design input requirements
  • Acceptance criteria not defined before testing
  • Results not documented with required elements (design ID, methods, dates, personnel)
  • Statistical rationale for sample sizes not documented

6. DHF deficiencies

  • No DHF maintained for a device type
  • DHF incomplete — missing key records
  • DHF records do not demonstrate compliance with the approved design plan

Real-World Consequences of Design Control Failures

Design control failures do not just generate paperwork problems. They lead to real harm:

  • Recalls: The June 2022 recall of 23,372 HeartWare Ventricular Assist Devices (HVAD) due to battery issues posed significant risks of injury or death. Investigations traced the root cause to inadequate design verification and validation of the battery subsystem — a design control failure.

  • Warning letters: FDA warning letters citing design control violations typically require companies to cease marketing the affected device, conduct retrospective reviews of all design control records, and implement comprehensive corrective actions — often requiring months of work and significant expense.

  • Delayed market access: When FDA reviewers identify design control gaps during premarket review, the result is additional information requests, delays in clearance or approval, and sometimes refusal to file or rejection of the submission.

Warning letter escalation: FDA Form 483 observations become warning letters when companies provide inadequate responses or fail to implement corrective actions. Design control deficiencies in warning letters have increased significantly, with 47 warning letters issued to medical device companies in FY2024 — a 96% increase from 24 in FY2023.

How to Avoid Design Control Citations

The following practices directly address the most common cited deficiencies:

  1. Use production-equivalent units for validation. This is the single most important practice. Build validation units using your intended production process, tooling, and materials.

  2. Validate under realistic conditions. Test in environments and with user populations that represent actual use. If your device will be used by nurses in an ICU, your validation study should involve nurses in an ICU (or a faithful simulation of one).

  3. Capture risk controls as design inputs. Every risk control measure identified in your risk analysis should appear as a formal design input requirement with a corresponding verification test.

  4. Include independent reviewers in design reviews. At least one reviewer at each design review must not have direct responsibility for the design stage under review. Document their participation.

  5. Control design changes rigorously. No change should be implemented before review and approval. Assess every change for impact on completed V&V, risk analysis, and previously delivered product.

  6. Maintain the DHF in real time. Populate design records as activities are completed, not at the end of the project.

  7. Document statistical rationale. For every verification and validation test that uses sampling, document the sample size and the statistical basis for that sample size.
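For attribute (pass/fail) tests, a common way to document that rationale is the success-run theorem: testing n units with zero failures demonstrates reliability R at confidence C when n >= ln(1 - C) / ln(R). A minimal sketch (the function name is ours, not from any standard):

```python
import math

def success_run_sample_size(reliability: float, confidence: float) -> int:
    """Minimum zero-failure attribute sample size per the success-run
    theorem: n >= ln(1 - C) / ln(R). Testing n units with no failures
    demonstrates reliability R at confidence C."""
    if not (0.0 < reliability < 1.0 and 0.0 < confidence < 1.0):
        raise ValueError("reliability and confidence must be in (0, 1)")
    return math.ceil(math.log(1.0 - confidence) / math.log(reliability))

print(success_run_sample_size(0.95, 0.95))  # 59 units, zero failures allowed
```

Recording the inputs (R, C) alongside the resulting n in the protocol is what satisfies the "documented statistical rationale" expectation.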

Practical Workflow for Implementing Design Controls

Whether you are a startup establishing design controls for the first time or an established manufacturer overhauling an existing system, the following workflow provides a pragmatic, step-by-step implementation path. Adapt the level of detail to your organization's size, device complexity, and regulatory markets.

Step 1: Establish Your Design Control Procedure

Write a Design Control Standard Operating Procedure (SOP) that defines:

  • When design controls are required (device classification triggers)
  • The design phases your organization uses (concept, feasibility, design and development, verification, validation, transfer, production)
  • The inputs, outputs, reviews, and approvals required at each phase
  • Roles and responsibilities (project manager, design lead, quality, regulatory, clinical)
  • How the DHF is created, maintained, and controlled
  • How design changes are managed
  • How risk management integrates with design controls
  • Reference to supporting procedures (risk management SOP, document control SOP, V&V SOPs)

Step 2: Define User Needs and Create the Design Plan

For each new device development project:

  1. Conduct user research — clinical observations, interviews, literature review, competitive analysis
  2. Document user needs in a User Needs Document (UND) or equivalent
  3. Create the Design and Development Plan (DDP) describing the project scope, phases, responsibilities, V&V strategy, and milestones
  4. Initiate the risk management plan (per ISO 14971)
  5. Set up the DHF structure
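Setting up the DHF structure (step 5) can be as simple as scaffolding a controlled folder tree at kickoff so records are filed in real time rather than assembled retroactively. A sketch, assuming a hypothetical ten-section layout (the section names and numbering are ours; adapt them to your own SOP):

```python
from pathlib import Path

# Illustrative section list -- an assumption, not a regulatory requirement.
DHF_SECTIONS = [
    "01_design_plan", "02_user_needs", "03_design_inputs",
    "04_design_outputs", "05_design_reviews", "06_verification",
    "07_validation", "08_risk_management", "09_design_changes",
    "10_design_transfer",
]

def scaffold_dhf(root: str) -> list[str]:
    """Create an empty DHF folder skeleton at project kickoff."""
    base = Path(root)
    for section in DHF_SECTIONS:
        (base / section).mkdir(parents=True, exist_ok=True)
    return sorted(p.name for p in base.iterdir() if p.is_dir())
```

In practice this structure lives inside your document control system, not a bare file system, but the principle is the same: the slots exist before the records do.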

Step 3: Develop and Document Design Inputs

  1. Translate user needs into specific, verifiable design input requirements
  2. Include applicable standards, regulatory requirements, and risk control measures
  3. Review and approve design inputs with cross-functional input
  4. Begin the traceability matrix — link each input back to one or more user needs
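The traceability matrix started in step 4 is, at its core, a set of links that can be checked mechanically for gaps. An illustrative helper for the user-need-to-design-input layer (the data shapes and names are ours; real matrices add output, verification, and validation layers):

```python
def trace_gaps(user_needs, input_to_needs):
    """Return (uncovered_needs, orphan_inputs): user needs that no design
    input traces to, and design inputs tracing to no documented need."""
    covered = set()
    for needs in input_to_needs.values():
        covered |= set(needs)
    uncovered = set(user_needs) - covered
    orphans = {i for i, needs in input_to_needs.items()
               if not set(needs) & set(user_needs)}
    return uncovered, orphans
```

Running a check like this before each design review turns "is the matrix complete?" from a judgment call into a yes/no question.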

Step 4: Generate Design Outputs and Conduct Design Reviews

  1. Execute the design — create specifications, drawings, prototypes, software
  2. Document design outputs with traceability to inputs
  3. Conduct formal design reviews at planned milestones (PDR, CDR)
  4. Update risk analysis as design decisions are made
  5. Update the traceability matrix — link outputs to inputs

Step 5: Execute Design Verification

  1. Develop verification test protocols based on design input requirements
  2. Define acceptance criteria and statistical rationale
  3. Execute tests and document results
  4. Resolve any failures through the design change process
  5. Update the traceability matrix — link verification results to inputs and outputs

Step 6: Execute Design Validation

  1. Build production-equivalent units using the intended manufacturing process
  2. Develop validation test protocols based on user needs and intended use
  3. Execute validation activities — usability testing, simulated use, clinical evaluation, biocompatibility, EMC, etc.
  4. Include software validation and risk analysis
  5. Document results and confirm all user needs are met
  6. Update the traceability matrix — link validation results to user needs

Step 7: Design Transfer

  1. Finalize the Device Master Record (DMR)
  2. Complete process validation for manufacturing processes
  3. Train production personnel
  4. Confirm production capability through pilot production or process performance qualification
  5. Implement design freeze — all further changes go through formal change control

Step 8: Ongoing Design Change Management

  1. For any post-transfer changes, follow the design change procedure
  2. Assess the impact on risk analysis, completed V&V, and manufactured product
  3. Re-verify and/or re-validate as needed
  4. Update the DHF, DMR, and traceability matrix
  5. Review and approve before implementation
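The impact assessment in steps 2-5 lends itself to a simple decision helper. This sketch (the flag names and action wording are ours, not regulatory text) returns the actions a change record must show before approval:

```python
def change_disposition(affects_risk_profile: bool,
                       affects_verified_requirements: bool,
                       affects_validated_user_needs: bool,
                       product_in_field: bool) -> list[str]:
    """Illustrative impact-assessment helper mirroring the steps above."""
    actions = ["document the change and its rationale"]
    if affects_risk_profile:
        actions.append("update the risk analysis (ISO 14971)")
    if affects_verified_requirements:
        actions.append("re-verify the affected design inputs")
    if affects_validated_user_needs:
        actions.append("re-validate on production-equivalent units")
    if product_in_field:
        actions.append("evaluate delivered product for field action")
    actions += ["update the DHF, DMR, and traceability matrix",
                "obtain review and approval before implementation"]
    return actions
```

Note that the first and last two actions are unconditional: every change gets documented, traced, and approved regardless of assessed impact.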

Common Implementation Mistakes to Avoid

Even experienced companies make implementation errors. The most frequent are:

  • Over-documentation: Creating voluminous procedures and templates that no one reads or follows. Design controls should be right-sized to your device risk and organizational maturity. A Class II wearable sensor does not need the same level of procedural detail as a Class III implantable cardiac device.
  • Under-documentation: The opposite extreme — skipping records because "everyone knows what happened." Auditors cannot accept oral history. If it is not documented, it did not happen.
  • Sequential-only thinking: Treating design controls as a rigid sequence where each phase must be 100% complete before the next begins. In practice, there is significant overlap. Design inputs may be refined while early design outputs are being generated. The design plan should account for this iteration.
  • Disconnected risk management: Performing risk analysis as a standalone exercise disconnected from design inputs and V&V. Risk management outputs must feed design inputs, and V&V must confirm risk controls are effective.
  • Late DHF assembly: Compiling the DHF at the end of the project by gathering records from scattered locations. This is both unreliable and obvious to auditors.
  • Ignoring post-market data: Design controls do not end at product launch. Complaint data, post-market surveillance, and field experience should feed back into the design control process through design changes when warranted. Under the EU MDR, post-market surveillance is explicitly linked to the clinical evaluation and must inform design updates. Under the QMSR, risk management is a lifecycle activity — post-market data that reveals new hazards or changes risk estimates must trigger a reassessment of the design.
  • Not using templates: Attempting to create every design control document from scratch wastes time and introduces inconsistency. Standardized templates for design inputs, verification protocols, validation protocols, design review records, and the traceability matrix ensure consistency, reduce errors, and accelerate the documentation process. Invest in developing or acquiring a template set early, and refine it based on experience.

Tools and Documentation Best Practices

Documentation Principles

  • Document as you go. Retroactive documentation is visible to auditors, error-prone, and defeats the purpose of design controls. The DHF should be populated in real time as design activities occur.
  • Use controlled documents. All design control documents must be under document control — version controlled, reviewed, approved, and accessible to authorized personnel. This is a QMS fundamental, not a design control nicety.
  • Maintain traceability. The traceability matrix is not optional in practice. Without it, demonstrating completeness and the relationship between requirements, outputs, and test evidence becomes extremely difficult during audits.
  • Keep records audit-ready. FDA investigators can request to see your DHF during an inspection. The records should be organized, complete, and accessible without requiring a multi-day archaeological dig. If you need a week to compile your DHF, your process needs improvement.

Tools

Design controls can be managed with a range of tools, from simple to sophisticated:

| Approach | Pros | Cons |
| --- | --- | --- |
| Paper-based/file system | Low cost, simple for small projects | No version control, difficult to maintain traceability, poor scalability |
| Spreadsheets + document control | Flexible, low cost, familiar tools | Manual traceability, version control challenges, no automated audit trail |
| Dedicated eQMS with design control module | Automated traceability, built-in workflows, audit trails, 21 CFR Part 11 compliance | Higher cost, implementation effort, potential over-engineering for small companies |
| Requirements management tools (e.g., Jama, Polarion, DOORS) | Strong traceability, requirements management, change impact analysis | Specialized tools, learning curve, may not integrate with full QMS |

The right tool depends on your company's size, device complexity, regulatory markets, and development team structure. A two-person startup developing a single Class II device has different needs than a multinational corporation with dozens of device families. What matters is not the tool — it is whether your process, supported by your tools, produces the documented evidence that regulations require.

Electronic Signatures and 21 CFR Part 11

Design control records require signatures for review and approval at multiple stages — design inputs, design outputs, design reviews, verification reports, validation reports, and design changes. Electronic signatures can satisfy these requirements, provided they comply with 21 CFR Part 11, which governs the use of electronic records and electronic signatures. Part 11 requires that electronic signatures be linked to their respective records, include the printed name, date and time, and meaning (e.g., approval, review), and that systems include controls to prevent falsification. If you are using an eQMS or electronic document management system for design control records, ensure it is Part 11 compliant — particularly regarding audit trails, access controls, and signature authentication.
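The core Part 11 elements (linkage to the record, printed name, timestamp, meaning) can be illustrated with a hash-linked signature manifest. This is a sketch of the concepts only, not a compliant implementation; real systems also need authentication, access controls, and tamper-evident audit trails:

```python
import hashlib
from datetime import datetime, timezone

def sign_record(record_bytes: bytes, printed_name: str, meaning: str) -> dict:
    """Link a signature to its record via a content hash, carrying the
    printed name, a UTC timestamp, and the signature's meaning."""
    return {
        "record_sha256": hashlib.sha256(record_bytes).hexdigest(),
        "printed_name": printed_name,
        "signed_at_utc": datetime.now(timezone.utc).isoformat(),
        "meaning": meaning,  # e.g. "approval", "review"
    }

def signature_still_valid(record_bytes: bytes, manifest: dict) -> bool:
    """Recompute the hash to detect post-signature edits to the record."""
    return hashlib.sha256(record_bytes).hexdigest() == manifest["record_sha256"]
```

The hash linkage is what makes the signature meaningful: if the record changes after signing, the mismatch is detectable, which is the behavior Part 11's falsification controls are meant to guarantee.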

Computer Software Assurance (CSA) and Design Controls

The FDA's shift from traditional Computer System Validation (CSV) to Computer Software Assurance (CSA) is relevant to design controls in two ways. First, if your device includes software, the risk-based approach of CSA encourages focusing testing effort on high-risk software functions rather than applying uniform testing rigor across all features. This aligns with the IEC 62304 safety classification approach. Second, if you use software tools in your design control process (eQMS, requirements management tools, test management systems), CSA provides a more efficient framework for assuring those tools are fit for purpose, allowing you to leverage vendor testing for lower-risk features and focus your validation effort on the specific ways your organization uses the tool. CSA does not eliminate the need for software assurance — it makes it more risk-proportionate.

Phase Gate Reviews

Many companies implement design controls using a phase gate (or stage gate) model:

| Gate | Milestone | Key Deliverables Reviewed |
| --- | --- | --- |
| Gate 0 | Concept approval | User needs document, feasibility assessment, regulatory strategy |
| Gate 1 | Design input approval | Design input requirements, risk management plan, V&V strategy, design plan |
| Gate 2 | Design output review (PDR/CDR) | Design specifications, architecture, preliminary risk analysis, verification plan |
| Gate 3 | Verification complete | Verification test reports, updated risk analysis, design review records |
| Gate 4 | Validation complete | Validation test reports, usability study, clinical data, final risk management report |
| Gate 5 | Design transfer | DMR complete, process validation, production readiness, DHF review |

Each gate requires formal review and approval before the project can proceed. This structure provides natural checkpoints for design reviews and ensures that design control activities are completed in a systematic order.
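Enforcing that rule mechanically is straightforward: a gate passes only when every planned deliverable is complete. A sketch using an abbreviated subset of the deliverables from the table above:

```python
# Deliverables per gate, abbreviated from the table above.
GATE_DELIVERABLES = {
    "Gate 0": ["user needs document", "feasibility assessment",
               "regulatory strategy"],
    "Gate 1": ["design input requirements", "risk management plan",
               "V&V strategy", "design plan"],
    "Gate 3": ["verification test reports", "updated risk analysis",
               "design review records"],
}

def gate_ready(gate: str, completed: set) -> tuple[bool, list]:
    """A gate passes only when every planned deliverable is complete;
    otherwise return the missing items for the review record."""
    missing = [d for d in GATE_DELIVERABLES[gate] if d not in completed]
    return (not missing, missing)
```

The "missing items" output doubles as the action list for the gate review minutes, which keeps the review record tied to the plan.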

Design Controls Under QMSR — Summary of Changes

With the QMSR now effective (February 2, 2026), here is a consolidated summary of what changed for design controls:

| Element | Under Old QSR (21 CFR 820.30) | Under QMSR |
| --- | --- | --- |
| Authoritative text | 21 CFR 820.30 subsections (a)-(j) | ISO 13485:2016 clause 7.3, incorporated by reference |
| Design planning | Plans describing activities and responsibilities | Same, with explicit requirement to update plans as project evolves |
| Design inputs | Address intended use, user needs, patient needs | Adds explicit requirements for usability, risk management outputs, and applicable standards |
| Design outputs | Meet input requirements, include acceptance criteria | Adds requirements for purchasing, production, and service information |
| Design review | Cross-functional, documented | Same requirements; enhanced emphasis on risk reassessment at reviews |
| Verification | Confirm outputs meet inputs | Adds statistical technique documentation and interconnected device requirements |
| Validation | Production units, actual/simulated use | Adds explicit completion-before-transfer requirement and interconnected device requirements |
| Transfer | Correctly translated into production specs | More detailed — verify suitability for manufacturing, confirm production capability |
| Changes | Document, validate/verify, review, approve | Adds impact assessment on in-process/delivered products and risk management |
| Design file | DHF per device type | DHF retained as FDA-specific term; aligns with ISO 13485 design and development file concept |
| Risk management | Required "where appropriate" in validation | Integrated throughout all design phases via ISO 14971 reference |

Bottom line: The QMSR does not invent new design control requirements. It aligns the US framework with the international standard (ISO 13485) that most manufacturers already follow. If your design control process was compliant with both the old QSR and ISO 13485, your transition work is primarily procedural — updating references, training, and confirming risk management integration.

Key Takeaways

  1. Design controls are a system, not a checklist. They are an interconnected set of practices spanning the entire design process — from user needs through production. Treating them as a box-checking exercise is how companies end up with 483 observations.

  2. User needs are the starting point and the endpoint. Design inputs are derived from user needs. Design validation confirms the device meets user needs. If you lose sight of the user, your design controls will be technically compliant but practically useless.

  3. Risk management is woven into every phase. Under both ISO 13485 and the QMSR, risk management is not a parallel activity — it feeds design inputs, informs design reviews, and is confirmed through verification and validation. The days of a standalone risk file disconnected from design controls are over.

  4. Traceability is non-negotiable. The traceability matrix connecting user needs to design inputs to outputs to V&V evidence is the single most effective tool for demonstrating completeness and audit readiness.

  5. Design validation must use production-equivalent units under real-world conditions. This is the most frequently cited design control deficiency. Lab prototypes tested on a bench are verification, not validation.

  6. Design changes require full rigor. Every change must be assessed for risk impact, re-verified or re-validated as appropriate, and approved before implementation. Post-hoc documentation of changes is a common audit finding.

  7. The DHF is built throughout development, not assembled at the end. A DHF created retroactively is a red flag to any auditor. Populate it in real time.

  8. The QMSR harmonizes, not reinvents. The transition from QSR to QMSR aligns US requirements with ISO 13485. If you were already compliant with both, the changes are manageable. Focus on updating references, strengthening risk management integration, and training your team.

  9. Software requires additional rigor. When your device includes software, IEC 62304 supplements the general design control framework with software-specific lifecycle requirements. Safety classification drives the level of documentation and testing required.

  10. Design controls save time when done right. The upfront investment in clear requirements, structured reviews, and systematic V&V prevents the late-stage rework, failed submissions, and recalls that actually destroy timelines and budgets.

Design Controls for Startups and Small Companies

Medical device startups face a unique challenge: they need to implement design controls from the earliest stages of product development, but they often lack the resources, experience, and infrastructure of established manufacturers. Here is practical guidance for startups:

Start design controls early — not when you "need to." A common mistake is deferring design controls until a regulatory submission is imminent. By that point, recreating design control records retroactively is expensive, unreliable, and obvious to auditors. The best approach is to "bootstrap" your QMS by implementing the minimum necessary design control procedures from the start and building on them as your organization matures.

Right-size your procedures. A startup developing a single Class II device does not need the same procedural infrastructure as a multinational corporation. Your design control SOP can be a concise document that covers the essentials. What matters is that the required activities are planned, performed, and documented — not that you have 50-page procedures.

Use templates. Standardized templates for design inputs, design outputs, risk analysis, verification protocols, validation protocols, design review records, and the traceability matrix dramatically reduce the effort of creating design control documentation. Many industry resources and consulting firms provide template sets specifically designed for medical device startups. Templates ensure consistency, reduce the chance of missing required elements, and make it easier for new team members to contribute.

Involve QA/RA expertise early. Even if you do not have a full-time quality or regulatory team member, engage QA/RA expertise — whether through a consultant, advisor, or part-time hire — from the beginning of the project. QA/RA professionals can help structure your design control process correctly from the start, avoiding the costly rework of fixing procedural gaps later.

Build the DHF as you go. Set up your DHF structure at project kickoff and populate it continuously. A well-maintained DHF from the start is one of the strongest signals to investors, partners, and regulators that your company operates with quality discipline.

Prioritize the four essentials. At minimum, a startup in active product development should have these four QMS components in place: (1) document control, (2) design controls, (3) risk management, and (4) management review. Everything else can be built out as you approach commercialization.

Cost perspective: Initial design control implementation typically costs $50,000-$200,000 depending on device complexity, team size, and regulatory markets. However, the ROI from avoiding recalls, reducing development rework, and achieving faster regulatory clearance typically provides a 3-10x return on investment.

Design Controls for Different Device Types

While the fundamental design control requirements are the same regardless of device type, the practical implementation varies significantly based on device complexity, risk class, and technology:

Simple mechanical devices (e.g., surgical instruments, manual orthopedic tools): Design controls may be relatively straightforward, with design inputs focused on materials, dimensions, and mechanical performance. Verification is often inspection and testing-based. Validation may involve simulated-use studies with surgeons. The DHF may be compact.

Complex electromechanical devices (e.g., infusion pumps, patient monitors, imaging systems): Design controls must address hardware, software, electrical safety, EMC, human factors, and potentially connectivity/cybersecurity. Multiple engineering disciplines contribute to design inputs and outputs. Verification requires extensive testing across domains. Validation is multi-faceted.

Software as a Medical Device (SaMD) (e.g., diagnostic algorithms, clinical decision support): The entire design control process applies to software. IEC 62304 provides the lifecycle framework. There is no "hardware" to defer risk to — the software is the device. Design inputs include algorithm performance requirements, data handling specifications, and cybersecurity requirements. Validation requires clinical performance studies and real-world data.

Combination products (e.g., drug-eluting stents, prefilled injector pens): Design controls must address both the device and the drug/biologic components, including their interaction. This typically requires coordination between device-focused and drug-focused quality systems and regulatory strategies.

IVD devices (e.g., laboratory analyzers, point-of-care tests, reagent kits): Design controls must address analytical performance (accuracy, precision, specificity, sensitivity, linearity) and clinical performance. Under the EU IVDR, performance evaluation requirements add another layer to design validation. Reagent lot-to-lot consistency becomes a design transfer concern.

Frequently Asked Questions

What are design controls in medical device development?

Design controls are a regulatory-required system of practices and procedures incorporated into the design and development process of a medical device. They ensure that the device is safe, effective, meets user needs, and complies with applicable regulations and standards. In the US, design controls are mandated by 21 CFR 820.30 (now under the QMSR). Internationally, ISO 13485:2016 clause 7.3 defines equivalent requirements under the heading "Design and Development."

Are design controls required for Class I medical devices?

Most Class I devices are exempt from design controls. However, certain Class I devices are specifically listed in 21 CFR 820.30(a)(2) as requiring design controls. These include any Class I device automated with computer software, plus a short list of specific device types (for example, tracheobronchial suction catheters and surgeon's gloves). If your Class I device includes software, assume design controls apply unless you have confirmed the exemption with the FDA's device classification database.

What is the difference between design verification and design validation?

Design verification confirms that design outputs meet design input requirements — answering "did we build the device correctly?" Design validation confirms that the finished device meets user needs and intended uses under actual or simulated conditions — answering "did we build the right device?" Verification can be performed throughout development on components and subsystems. Validation must be performed on the final, production-representative device. For a comprehensive comparison, see Design Verification vs. Design Validation.

What is a Design History File (DHF)?

A DHF is a compilation of records that documents the complete design and development history of a medical device. It contains or references all records necessary to demonstrate that the design was developed in accordance with the approved design plan and regulatory requirements. The DHF includes design plans, inputs, outputs, risk management records, design review minutes, verification and validation reports, design change records, and design transfer documentation.

How does the QMSR affect design controls?

The QMSR, effective February 2, 2026, incorporates ISO 13485:2016 by reference, making it the authoritative text for design and development requirements in the US. The core design control elements remain the same, but the QMSR enhances expectations around risk management integration, usability inputs, statistical rigor in V&V, and documentation of design transfer activities. The FDA-specific DHF, DMR, and DHR requirements are retained. See QSR to QMSR Transition for full details.

What is a design control traceability matrix?

A design control traceability matrix is a document that maps the relationships between user needs, design inputs, design outputs, risk controls, verification activities, and validation activities. It demonstrates completeness — every user need is addressed by a design input, implemented in a design output, verified against the input specification, and validated against the user need. It is an essential tool for audit readiness and change impact analysis.

When should I start building the DHF?

Immediately. The DHF should be established at the beginning of the design project and populated continuously as design activities occur. Building a DHF retrospectively — after design is complete — is error-prone, time-consuming, and a visible red flag during regulatory inspections. Your design plan should define the DHF structure and identify which records will be captured at each phase.

How do design controls apply to software medical devices?

All general design control requirements (planning, inputs, outputs, reviews, V&V, transfer, changes, DHF) apply to software medical devices. Additionally, IEC 62304 provides software-specific lifecycle requirements that supplement the general framework. Software safety classification under IEC 62304 determines the rigor of documentation and testing. FDA explicitly requires software validation as part of design validation (21 CFR 820.30(g)). For SaMD (Software as a Medical Device), the entire design control process applies to the software itself — there is no hardware to defer complexity to. See our IEC 62304 Guide for implementation details.

What is a design freeze and when does it happen?

A design freeze is an industry practice (not a regulatory term) that marks the point at which the design is considered complete and ready for transfer to production. After the design freeze, any further changes must go through formal design change control. The freeze typically occurs after design validation is complete and before or during design transfer. It does not mean the design can never change — it means changes are now controlled, assessed for risk impact, and require formal approval.

What are the most common design control audit findings?

The most frequently cited design control deficiencies in FDA inspections include: (1) design validation failures — not using production-equivalent units, not testing under actual use conditions, missing software validation; (2) design change control failures — implementing changes before approval, not assessing risk impact; (3) design input deficiencies — incomplete documentation, missing risk control measures; (4) design review failures — no independent reviewer, inadequate documentation; and (5) DHF deficiencies — missing records, incomplete documentation. Design controls consistently rank among the top three most-cited categories on Form 483 observations.

When is re-validation required?

Re-validation is required whenever a design change could affect the safety or effectiveness of the device (per 21 CFR 820.30(i) and ISO 13485 clause 7.3.9). This includes changes to materials, components, software, manufacturing processes, or specifications that affect the device's performance or risk profile. The design change assessment must determine whether the change affects previously completed validation activities and, if so, what re-validation is required. Re-validation may also be triggered by post-market surveillance data indicating the device is not performing as validated, or by changes in standards or regulatory requirements that alter the validation acceptance criteria.

Can electronic signatures be used for design control records?

Yes. Electronic signatures can satisfy design control documentation requirements, provided they comply with 21 CFR Part 11 (Electronic Records; Electronic Signatures). Part 11 requires that electronic signatures be attributable to a specific individual, include the date and time, indicate the meaning of the signature (e.g., approval, review, authorship), and be linked to the associated record. The system must include controls to prevent unauthorized access, maintain audit trails, and ensure the integrity of records. Most modern eQMS platforms provide Part 11-compliant electronic signature functionality.

How much does implementing design controls cost?

Implementation costs vary widely based on device complexity, team size, and regulatory markets. Initial implementation typically ranges from $50,000 to $200,000, covering procedure development, template creation, tool setup, training, and initial project application. Ongoing costs include tool subscriptions (if using eQMS software), training for new team members, and periodic procedure updates. However, the investment pays for itself: the ROI from avoided recalls, reduced rework, and faster regulatory clearance typically provides a 3-10x return. The cost of not implementing design controls — in recalls, warning letters, delayed approvals, and patient harm — is orders of magnitude higher.

Do design controls apply to SaaS and cloud-based medical devices?

Yes. If the software is a medical device (SaMD) or is part of a medical device, all design control requirements apply regardless of the deployment model. 21 CFR 820.30(a)(2)(i) explicitly states that Class I devices "automated with computer software" are subject to design controls. For SaaS-based medical devices, additional considerations include cybersecurity requirements, data integrity, update management (each update is potentially a design change), and validation of the cloud infrastructure. The continuous deployment model common in SaaS development requires careful integration of design change control with the release management process.

What is the relationship between MDSAP and design controls?

The Medical Device Single Audit Program (MDSAP) is a multi-national audit program that allows a single audit to satisfy the regulatory requirements of participating countries (US, Canada, Australia, Japan, Brazil). MDSAP audits assess design controls as part of the ISO 13485 evaluation. For companies marketing devices in multiple countries, MDSAP provides an efficient path to demonstrating design control compliance across jurisdictions through a single audit process, reducing audit burden and the risk of conflicting findings from different national auditors.