CSV to CSA Transition: Complete Guide to FDA's 2025 Computer Software Assurance Final Guidance
How to transition from Computer System Validation (CSV) to Computer Software Assurance (CSA) under FDA's September 2025 final guidance — risk-based approach, testing strategies, documentation requirements, and step-by-step implementation roadmap.
The Validation Paradigm Has Shifted
On September 24, 2025, the FDA's Center for Devices and Radiological Health (CDRH) and Center for Biologics Evaluation and Research (CBER) released the final guidance titled "Computer Software Assurance for Production and Quality System Software." This document formally replaces Section 6 ("Validation of Automated Process Equipment and Quality System Software") of the legacy General Principles of Software Validation (GPSV) guidance from 2002. An updated version followed on February 3, 2026, to improve alignment with the Quality Management System Regulation (QMSR) that took effect on February 2, 2026.
The message is unambiguous: the era of one-size-fits-all, documentation-heavy Computer System Validation (CSV) is over. FDA now expects a risk-based, critical-thinking-driven approach called Computer Software Assurance (CSA). This is not a suggestion — it is the Agency's endorsed framework for validating production and quality system software in medical device manufacturing.
This guide covers everything you need to understand the transition, implement CSA, and prepare for FDA inspections under the new expectations.
What Is CSA and Why FDA Made the Change
The Problem with Traditional CSV
Traditional CSV dates back to the late 1990s and early 2000s. It was built around exhaustive documentation: every software function tested with scripted test cases, every screen captured as evidence, every test result documented in painstaking detail — regardless of whether the function posed any risk to patient safety or product quality.
This approach had several consequences:
- Validation backlogs: Companies spent months or years validating a single system, creating a bottleneck for adopting new technology
- Resource misallocation: Teams spent equal effort validating low-risk functions (like a login screen) and high-risk functions (like a sterilization cycle control parameter)
- Documentation theater: Thousands of pages of screenshots and test scripts that inspectors did not want to review and companies did not want to produce
- Innovation stagnation: The burden of CSV discouraged companies from upgrading systems, adopting cloud solutions, or implementing automation
The Case for Quality Initiative
FDA's CDRH launched the Case for Quality (CfQ) program in 2011, working collaboratively with industry stakeholders and the International Society for Pharmaceutical Engineering (ISPE) GAMP community. The goal was to identify what actually drives device quality versus what merely satisfies a checklist. After years of analysis, one conclusion stood out: traditional CSV was consuming enormous resources without proportionally improving patient outcomes.
CSA emerged from this initiative as a framework that redirects validation effort where it matters most — toward software functions that directly affect product quality and patient safety.
What CSA Changes
CSA reframes the question from "Did you test everything?" to "Do you have confidence that the software performs as intended, where it matters most?" The key shifts are:
| Dimension | Traditional CSV | CSA (Risk-Based) |
|---|---|---|
| Approach | Uniform validation across all systems | Risk-based, targeted assurance |
| Testing strategy | Fully scripted test cases for every function | Scripted testing for high-risk; unscripted/exploratory for low-risk |
| Documentation | Extensive and uniform regardless of risk | Lean, fit-for-purpose, commensurate with risk |
| Vendor evidence | Rarely leveraged; vendor testing duplicated in-house | Accepted where supplier is qualified |
| Focus | Compliance-driven (prove it works) | Quality-driven (confidence it works) |
| Agility | Incompatible with modern development | Supports Agile, DevOps, CI/CD |
| Regulatory alignment | 21 CFR Part 11, legacy GPSV | FDA CSA Guidance (2025), 21 CFR Part 820/QMSR, ISO 13485 |
Scope: What CSA Covers and What It Does Not
In Scope
The CSA guidance applies to computers and automated data processing systems used as part of:
- Medical device production — manufacturing execution systems (MES), process control software, automated inspection systems, labeling systems, packaging automation
- Medical device quality systems — electronic document management, quality event tracking (CAPA, complaints, nonconformances), training management, calibration management, supplier quality systems, audit management
This includes Software-as-a-Service (SaaS) solutions, cloud-hosted platforms, and on-premises systems.
Out of Scope
The guidance explicitly does not apply to:
- Software in a Medical Device (SiMD) — embedded firmware or software that is part of the device itself
- Software as a Medical Device (SaMD) — standalone software that meets the definition of a medical device
- Software used in clinical trials (governed by separate GCP requirements)
- Software used in non-GxP business functions (e.g., general HR or finance systems with no quality impact)
SiMD and SaMD continue to follow design validation requirements under 21 CFR 820.30 (now incorporated into QMSR via ISO 13485 Clause 7.3) and are covered by other FDA guidance documents.
The CSA Framework: Five Core Steps
The final guidance outlines a structured approach with five interconnected steps.
Step 1: Determine Intended Use
Before any testing begins, you must clearly define what the software is supposed to do within your production or quality system. This is not a general product description — it is a specific, operational statement of how you use the software.
Key elements:
- Identify the software function or feature: Be specific (e.g., "automated pass/fail determination for dimensional inspection of catheter shaft diameter")
- Define the intended use within your process: How does this function fit into your production or quality workflow?
- Identify the process: Which manufacturing or quality process does this support?
- Document the connection: Show traceability between the intended use and the process step
This step creates the foundation for everything that follows. If you cannot articulate the intended use, you cannot determine the risk, and you cannot select appropriate assurance activities.
Step 2: Determine Risk
This is the heart of CSA. For each identified intended use, you evaluate the risk — specifically, the potential impact on patient safety and product quality if the software function does not perform as intended.
FDA uses a two-category framework:
| Risk Level | Definition | Examples |
|---|---|---|
| High process risk | Software function failure could directly affect device safety, efficacy, or quality | Sterilization cycle parameters, dose calculations, final product release criteria, alarm limits on life-support equipment |
| Not high process risk | Software function failure would not directly affect device safety or quality but may affect operational efficiency | Training record formatting, report generation, non-critical data display |
The risk determination must be documented with a rationale. You are expected to apply critical thinking rather than defaulting every function to "high risk" to play it safe; FDA explicitly discourages over-classification.
Important nuance: risk is evaluated at the function level, not the system level. A single system may have both high-risk and low-risk functions, and each is treated independently.
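The function-level assessment described above can be captured as structured data. A minimal sketch in Python (the field names and example entries are illustrative assumptions; the guidance does not prescribe any schema):

```python
from dataclasses import dataclass

HIGH = "high process risk"
NOT_HIGH = "not high process risk"

@dataclass
class FunctionRisk:
    function: str      # the specific software function or feature (Step 1)
    intended_use: str  # operational statement of how you use it (Step 1)
    risk_level: str    # HIGH or NOT_HIGH (Step 2)
    rationale: str     # documented justification, as the guidance expects

# One hypothetical MES assessed function by function -- a single system
# can contain both risk levels, and each function is treated independently.
mes_assessment = [
    FunctionRisk(
        function="Sterilization cycle parameter control",
        intended_use="Sets time and temperature for terminal sterilization",
        risk_level=HIGH,
        rationale="Failure could directly compromise device sterility",
    ),
    FunctionRisk(
        function="Shift-summary report generation",
        intended_use="Formats production counts for management review",
        risk_level=NOT_HIGH,
        rationale="Failure affects efficiency only; no direct quality impact",
    ),
]

# Only the high-risk subset needs scripted testing (Step 3).
high_risk_functions = [f.function for f in mes_assessment if f.risk_level == HIGH]
```

Keeping the rationale next to each classification makes the "thoughtful risk rationale" an inspector looks for part of the record itself rather than a separate document.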
Step 3: Determine Appropriate Assurance Activities
Once risk is categorized, you select assurance activities proportional to that risk.
For high process risk functions:
- Scripted testing — Detailed test protocols with predefined inputs, expected outputs, and pass/fail criteria. This is the closest to traditional CSV but focused only on the high-risk functions
- Boundary and stress testing — Testing at the edges of operational parameters
- Regression testing — Ensuring changes do not degrade previously validated functions
For functions that are not high process risk:
- Unscripted/exploratory testing — Less formal testing where testers explore the software's behavior without rigidly predefined scripts. Testers use their knowledge and experience to probe the software
- Vendor evidence — Leveraging the vendor's own testing documentation, certifications, and quality records (if the vendor has been assessed)
- Automated testing — Using automated test tools to verify function behavior
- Continuous monitoring — Ongoing surveillance of software performance in production
The following table, adapted from FDA's guidance (Table 1), summarizes the assurance activity selection:
| Assurance Activity | High Process Risk | Not High Process Risk |
|---|---|---|
| Scripted testing | Recommended | Optional |
| Unscripted/exploratory testing | As supplement | Recommended |
| Vendor evidence/supplier assessment | As supplement | Recommended |
| Automated testing | Recommended where feasible | Recommended where feasible |
| Continuous monitoring | Recommended | As needed |
| Leveraging existing process controls | Recommended | Recommended |
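The selection logic in the table above can be expressed as a simple lookup. A sketch, assuming a two-level risk model as in the guidance (the activity labels paraphrase the table; this mapping is illustrative, not an official schema):

```python
# Assurance activities by risk level, mirroring the table above.
ASSURANCE_ACTIVITIES = {
    "high process risk": {
        "recommended": ["scripted testing",
                        "automated testing (where feasible)",
                        "continuous monitoring",
                        "leveraging existing process controls"],
        "as_supplement": ["unscripted/exploratory testing",
                          "vendor evidence/supplier assessment"],
    },
    "not high process risk": {
        "recommended": ["unscripted/exploratory testing",
                        "vendor evidence/supplier assessment",
                        "automated testing (where feasible)",
                        "leveraging existing process controls"],
        "optional": ["scripted testing"],
        "as_needed": ["continuous monitoring"],
    },
}

def recommended_activities(risk_level: str) -> list[str]:
    """Activities the table marks as recommended for the given risk level."""
    return ASSURANCE_ACTIVITIES[risk_level]["recommended"]
```

Encoding the selection this way makes the risk-to-activity decision repeatable across assessors, which supports the consistency an inspector will probe.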
Step 4: Determine Appropriate Records
FDA's documentation expectations under CSA are deliberately leaner than CSV, but they still exist. The guidance states that records should "retain sufficient details of the assurance activity to serve as a baseline for improvements or as a reference point if issues occur" and that documentation "need not include more evidence than necessary to show that the software feature, function, or operation performs as intended for the risk identified."
Typical CSA documentation includes:
| Document | Purpose | Typical Size vs. CSV |
|---|---|---|
| Validation plan | Scope, approach, risk assessment methodology, roles | Similar structure, more concise |
| Risk assessment | Function-level risk categorization with rationale | New — not always present in CSV |
| Test records | Evidence of testing performed | Significantly smaller — no unnecessary screenshots |
| Validation summary report | Results, conclusions, residual risk | Similar structure, more focused |
| Trace matrix | Intended use → risk → test evidence | Streamlined vs. CSV's exhaustive matrices |
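The streamlined trace matrix in the last row above can be as simple as a list of records linking intended use to risk to evidence. A sketch with hypothetical document identifiers (the column names and IDs are invented for illustration):

```python
# A lean CSA trace matrix: function -> risk -> assurance activity -> evidence.
trace_matrix = [
    {
        "function": "Final product release criteria check",
        "risk": "high process risk",
        "assurance": "scripted test protocol (hypothetical ID TP-014)",
        "evidence": "Executed protocol and summary in VSR-009",
    },
    {
        "function": "Training record formatting",
        "risk": "not high process risk",
        "assurance": "exploratory test session",
        "evidence": "Session notes ES-003",
    },
]

def untraced(matrix):
    """Flag functions missing evidence -- every assessed function needs
    a reference point if issues occur later."""
    return [row["function"] for row in matrix if not row["evidence"]]
```

A completeness check like `untraced()` is a cheap way to verify, before a periodic review or inspection, that no assessed function lost its evidence link.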
Step 5: Manage Changes
When software changes are made — updates, patches, configuration changes, migration to new versions — the same CSA principles apply. Re-assess risk for the changed function, determine appropriate assurance activities, and document accordingly.
For devices with approved PMAs or Humanitarian Device Exemptions (HDEs), FDA provides specific guidance on how to determine whether a software change requires a 30-day supplement, can be handled through an annual report, or falls somewhere in between — based on whether the change may affect safety or effectiveness.
Implementation Roadmap
Transitioning from CSV to CSA is not an overnight switch. It requires cultural change, updated SOPs, and training. Here is a practical phased approach:
Phase 1: Foundation (Months 1-2)
- Read the guidance thoroughly: Obtain the final guidance document from FDA's website
- Establish a CSA task force: Include quality, IT, manufacturing, and regulatory representatives
- Audit your current validation inventory: List all validated systems and classify each as production or quality system software
- Develop risk assessment criteria: Define what constitutes "high process risk" vs. "not high process risk" for your specific products and processes
- Draft or update SOPs: Create a CSA procedure that replaces or supplements your existing CSV SOP
Phase 2: Pilot (Months 3-4)
- Select 1-2 low-complexity systems for your first CSA validations (e.g., a training management system or document review workflow)
- Apply the full CSA framework: Intended use → risk → assurance activities → records
- Conduct unscripted testing: Train your team on exploratory testing techniques
- Document lessons learned: What worked? Where did the team struggle? What templates need revision?
Phase 3: Expansion (Months 5-8)
- Extend CSA to medium-complexity systems: Manufacturing execution systems, quality event tracking
- Develop vendor assessment procedures: Establish criteria for accepting vendor evidence in lieu of duplicate testing
- Train the broader organization: Ensure production operators, quality engineers, and management understand the CSA philosophy
- Build CSA into change control: Every software change now follows the CSA framework
Phase 4: Full Adoption (Months 9-12)
- Apply CSA to all new system validations: No new CSV protocols
- Migrate legacy validations on a risk-prioritized basis: As legacy systems come up for periodic review or significant changes, transition them to CSA
- Conduct internal audit of CSA compliance: Verify that your CSA records meet FDA's expectations
- Prepare for FDA inspection readiness: Ensure investigators can see the CSA rationale, risk assessments, and lean documentation
Vendor Assessment and SaaS Considerations
One of the most impactful changes in CSA is the acceptance of vendor evidence. Under traditional CSV, companies often re-tested commercial off-the-shelf (COTS) software from scratch, duplicating the vendor's own testing. CSA allows you to rely on vendor-provided evidence — if you have adequately assessed the vendor.
Vendor Assessment Framework
| Assessment Area | What to Evaluate | Evidence to Collect |
|---|---|---|
| Quality management system | Does the vendor have a certified QMS (ISO 9001, ISO 13485, SOC 2)? | Certificates, audit reports |
| Software development practices | SDLC methodology, testing practices, change management | SDLC documentation, test summary reports |
| Validation documentation | Does the vendor validate their own software? What testing do they perform? | Vendor validation summary, test protocols |
| Security and data integrity | Access controls, encryption, audit trails, backup/recovery | Security documentation, penetration test results |
| Regulatory compliance | Does the vendor support 21 CFR Part 11 compliance? | Part 11 compliance statements, feature documentation |
| Customer support and issue resolution | How quickly are bugs fixed? How are updates communicated? | SLA documentation, customer references |
For SaaS solutions specifically, also evaluate:
- Data hosting location and compliance with applicable data protection regulations
- Uptime SLAs and disaster recovery capabilities
- Update deployment practices — are updates forced? Can you control timing?
- API security and integration architecture
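The assessment areas above lend themselves to a simple completeness record. A sketch, with a hypothetical vendor and invented evidence entries (the area names mirror the framework table; nothing here is a prescribed format):

```python
# Illustrative vendor assessment record; evidence strings are hypothetical.
vendor_assessment = {
    "vendor": "ExampleSoft (hypothetical)",
    "areas": {
        "quality management system": "ISO 9001 certificate on file",
        "software development practices": "SDLC summary reviewed",
        "validation documentation": "Vendor test summary reviewed",
        "security and data integrity": "",   # not yet collected
        "regulatory compliance": "Part 11 compliance statement on file",
        "customer support and issue resolution": "SLA reviewed",
    },
}

def open_gaps(assessment: dict) -> list[str]:
    """Areas still lacking collected evidence. All gaps should be closed
    before vendor evidence is accepted in lieu of duplicate testing."""
    return [area for area, ev in assessment["areas"].items() if not ev]
```

Running `open_gaps()` before relying on vendor evidence addresses the compliance gap described later: accepting vendor data without a documented assessment.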
CSA and the QMSR Connection
The timing of the CSA guidance is not coincidental. On February 2, 2026, the FDA's Quality Management System Regulation (QMSR) took effect, replacing the Quality System Regulation (QSR) and incorporating ISO 13485:2016 by reference into 21 CFR Part 820. The February 2026 update to the CSA guidance improved alignment with QMSR terminology and requirements.
Key intersections:
- ISO 13485 Clause 4.1.6: Requires validation of computer software used in the QMS — CSA provides the methodology
- ISO 13485 Clause 7.5.6: Requires validation of processes for production and service provision where output cannot be verified — CSA applies to the software controlling such processes
- QMSR inspection framework: Under Compliance Program 7382.850, FDA investigators will evaluate software assurance records using CSA expectations
- 21 CFR Part 11: Remains in effect — electronic records and electronic signatures still require Part 11 compliance, but CSA defines how you validate the systems that generate them
Common Mistakes to Avoid
1. Treating CSA as "Less Validation"
CSA is not a license to skip validation. It is a smarter allocation of validation effort. High-risk functions still receive thorough scripted testing. The reduction in effort applies to low-risk functions where extensive documentation adds no quality value.
2. Inadequate Risk Justification
If you classify a function as "not high process risk," you must document why. FDA expects to see thoughtful risk rationale, not blanket declarations. An inspector will question a risk assessment that categorizes every function as low risk.
3. Skipping Vendor Assessment
Accepting vendor evidence without assessment is a compliance gap. You must demonstrate that you evaluated the vendor's quality practices before relying on their testing data.
4. Not Training Your Team
CSA requires critical thinking and professional judgment — skills that many validation teams have not needed under the rigid CSV framework. Training on exploratory testing techniques, risk assessment methodology, and CSA documentation expectations is essential.
5. Ignoring Legacy Systems
You do not need to retroactively re-validate every existing system under CSA immediately. But as legacy systems undergo changes or periodic review, transition them to the CSA framework. Have a documented plan for this migration.
Comparison: Traditional CSV Protocol vs. CSA Approach
| Element | Traditional CSV | CSA Approach |
|---|---|---|
| Planning document | 50-100+ page validation plan | 10-20 page streamlined plan with risk assessment |
| Risk assessment | Often a separate FMEA, not integrated | Integrated, function-level risk analysis |
| Test scripts | Every function scripted with screenshots | Scripted for high-risk; exploratory for low-risk |
| Test execution time | Weeks to months per system | Days to weeks per system |
| Documentation volume | Hundreds to thousands of pages | Tens to low hundreds of pages |
| Vendor testing | Typically duplicated by user | Leveraged where vendor is qualified |
| Revalidation triggers | Any change triggers full revalidation | Risk-based revalidation scope |
| Inspector experience | Reviews large binders of test evidence | Reviews focused risk rationale and key evidence |
| Cost per system | $50,000-$200,000+ | $15,000-$80,000 (estimated industry range) |
FAQ
Does CSA replace 21 CFR Part 11? No. Part 11 remains in effect for electronic records and electronic signatures. CSA defines how you validate the systems that must comply with Part 11. The two work together — CSA is the validation methodology; Part 11 is the compliance requirement.
Do I need to re-validate my existing systems under CSA immediately? No. FDA does not expect retroactive re-validation. However, as existing systems undergo changes, periodic review, or are migrated to new versions, you should apply CSA principles going forward.
Is CSA mandatory or optional? FDA guidance documents are not legally binding; they represent the Agency's current thinking. If your traditional CSV approach meets the regulatory requirements, you may continue to use it. But FDA inspectors are now trained on CSA expectations, and the guidance makes clear that the Agency considers CSA the preferred approach.
Does CSA apply to pharmaceutical companies? The September 2025 final guidance specifically addresses medical device production and quality system software. FDA released a separate final guidance for pharmaceutical quality systems in February 2026. The principles are the same, but the specific documents differ.
Can we use automated testing tools under CSA? Yes. FDA explicitly encourages the use of automated testing tools, continuous integration/continuous deployment (CI/CD) pipelines, and modern test management systems. Automated testing can provide more consistent and thorough coverage than manual scripted testing.
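As a concrete illustration, an automated check for a high-risk function might look like the following. The function, specification limits, and test names are all invented for this example; the point is that scripted boundary testing can run automatically on every change:

```python
# Hypothetical pass/fail determination for a dimensional inspection.
# The specification limits below are invented for illustration.
LOWER_MM, UPPER_MM = 1.95, 2.05

def shaft_diameter_pass(measured_mm: float) -> bool:
    """Return True when the measured diameter is within specification."""
    return LOWER_MM <= measured_mm <= UPPER_MM

# Automated checks, including the boundary testing CSA recommends for
# high-risk functions. Runnable with pytest or as plain assertions.
def test_within_spec():
    assert shaft_diameter_pass(2.00)

def test_boundaries():
    assert shaft_diameter_pass(LOWER_MM) and shaft_diameter_pass(UPPER_MM)

def test_out_of_spec():
    assert not shaft_diameter_pass(1.94)
    assert not shaft_diameter_pass(2.06)
```

Wired into a CI/CD pipeline, tests like these re-execute on every software change, which also supports the risk-based regression testing that Step 5 calls for.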
What if our Notified Body is not familiar with CSA? CSA is an FDA initiative. For EU market access, Notified Bodies may expect validation approaches aligned with EU GMP Annex 11 and GAMP 5. However, the risk-based principles underlying CSA are consistent with GAMP 5's approach. Document your methodology clearly and be prepared to explain how CSA satisfies ISO 13485 Clause 4.1.6 requirements.
How does CSA handle AI and machine learning systems? The CSA guidance covers the validation of software used in production and quality systems — including AI/ML tools used for manufacturing or quality purposes. For AI/ML that is itself a medical device (SaMD), separate FDA guidance applies, including the Predetermined Change Control Plan (PCCP) framework.
What records should we retain for CSA? FDA recommends retaining sufficient detail to serve as a baseline for improvements and as a reference if issues occur. This typically includes: the validation plan, risk assessment, test records (scripted and unscripted), validation summary report, and a traceability record linking intended use to risk to evidence.
How long does a CSA validation take compared to CSV? Industry experience from early adopters suggests CSA can reduce validation timelines by 40-60% for systems with a mix of high and low-risk functions. The savings come primarily from reduced documentation for low-risk functions and leveraging vendor evidence. High-risk functions still require thorough scripted testing.
Does CSA eliminate the need for IQ/OQ/PQ? Not entirely. Installation Qualification (IQ) remains important to verify that software is correctly installed in your environment. Operational Qualification (OQ) and Performance Qualification (PQ) concepts map to CSA's assurance activities but are applied based on risk. High-risk functions may still follow structured OQ/PQ-style protocols; low-risk functions may be verified through exploratory testing or vendor evidence.