Post-Market Surveillance for Medical Devices: The Complete Guide
Everything you need to know about post-market surveillance (PMS) for medical devices — regulatory requirements under FDA and EU MDR/IVDR, PMS plan templates, PMCF studies, PSURs, vigilance reporting, and practical guidance for building a PMS system from scratch.
What Is Post-Market Surveillance?
Post-market surveillance (PMS) is the systematic process of collecting, analyzing, and acting on information about a medical device after it has been placed on the market. It is not optional. Every major regulatory framework — FDA, EU MDR, IVDR, Health Canada, TGA, NMPA — mandates PMS in some form.
The purpose is straightforward: clinical trials and bench testing, no matter how thorough, cannot capture every issue that surfaces when a device is used by thousands or millions of patients across diverse clinical settings, by operators with varying skill levels, over years of real-world use.
PMS is how you find out that your surgical stapler misfires at a higher rate in humid environments. That your IVD reagent degrades faster than expected when stored alongside certain other chemicals. That clinicians are using your device off-label in a way that creates a safety risk you never anticipated.
It is also how you demonstrate to regulators — on an ongoing basis — that your device's benefit-risk profile remains acceptable throughout its commercial life.
Why PMS Matters More Than Ever
For decades, PMS was treated as a compliance checkbox by many manufacturers. Collect complaints, file the paperwork, move on. That era is over.
Three forces have elevated PMS from back-office compliance function to strategic priority:
EU MDR/IVDR raised the bar dramatically. The Medical Device Regulation (EU 2017/745) and In Vitro Diagnostic Regulation (EU 2017/746) introduced requirements that are qualitatively different from the old Medical Device Directives. PMS is no longer "collect complaints and report serious incidents." It is a proactive, continuous system requiring structured plans, periodic safety reports, clinical follow-up studies, and trend analysis — all feeding back into risk management and clinical evaluation.
FDA is increasingly data-driven. Through initiatives like the National Evaluation System for health Technology (NEST), the Sentinel System, and greater use of real-world evidence (RWE), FDA expects manufacturers to actively monitor device performance using broader data sources than complaint files alone.
Enforcement is real. EU Notified Bodies are auditing PMS systems with teeth. FDA warning letters increasingly cite inadequate complaint handling and failure to report. MDCG guidance documents have clarified expectations to a level of specificity that leaves little room for ambiguity.
Regulatory Requirements: A Side-by-Side Comparison
The following table summarizes PMS requirements across the three most impactful regulatory frameworks for medical device manufacturers.
| Requirement | FDA (US) | EU MDR (2017/745) | EU IVDR (2017/746) |
|---|---|---|---|
| Legal basis | 21 CFR 803 (MDR), 806 (Corrections/Removals), 822 (Post-market Surveillance) | Articles 83–86, Annex III | Articles 78–81, Annex III |
| PMS plan required? | Required for certain devices under 21 CFR 822; complaint procedures required for all | Yes — mandatory for all devices (Article 84) | Yes — mandatory for all devices (Article 79) |
| Periodic safety reporting | Annual reports for PMA devices (21 CFR 814.84) | PSUR for Class IIa/IIb/III (Article 86); PMS report for Class I (Article 85) | PSUR for Class C/D (Article 81); PMS report for Class A/B (Article 80) |
| PSUR frequency | Not applicable (annual reports serve a different function) | At least every 2 years for Class IIa; annually for Class IIb/III | At least annually for Class C/D (no PSUR for Class A/B) |
| PSUR submitted to | N/A | Notified Body; available to competent authority | Notified Body; available to competent authority |
| Clinical follow-up | Post-approval studies for PMA devices (condition of approval) | PMCF mandatory for Class III, IIb implantables; expected for most Class IIa/IIb | PMPF mandatory for Class C/D; expected for most Class B |
| Vigilance reporting | MDR reports to FDA within 30 calendar days (5 working days when remedial action is needed to prevent unreasonable risk of substantial harm) | Serious incidents to competent authority: 15 days (serious public health threat: 2 days; death/unanticipated serious deterioration: 10 days) | Same timelines as MDR |
| Trend reporting | No dedicated trend-report regulation; trends are summarized in PMA annual reports, and 5-day reports (21 CFR 803.53) apply when remedial action is needed | Article 88 — statistically significant increase in frequency/severity | Article 83 — same approach as MDR |
| Database | MAUDE | EUDAMED (when fully functional) | EUDAMED |
| Corrections/Removals | 21 CFR Part 806 — report to FDA within 10 working days | Field Safety Corrective Actions (FSCA) — reported through vigilance system | FSCA — same as MDR |
FDA Requirements in Detail
The FDA's PMS framework is spread across multiple regulations:
21 CFR Part 803 — Medical Device Reporting (MDR): Manufacturers, importers, and device user facilities must report device-related deaths, serious injuries, and malfunctions. Manufacturers must report within 30 calendar days of becoming aware. Five-day reports are required when the manufacturer becomes aware of events that necessitate remedial action to prevent an unreasonable risk of substantial harm, or when FDA has requested 5-day reports.
21 CFR Part 806 — Corrections and Removals: When a manufacturer corrects or removes a device to reduce a health risk or remedy a violation, they must report to FDA within 10 working days. This is separate from the MDR reporting obligation.
21 CFR Part 822 — Post-market Surveillance: FDA can order post-market surveillance studies for certain Class II or III devices — those whose failure would be reasonably likely to have serious adverse health consequences, that are expected to have significant use in pediatric populations, that are intended to be implanted for more than one year, or that are life-sustaining or life-supporting and used outside a device user facility. These are specific ordered studies, not the broader PMS system that EU MDR requires.
Annual Reports (21 CFR 814.84): PMA holders must submit annual reports summarizing new clinical data, device modifications, manufacturing changes, and complaint/MDR trends.
Practical note: FDA does not require a standalone "PMS Plan" document for most devices. But a well-organized manufacturer will have one anyway — it is the backbone for ensuring complaint handling, MDR reporting, CAPA integration, and trend analysis are happening systematically rather than ad hoc.
EU MDR Requirements in Detail
The EU MDR treats PMS as a continuous, proactive system that feeds into risk management, clinical evaluation, and the overall quality management system.
Article 83 — PMS System: Manufacturers must establish, document, implement, maintain, update, and continually improve a PMS system. It must be proportionate to the risk class and device type.
Article 84 — PMS Plan: Every device must have a PMS plan. This is a specific, documented plan — not a generic QMS procedure. The plan must address the device (or device group) specifically and include:
- A proactive and systematic process to collect data from the market
- Methods and processes for data collection (complaints, literature, registries, etc.)
- Effective indicators and methods to analyze collected data
- Methods to investigate complaints and market experiences
- Methods for managing events subject to trend reporting
- Methods for communication with competent authorities, Notified Bodies, economic operators, and users
- Reference to PMCF plan (for applicable devices)
- Procedures for the identification and implementation of corrective and preventive actions
- Effective tools to trace and identify devices for which corrective actions might be necessary
Article 85 — PMS Report (Class I): A straightforward summary report updated when necessary, available to the competent authority on request.
Article 86 — PSUR (Class IIa, IIb, III): Periodic Safety Update Reports must include:
- Conclusions of the benefit-risk determination
- Main findings of the PMCF
- Volume of sales and estimated patient population
- Summary of changes to the state of the art
- Frequency and trend of incidents, FSCA, complaints
- Updated benefit-risk analysis with rationale for any corrective actions
IVDR-Specific Considerations
The IVDR mirrors the MDR structure but uses different device classification (Class A, B, C, D) and introduces the concept of Post-Market Performance Follow-up (PMPF) instead of PMCF. The focus is on diagnostic performance — sensitivity, specificity, positive/negative predictive values — rather than clinical outcomes per se.
For high-risk IVDs (Class D companion diagnostics, blood screening devices), the scrutiny is intense. PMPF studies may need to demonstrate ongoing diagnostic accuracy against evolving reference methods and pathogen variants.
Global Vigilance Requirements Beyond FDA and EU
If you market your device globally, you must comply with vigilance reporting obligations in every jurisdiction. The timelines, forms, and reporting criteria vary significantly. The following table expands the comparison to the most commercially important regulatory authorities beyond FDA and EU MDR.
| Authority | Jurisdiction | Serious Incident Timeline | Other Incident Timeline | Reporting Portal/Form | Key Notes |
|---|---|---|---|---|---|
| MHRA | United Kingdom | 15 calendar days | Case-by-case assessment | Manufacturer's Online Reporting Environment (MORE) | As of June 2025, the UK PMS Statutory Instrument introduced stricter timelines (reduced from 30 to 15 days). PMSR/PSUR must be provided within 3 working days upon MHRA request. Manufacturers must also notify MHRA of FSCAs conducted outside GB if the same device is on the GB market. |
| TGA | Australia | 10 calendar days (death/serious injury); 30 calendar days (other) | 30 calendar days | IRIS (Incident Reporting Information System) | TGA mandates reporting of events that occur outside Australia if the device is marketed in Australia. Manufacturers must also comply with the Australian Therapeutic Goods Act 1989 and maintain a complaint handling system per TGA requirements. |
| Health Canada | Canada | 10 calendar days (death or serious deterioration) | 30 calendar days (incidents that could lead to death/serious deterioration if repeated) | Medical Device Problem Report system / Canada Vigilance Program | Timelines apply regardless of whether the event occurred in Canada or in another market, provided it is the same device version or model. MDSAP QMS certification is required for manufacturers. |
| BfArM | Germany | Per EU MDR timelines (2/10/15 days) | Per EU MDR | BfArM-specific reporting form (national form required until EUDAMED is fully operational) | BfArM is the German competent authority. Since May 2021, all vigilance reporting follows MDR timelines, but BfArM requires its own national form rather than generic EU forms. The MPDG (German Medical Devices Implementation Act) governs local implementation. |
| ANSM | France | Per EU MDR timelines | Per EU MDR | ANSM portal (French language) | France requires vigilance reporting via the ANSM website. Forms and interface are in French; an online translator may be needed. Reports cover both FSCAs and adverse events. |
| PMDA | Japan | 15 days (death); 30 days (serious injury/malfunction) | Annual reporting | PMDA adverse event reporting system | Japan's Pharmaceutical and Medical Device Act (PMD Act) governs reporting. Foreign manufacturers must report through their Marketing Authorization Holder (MAH) in Japan. |
| NMPA | China | Per NMPA Order No. 1 timelines | Varies by severity | China National Medical Products Administration portal | Reporting is typically handled through the local agent or Registration Certificate holder. Timelines and requirements have been tightening as NMPA aligns with IMDRF guidance. |
| Swissmedic | Switzerland | Per MedDO timelines (aligned with EU MDR) | Per MedDO | Swissmedic portal | Switzerland's Medical Devices Ordinance (MedDO) is closely aligned with EU MDR. FSN database is publicly searchable with excellent search functionality. |
Practical note on global harmonization: The International Medical Device Regulators Forum (IMDRF) has developed harmonized guidance on adverse event terminology, National Competent Authority Report (NCAR) exchange criteria, and standardized reporting forms. While these are not legally binding, they provide a useful framework for building a single PMS system that can serve multiple jurisdictions. The NCAR program, fully implemented since April 2016, facilitates rapid exchange of post-market safety information across regulators globally to speed up adoption of FSCAs in all affected geographies.
Key Guidance Documents and Standards for PMS
Navigating PMS requires familiarity with a substantial body of guidance documents. These are not all legally binding, but they represent the expectations that Notified Bodies and competent authorities will hold you to during audits. Missing or ignoring these documents is one of the most common reasons for audit findings.
EU Guidance Documents
| Document | Title | Status | Key Content |
|---|---|---|---|
| MDCG 2025-10 | Guidance on Post-Market Surveillance of Medical Devices and In Vitro Diagnostic Medical Devices | Published December 2025 | The most current and comprehensive EU PMS guidance. Describes the PMS system, PMS plan requirements, main PMS activities, and interactions with other QMS processes (risk management, clinical evaluation, SS(C)P updates). Supersedes earlier interpretive guidance on PMS system requirements. |
| MDCG 2022-21 | Periodic Safety Update Report (PSUR) | Published December 2022 | Detailed guidance on PSUR scope, structure, content, grouping rationale, frequency, data collection periods, and EUDAMED upload requirements. Contains the PSUR template in Annex I and data presentation tables in Annex II. The definitive reference for writing PSURs. |
| MDCG 2023-3 | Questions and Answers on Vigilance Terms and Concepts (MDR) | Published February 2023 | Clarifies key differences between MEDDEV 2.12/1 rev 8 and EU MDR vigilance requirements. Covers definitions of serious incidents, serious public health threats, expected/unexpected events, and trend reporting triggers. Essential reading for anyone making reportability decisions under MDR. |
| MDCG 2020-7 | PMCF Plan Template | Published April 2020 | Template and guidance for manufacturers and Notified Bodies on structuring PMCF plans. |
| MDCG 2020-8 | PMCF Evaluation Report Template | Published April 2020 | Template for presenting PMCF evaluation results, feeding into clinical evaluation and PSUR. |
| MDCG 2019-15 rev. 1 | Guidance Notes for Manufacturers of Class I Medical Devices | Published | Although targeted at Class I manufacturers under MDR, contains PMS and vigilance guidance generally relevant to all classes. |
| MEDDEV 2.12/1 rev. 8 | Guidelines on a Medical Devices Vigilance System | Published January 2013 | The legacy EU vigilance guidance. Not updated for MDR, but still a useful reference for understanding the vigilance framework, incident classification decision trees, and MIR (Manufacturer Incident Report) form structure. MDCG 2023-3 has largely superseded the interpretive content. |
| MEDDEV 2.12/2 rev. 2 | Post-Market Clinical Follow-Up Studies | Published January 2012 | Legacy guidance on PMCF study design and conduct. Useful for understanding circumstances where a PMCF study is indicated, study elements, and the Notified Body's role. |
International Standards
| Standard | Title | Key Content |
|---|---|---|
| ISO/TR 20416:2020 | Medical Devices -- Post-Market Surveillance for Manufacturers | The most important PMS standard. Provides a comprehensive framework for manufacturers to establish, implement, and maintain PMS processes. Covers PMS plan structure (with a recommended section breakdown), data collection methods (including questionnaire design guidance), analysis approaches, measurable criteria, alert and action levels, and feedback into product realization, risk management, and improvement processes. Referenced by MDCG 2025-10 as a key supporting standard. A revision (ISO/AWI TR 20416) is currently in development. |
| ISO 13485:2016 | Medical Devices -- Quality Management Systems | Clause 8.2.1 (Feedback) and Clause 8.2.2 (Complaint Handling) define the QMS requirements that the PMS system must implement. Clause 8.2.3 (Reporting to Regulatory Authorities) covers vigilance obligations within the QMS. |
| ISO 14971:2019 | Application of Risk Management to Medical Devices | Clause 10 (Production and Post-Production Activities) requires manufacturers to collect and review post-production information, evaluate its impact on residual risk, and take action when risks are no longer acceptable. PMS is the mechanism for fulfilling this clause. |
| ISO 14155:2020 | Clinical Investigation of Medical Devices for Human Subjects | Relevant when PMCF includes prospective clinical studies. Defines Good Clinical Practice requirements. |
IMDRF Guidance
The International Medical Device Regulators Forum (IMDRF) develops harmonized guidance used across member jurisdictions (FDA, EU, Health Canada, TGA, MHRA, PMDA, NMPA, and others):
- IMDRF/NCAR WG/N14 FINAL:2015 -- National Competent Authority Report Exchange Criteria and Report Form. Establishes the framework for rapid cross-border exchange of post-market safety information.
- IMDRF Adverse Event Terminology Working Group -- Developing a comprehensive, improved coding system for adverse events to improve signal detection accuracy and facilitate cross-jurisdictional communication.
- IMDRF/GRRP WG -- Guidance on regulatory reliance and recognition, increasingly relevant as authorities like MHRA and TGA offer abridged approvals based on other regulators' reviews.
PMS Plan: Structure and Template Guidance
A PMS plan is not a quality procedure. It is a device-specific (or device-group-specific) document that defines how you will collect, analyze, and act on post-market data for that particular device throughout its commercial life.
Recommended PMS Plan Structure
| Section | Contents |
|---|---|
| 1. Scope | Device identification (name, UDI-DI, classification, intended purpose), device group justification if applicable |
| 2. Roles and Responsibilities | Who owns PMS activities, who reviews data, escalation paths |
| 3. Data Collection | Specific data sources, collection methods, frequency, responsible parties for each source (see Data Sources section below) |
| 4. Data Analysis | Statistical methods, trend detection criteria, threshold triggers, signal detection methodology |
| 5. Complaint Handling | Cross-reference to complaint SOP, device-specific complaint categorization |
| 6. Vigilance Reporting | Cross-reference to vigilance SOP, device-specific reportability criteria |
| 7. Trend Reporting | Baseline rates, statistical methods for detecting significant increases |
| 8. PMCF/PMPF Plan Reference | Separate document or incorporated, study rationale and design |
| 9. PSUR/PMS Report Schedule | Frequency, due dates, distribution |
| 10. Corrective Actions | How PMS findings feed into CAPA, risk management updates, IFU revisions, design changes |
| 11. Reference to Related Documents | Risk management file, clinical evaluation report, technical documentation |
| 12. Review and Update Schedule | How often the plan itself is reviewed and updated |
Common Pitfalls in PMS Plans
Too generic. The most common mistake: writing one PMS plan that says "we collect complaints and review literature" and applying it to every device. Regulators — especially EU Notified Bodies — expect specificity. What literature databases? What search terms? What complaint categories are relevant to this device? What registries exist for this therapeutic area?
No analysis methodology. Collecting data is only half the job. The plan must define how you analyze it. What statistical methods? What constitutes a signal? What are the trigger thresholds for escalation? A plan that says "data is reviewed periodically" without specifying the analytical approach will not survive audit.
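One way to make "trigger thresholds for escalation" concrete in a PMS plan is to predefine alert and action levels per complaint category and evaluate each period's rate against them. The sketch below is illustrative only — the category names and threshold values are invented, and real limits must come from your risk analysis and historical baselines:

```python
# Illustrative PMS signal check with predefined alert and action levels.
# Categories and thresholds are hypothetical, not recommended values.

ALERT_LEVELS = {"misfire": 0.0010, "packaging": 0.0050}   # complaints per unit sold
ACTION_LEVELS = {"misfire": 0.0020, "packaging": 0.0100}

def evaluate_signal(category: str, complaints: int, units_sold: int) -> str:
    """Classify a period's complaint rate against alert/action levels."""
    rate = complaints / units_sold
    if rate >= ACTION_LEVELS[category]:
        return "ACTION"   # escalate: CAPA, reportability review, risk file update
    if rate >= ALERT_LEVELS[category]:
        return "ALERT"    # heightened monitoring, root-cause investigation
    return "OK"

print(evaluate_signal("misfire", complaints=9, units_sold=5000))  # 0.0018 -> ALERT
```

A plan that documents this kind of rule — even a simple one — is auditable in a way that "data is reviewed periodically" is not.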
Disconnect from risk management. PMS exists to update the benefit-risk analysis. The plan must explicitly link PMS outputs back to the risk management file. When PMS data reveals a new hazard or a changing probability/severity of a known risk, the risk management file must be updated — and the PMS plan should describe this feedback loop.
No PMCF rationale. Even if you conclude that no PMCF study is needed, the PMS plan must document the rationale. "No PMCF is required because the device is well-established and sufficient clinical data exists" is not adequate without supporting analysis.
Data Sources for Post-Market Surveillance
Effective PMS requires casting a wide net. The following data sources should be evaluated for relevance to every device.
Internal Data Sources
- Complaint files — The foundation of any PMS system. Every complaint must be evaluated for reportability, trend significance, and potential corrective action.
- Service and repair records — Often overlooked, but service records can reveal failure modes that users do not report as complaints.
- Returned product analysis — Physical examination of returned devices can identify manufacturing drift, material degradation, or design weaknesses.
- Sales and distribution data — Needed to calculate incidence rates (complaints per units sold, per procedures performed, etc.).
- Internal audit findings — Manufacturing nonconformances and process deviations can indicate issues that may manifest in the field.
- Customer training feedback — Misunderstanding of use instructions, common questions during training, and user errors observed during training sessions.
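Because raw complaint counts are meaningless without the denominator from sales data, a minimal normalization step — complaints per 10,000 units sold, for example — makes periods comparable. A sketch with invented figures:

```python
# Minimal sketch: normalizing complaint counts by sales volume so that
# reporting periods are comparable. All figures are invented.

def complaint_rate_per_10k(complaints: int, units_sold: int) -> float:
    """Complaints per 10,000 units sold (guard against zero denominator)."""
    if units_sold == 0:
        raise ValueError("units_sold must be > 0 to compute a rate")
    return 10_000 * complaints / units_sold

quarters = {"Q1": (14, 52_000), "Q2": (18, 61_000), "Q3": (35, 63_000)}
for q, (c, n) in quarters.items():
    print(f"{q}: {complaint_rate_per_10k(c, n):.2f} per 10k units")
```

The same idea applies to other denominators (procedures performed, patient-years of implantation) — the PMS plan should state which denominator is used and how it is estimated.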
External Data Sources
- MAUDE database (FDA) — Publicly available database of adverse event reports for medical devices marketed in the US. Search for your device type, predicate devices, and similar devices from competitors. Caveat: data quality is highly variable, and many reports lack detail. Use it for signal detection, not definitive conclusions.
- EUDAMED — The European database for medical devices. As of this writing, modules are still being deployed in phases. When fully operational, it will include vigilance data, clinical investigation data, and PMS data.

- Literature databases — PubMed, Embase, Cochrane Library. Define specific search strategies with MeSH terms and keywords. Document your searches (date, search string, results, screening criteria).
- Clinical registries — National and international registries relevant to your device type (e.g., National Joint Registry for orthopedic implants, INTERMACS for mechanical circulatory support). These often provide the highest-quality long-term outcomes data.
- Competent authority communications — Safety alerts, field safety notices, and guidance documents from FDA, Health Canada, TGA, MHRA, and EU competent authorities.
- Standards updates — Changes to applicable harmonized standards (ISO, IEC, ASTM, etc.) can indicate evolving understanding of device risks.
- Social media and online forums — Patient forums, clinician discussion boards, and social media can provide early signals of emerging issues. This is a supplementary source — do not base safety decisions solely on social media — but ignoring it entirely is a missed opportunity. MDCG 2022-21 explicitly mentions this.
- National health technology assessments — HTA reports can contain relevant safety and effectiveness data.
Data Source Matrix Example
| Data Source | Frequency | Owner | Analysis Method | Relevant For |
|---|---|---|---|---|
| Complaints | Continuous | Quality/RA | Individual assessment + quarterly trend | All devices |
| MAUDE search | Quarterly | RA | Keyword search, event coding | US-marketed devices |
| Literature review | Semi-annually | Clinical Affairs | Systematic search per protocol | All devices, especially higher risk |
| National registries | Annually | Clinical Affairs | Published registry reports + data request | Implantables, high-volume devices |
| Service records | Quarterly | Service/Quality | Failure mode categorization | Capital equipment, reusable devices |
| Social media monitoring | Quarterly | RA/Marketing | Keyword monitoring, sentiment analysis | Consumer-facing devices |
| Competent authority alerts | Monthly | RA | Review of relevant authority publications | All devices |
| Standards updates | Semi-annually | RA/R&D | Review of relevant standards committees | All devices |
PMCF and PMPF Studies
Post-Market Clinical Follow-up (PMCF)
PMCF is a continuous process that updates the clinical evaluation with post-market clinical data. Under EU MDR, it is a defined subset of PMS, and it is mandatory for Class III and Class IIb implantable devices. For Class IIa and other Class IIb devices, it is expected unless a justified rationale is documented for why it is not needed.
PMCF is not a single study — it is a plan that may include one or more of the following activities:
- PMCF studies — Prospective or retrospective clinical studies designed specifically to collect post-market clinical data. These can range from large multicenter registries to focused single-center observational studies.
- Literature reviews — Systematic, ongoing review of published clinical evidence.
- Registry participation — Contributing to and drawing data from clinical registries.
- Surveys — Structured surveys of clinicians or patients regarding device performance, usability, and outcomes.
- Analysis of real-world data — Insurance claims databases, electronic health records, or other real-world data sources.
The PMCF plan must address:
- General methods and procedures for the PMCF
- Specific methods and procedures, including study protocols if applicable
- Rationale for the chosen methods
- Reference to the clinical evaluation and risk management
- Specific objectives to address uncertainties, confirm safety and performance, detect emerging risks, and ensure continued acceptability of the benefit-risk profile
- Evaluation of side effects, contraindications, and residual risks
- Assessment of whether identified risks are representative of the patient population
- Detection of systematic misuse or off-label use
Practical tip: If your device is a well-established technology with extensive clinical literature (e.g., a standard wound dressing, a basic surgical instrument), your PMCF plan may legitimately consist of ongoing literature surveillance and complaint data analysis. You do not need to run a prospective clinical study for every device. But you do need to document the rationale and demonstrate that the chosen methods are sufficient to meet the PMCF objectives.
Post-Market Performance Follow-up (PMPF) for IVDs
PMPF is the IVDR equivalent of PMCF. The focus is on diagnostic performance rather than clinical outcomes:
- Ongoing verification of sensitivity, specificity, and predictive values against current clinical practice
- Monitoring for changes in target analytes (e.g., pathogen mutations that affect assay performance)
- Comparison with alternative diagnostic approaches as the state of the art evolves
- Assessment of performance across diverse patient populations and specimen types
For high-risk IVDs (Class D), PMPF is particularly critical. A companion diagnostic that loses sensitivity to a new biomarker variant can have immediate patient safety consequences.
PSUR: Periodic Safety Update Reports
The PSUR is the primary output document of the PMS system for Class IIa, IIb, and III devices under EU MDR (and Class B, C, D under IVDR).
PSUR Content Requirements
| PSUR Element | Description | Key Considerations |
|---|---|---|
| Volume of sales | Units sold, procedures performed, patient population estimate | Use multiple data sources; be transparent about estimation methodology |
| Incident summary | Number and type of incidents, FSCAs, complaints | Categorize by type; compare to previous reporting periods |
| Trend analysis | Statistical analysis of incident frequency and severity | Define baseline rates; use appropriate statistical methods (e.g., cumulative sum analysis, control charts) |
| Benefit-risk determination | Updated conclusion on whether benefit-risk remains acceptable | Must be a substantive analysis, not a rote statement |
| PMCF/PMPF findings | Summary of clinical follow-up data since last PSUR | Cross-reference PMCF evaluation report |
| State of the art | Summary of changes in clinical practice, standards, literature | Demonstrate awareness of the evolving evidence base |
| Corrective actions | Description of any corrective or preventive actions taken | Include status and effectiveness assessment |
| Conclusions | Overall PMS conclusions and actions for next reporting period | Must be specific and actionable |
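The control-chart approach mentioned for trend analysis can be as simple as a Shewhart p-chart: establish a baseline complaint proportion, compute 3-sigma limits for the current period's sample size, and flag periods that exceed the upper limit. A hedged sketch with invented data:

```python
# Sketch of a Shewhart p-chart check, one of the control-chart methods
# noted above for PSUR trend analysis. Baseline and period data are invented.
import math

def p_chart_limits(baseline_complaints: int, baseline_units: int, n: int):
    """3-sigma control limits for a complaint proportion at sample size n."""
    p_bar = baseline_complaints / baseline_units
    sigma = math.sqrt(p_bar * (1 - p_bar) / n)
    return max(0.0, p_bar - 3 * sigma), p_bar + 3 * sigma

lcl, ucl = p_chart_limits(baseline_complaints=120, baseline_units=100_000, n=8_000)
period_rate = 25 / 8_000   # complaints observed this period
print("out of control" if period_rate > ucl else "within limits")
```

CUSUM charts detect smaller, sustained shifts sooner than Shewhart charts; whichever method the PSUR uses, the baseline and its data collection period must be stated explicitly.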
PSUR Frequency
| Device Class (MDR) | Frequency | Submitted To |
|---|---|---|
| Class IIa | At least every 2 years | Notified Body (on request), available to competent authority |
| Class IIb | At least annually | Notified Body (implantable devices: submitted via EUDAMED; others: available on request) |
| Class III | At least annually | Notified Body via EUDAMED |
| Device Class (IVDR) | Frequency | Submitted To |
|---|---|---|
| Class C | At least annually | Notified Body |
| Class D | At least annually | Notified Body |

Class A and B IVDs do not require a PSUR; instead, a PMS report under IVDR Article 80 is prepared, updated when necessary, and made available to the competent authority on request.
Common question: "Can we group multiple devices into one PSUR?" Yes — if the devices share a common design, intended purpose, and risk profile, a single PSUR covering a device group is acceptable. But the PSUR must address each device variant within the group, and any device-specific safety signals must be discussed individually.
Vigilance Reporting
Vigilance reporting is the mechanism by which manufacturers report serious incidents and Field Safety Corrective Actions (FSCAs) to regulatory authorities. This is a non-negotiable, time-bound obligation — late or missing vigilance reports are among the most common enforcement triggers globally.
Reporting Timelines Comparison
| Event Type | FDA Timeline | EU MDR/IVDR Timeline |
|---|---|---|
| Death or unanticipated serious deterioration in health | 30 calendar days | 10 calendar days |
| Malfunction likely to cause death/serious injury | 30 calendar days | 15 calendar days |
| Serious public health threat | 5 working days (if FDA requests or remedial action needed) | 2 calendar days |
| Correction/Removal | 10 working days (21 CFR 806) | Without undue delay (as part of FSCA reporting) |
| Trend report | No dedicated FDA report type (trends surface through MDR analysis and annual reports) | Without undue delay after statistical significance established |
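When a single PMS system serves multiple jurisdictions, deadline arithmetic itself is a common failure point: EU MDR timelines run in calendar days, while 21 CFR 806 counts working days. The sketch below illustrates the distinction; the working-day function skips weekends only, and a real implementation would need jurisdiction-specific holiday calendars:

```python
# Illustrative deadline calculator for the timelines tabulated above.
# EU MDR uses calendar days; 21 CFR 806 uses working days. This naive
# working-day function skips weekends only -- public holidays would need
# a jurisdiction-specific calendar.
from datetime import date, timedelta

def calendar_deadline(aware: date, days: int) -> date:
    return aware + timedelta(days=days)

def working_day_deadline(aware: date, days: int) -> date:
    d, remaining = aware, days
    while remaining > 0:
        d += timedelta(days=1)
        if d.weekday() < 5:          # Mon-Fri count as working days
            remaining -= 1
    return d

aware = date(2025, 3, 3)                 # a Monday: date of awareness
print(calendar_deadline(aware, 15))      # EU MDR serious incident due date
print(working_day_deadline(aware, 10))   # 21 CFR 806 correction/removal due date
```

Note that the clock generally starts when the manufacturer becomes aware of the event, so date-of-awareness capture in the complaint system matters as much as the arithmetic.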
Complaint vs. Adverse Event Classification
Not every complaint is a reportable adverse event. Classifying incoming feedback correctly is one of the most error-prone steps in the PMS process.
| Category | Definition | Reportable? | Example |
|---|---|---|---|
| Complaint | Any written, electronic, or oral communication alleging a deficiency in identity, quality, durability, reliability, safety, effectiveness, or performance | Not inherently; requires evaluation | "The device arrived with a dented package" |
| Adverse Event (Incident) | Any malfunction or deterioration, inadequacy in information, or undesirable side-effect that led to or could have led to death or serious deterioration of health | Yes — if it meets the regulatory definition of a reportable event | "The device fractured during use, requiring surgical revision" |
| Near Miss | An event that could have led to an adverse event but did not, due to favorable circumstances | Reportable under EU MDR if it could have led to serious deterioration of health; case-by-case under FDA | "The alarm failed to sound, but the clinician noticed the parameter change visually" |
| Expected/Known Event | An adverse event that is already documented in the risk analysis and IFU | Reportable if frequency or severity exceeds expectations; always reportable for death | "Post-operative infection following implantation (documented in IFU at 2% rate), but observed rate is now 5%" |
| Customer Feedback (non-complaint) | General comments, feature requests, positive feedback | No | "We wish the display were larger" |
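The triage logic in the table above can be sketched as a first-pass decision function. This is a hypothetical illustration: the field names, categories, and threshold logic are assumptions, and every outcome still requires evaluation by a trained regulatory professional against the actual regulatory definitions.

```python
from dataclasses import dataclass

@dataclass
class Feedback:
    alleges_deficiency: bool   # identity, quality, safety, effectiveness...
    harm_occurred: bool        # death or serious deterioration of health
    harm_possible: bool        # could have led to serious deterioration
    documented_in_ifu: bool    # event already listed in risk analysis / IFU
    observed_rate: float = 0.0 # current event rate for expected events
    expected_rate: float = 0.0 # rate documented in the risk file / IFU

def triage(fb: Feedback) -> str:
    """Rough first-pass triage; every outcome still needs human review."""
    if not fb.alleges_deficiency:
        return "feedback"  # log it, but no complaint file is opened
    if fb.harm_occurred or fb.harm_possible:
        if fb.documented_in_ifu and not fb.harm_occurred:
            # Expected event: escalate only if frequency exceeds baseline
            if fb.observed_rate > fb.expected_rate:
                return "complaint + evaluate for reporting (trend)"
            return "complaint (expected event, within baseline)"
        return "complaint + evaluate for vigilance reporting"
    return "complaint (evaluate, likely not reportable)"
```

A function like this is useful as a consistency check inside a complaint database, not as a replacement for the documented reportability decision tree in your SOPs.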
Field Safety Corrective Actions (FSCA)
An FSCA is any corrective action taken by a manufacturer for technical or medical reasons to prevent or reduce the risk of a serious incident associated with a device already placed on the market. FSCAs include:
- Recalls — Return, modification, or destruction of devices
- Device modifications — Software updates, hardware retrofits, component replacements
- Information changes — Updated IFU, additional warnings, revised labeling
- Advice to users — Recommendations for additional clinical monitoring, modified use procedures
- Temporary cessation of use — Instructions to stop using the device pending resolution
Every FSCA must be accompanied by a Field Safety Notice (FSN) that communicates the issue and corrective action to affected customers. The FSN should be written in clear, non-technical language and must include:
- Device identification (model, lot/serial numbers, UDI)
- Description of the problem
- Hazard and risk assessment
- Corrective action being taken
- Actions required by the customer
- Contact information for questions
Real-world tip: The biggest time sink in FSCA management is not the regulatory reporting — it is the logistics. Tracking which customers received which lots, confirming receipt of FSNs, verifying that corrective actions were implemented, and closing out the FSCA. Build this infrastructure before you need it. Maintain clean distribution records. Have FSN templates ready. Define your FSCA process flow in advance, not during a crisis.
Trend Reporting and Signal Detection
Trend reporting is where most manufacturers struggle. It requires quantitative thinking, statistical methodology, and — critically — defined baselines against which to measure.
Setting Up Trend Monitoring
Define complaint categories. Use a standardized coding system aligned with regulatory event codes (e.g., FDA Product Problem Codes, MedDRA terms for patient outcomes). Consistency is essential — if different quality engineers code the same issue differently, your trend data is useless.
Establish baseline rates. For each complaint category, establish a baseline rate normalized to a meaningful denominator (units sold, procedures performed, patient-years of exposure). The baseline should be derived from the first 12–24 months of commercial experience, or from clinical trial data if commercial data is insufficient.
Select statistical methods. Common approaches include:
- Control charts (c-charts or u-charts) for monitoring complaint rates over time
- Cumulative sum (CUSUM) analysis for detecting small, sustained shifts in event rates
- Chi-square or Fisher's exact tests for comparing complaint rates across time periods, lots, or manufacturing sites
- Bayesian signal detection for more sophisticated analysis of rare events
Define trigger thresholds. What constitutes a "statistically significant increase"? EU MDR Article 88 requires trend reporting when there is a "statistically significant increase in the frequency or severity of incidents or expected undesirable side-effects." You must define what this means for your device — a two-sigma excursion? A doubling of baseline rate? A CUSUM signal exceeding a defined threshold?
Act on signals. A detected signal triggers an investigation. Not every signal is a confirmed safety issue — some will be explained by increased reporting, changes in denominator (sales growth), or coding artifacts. But every signal must be investigated and the conclusion documented.
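The baseline-and-threshold steps above can be sketched with a simple u-chart calculation. The numbers below are hypothetical; real control limits, denominators, and trigger rules belong in your PMS plan.

```python
import math

def u_chart_limits(baseline_complaints: int, baseline_units: int,
                   period_units: int, sigma: float = 3.0):
    """Control limits for a per-unit complaint rate (u-chart style).

    u_bar is the baseline rate; limits widen for smaller monitoring
    periods and tighten for larger ones.
    """
    u_bar = baseline_complaints / baseline_units
    spread = sigma * math.sqrt(u_bar / period_units)
    return u_bar, max(0.0, u_bar - spread), u_bar + spread

# Hypothetical baseline: 120 complaints over 60,000 units (rate 0.002).
# This quarter: 5,000 units shipped, 22 complaints logged.
u_bar, lcl, ucl = u_chart_limits(120, 60_000, 5_000)
rate = 22 / 5_000          # 0.0044 complaints per unit
signal = rate > ucl        # exceeds the upper limit -> investigate
```

Note that the same quarter with 18 complaints (rate 0.0036) would stay inside the three-sigma limit, which is exactly why the trigger threshold must be defined in advance rather than judged by eye.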
Trend Report Example Template
A trend report should include:
- Reporting period
- Data sources reviewed
- Total complaint volume and rate (normalized)
- Complaint breakdown by category
- Comparison to baseline rates and previous periods
- Statistical analysis results
- Identified signals and investigation outcomes
- Conclusions and recommended actions
- Comparison to competitors and similar devices (where data is available)
PMS vs. PMCF vs. PSUR: How They Fit Together
These three concepts are frequently confused. Here is how they relate.
| Aspect | PMS (Post-Market Surveillance) | PMCF (Post-Market Clinical Follow-up) | PSUR (Periodic Safety Update Report) |
|---|---|---|---|
| What it is | The overarching system for monitoring device safety and performance post-market | A specific subset of PMS focused on collecting clinical data to update the clinical evaluation | A periodic report summarizing PMS findings and conclusions |
| Scope | All post-market data: complaints, incidents, literature, registries, everything | Clinical data specifically: clinical studies, registries, literature with clinical endpoints | Summary of all PMS data and analysis for a defined reporting period |
| Document type | System (described in PMS plan) | Plan + evaluation report | Report |
| Required for | All devices, all classes | Class III, IIb implantable (mandatory study); expected for most IIa/IIb | Class IIa, IIb, III (Class I has simpler "PMS Report") |
| Feeds into | Risk management file, clinical evaluation, QMS, PSUR | Clinical evaluation report, benefit-risk analysis | Notified Body review, competent authority oversight |
| Update frequency | Continuous (the system is always active) | Defined in PMCF plan; study endpoints may span years | Class IIa: every 2 years; Class IIb/III: annually |
| Relationship | The parent system | A child activity under PMS | An output document of PMS |
Real-World Challenges and How to Address Them
Challenge 1: Complaint Data Quality
The most sophisticated trend analysis is worthless if the underlying complaint data is inconsistent, incomplete, or miscategorized.
Solution: Invest in complaint intake training. Define a standardized complaint intake form with mandatory fields. Use controlled vocabularies for event and device problem coding. Implement a secondary review step where a trained regulatory professional assesses every complaint for reportability and coding accuracy.
Challenge 2: Literature Review Burden
For a large device portfolio, systematic literature surveillance can be overwhelming. Each device needs a defined search strategy, regular execution, screening, and assessment.
Solution: Group devices by technology and intended use. Develop modular literature search strategies that cover a device family. Use reference management tools (EndNote, Covidence) with defined workflows. Consider outsourcing the search execution (but not the assessment) to a medical writing or regulatory services firm.
Challenge 3: Signal vs. Noise
More data does not automatically mean better PMS. It means more noise to sift through. Social media monitoring, in particular, can generate enormous volumes of irrelevant data.
Solution: Define clear signal detection criteria before you start monitoring. Establish qualification thresholds for each data source. For social media, use filtering tools with medical device-specific lexicons. Not every negative online review is a PMS signal — but a cluster of reports describing the same failure mode absolutely is.
Challenge 4: Global Harmonization
If you sell in the US, EU, Japan, and Australia, you are subject to at least four different vigilance reporting frameworks with different timelines, forms, and criteria.
Solution: Build a single PMS system with regulatory-specific reporting modules. Use a common complaint database with fields that capture the superset of required information across all jurisdictions. Define a decision tree for reportability that evaluates each event against all applicable criteria simultaneously. Consider the IMDRF guidance on adverse event terminology and reporting as a harmonization framework.
Challenge 5: Resource Constraints (Especially for Startups)
Startups with one or two people handling all of regulatory affairs, quality, and clinical do not have the bandwidth for the PMS system that a large medical device company can operate.
Solution: See the dedicated section below on setting up PMS from scratch.
Technology and Tools for PMS
The right tools do not replace good processes, but they dramatically reduce the manual burden.
Complaint and Vigilance Management
- eQMS platforms (Greenlight Guru, MasterControl, Qualio, Dot Compliance, QT9 QMS, SimplerQMS) — Most modern eQMS platforms include complaint management, CAPA integration, and some level of vigilance reporting workflow. For companies already using an eQMS, extending it for PMS is the path of least resistance.
- Dedicated vigilance platforms (AssurX, Sparta TrackWise) — Larger companies with global vigilance obligations may benefit from dedicated vigilance management systems with built-in regulatory intelligence (reporting timelines, forms, authority contacts).
eQMS Platform Comparison for PMS
Choosing the right eQMS is critical for effective PMS. The following comparison focuses on PMS-relevant capabilities:
| Platform | PMS Strengths | Best For | Considerations |
|---|---|---|---|
| Greenlight Guru | Purpose-built for medical devices. Strong complaint handling, CAPA, design controls, and PMS workflows. AI-powered features for document management. | Mid-stage to scaling MedTech companies. Over 1,000 medical device companies use it. | Higher price point. Focused exclusively on medical devices, which is both a strength and a limitation. |
| MasterControl | Enterprise-grade with advanced customization. Strong module integration for large companies. Extensive validation support. | Large pharma and medical device companies needing enterprise-scale deployment. | Higher cost and complexity. Significant implementation effort. |
| Qualio | Cloud-based, user-friendly, designed for growing life sciences startups. Ready-to-use templates for fast audit readiness. | Early-stage companies growing toward regulatory approval. | Some users report steep pricing increases at renewal, and the document editing experience is more limited than in larger platforms. |
| QT9 QMS | 25+ included modules, unlimited scalability. Strong across medical device, life sciences, and aerospace. Consistently top-rated on G2 and Capterra. | Companies needing a versatile, all-around QMS with strong analytics. | Less specialized for medical devices compared to Greenlight Guru. |
| SimplerQMS | Cloud-based, designed specifically for life sciences with strong ISO 13485 and MDR compliance features. | Companies wanting a straightforward, compliance-focused platform. | Smaller market presence compared to MasterControl or Greenlight Guru. |
| Dot Compliance | Cloud-based, modular approach with good complaint management and CAPA workflows. | Small to mid-size medical device companies. | Less mature ecosystem compared to larger platforms. |
| OpenRegulatory | Free QMS templates and procedures available. eQMS tooling integrated with development workflows (Git-based). Specifically designed for SaMD/digital health. | SaMD companies and startups with engineering-centric teams. | May not suit traditional hardware medical device manufacturers. |
Selection tip: Do not over-invest in tooling before you have established your processes. A disciplined process in a spreadsheet beats a poorly configured enterprise eQMS. Start with what your team will actually use, then scale as your complaint volume and regulatory footprint grow. Ensure any platform you choose supports FDA QMSR (effective February 2, 2026) and ISO 13485 requirements, including 21 CFR Part 11 compliance for electronic records.
Literature Surveillance
- PubMed/MEDLINE — Free, essential. Set up automated email alerts for your key search terms.
- Embase — Broader coverage than PubMed, particularly for device-related publications. Requires a license.
- DistillerSR / Covidence — Systematic review management tools useful for structuring literature screening.
- AI-assisted literature monitoring — Emerging tools that use NLP to screen publications against your PMS criteria. Not a replacement for human review, but useful for reducing screening burden on large-volume searches.
Adverse Event Databases by Market
Beyond MAUDE and EUDAMED, manufacturers should monitor adverse event databases in every market where their device is sold. The following is a comprehensive list of publicly accessible databases:
| Database | Jurisdiction | Content Available | Access | Notes |
|---|---|---|---|---|
| MAUDE | USA (FDA) | Adverse event reports (mandatory and voluntary) | Free — accessdata.fda.gov | Most comprehensive public database. Updated monthly. See detailed tutorial below. |
| EUDAMED | EU | Vigilance data, clinical investigations, PMS data (phased deployment) | Partially available; full vigilance module expected to become mandatory from Q4 2027 | Until fully operational, manufacturers must rely on national databases. |
| MHRA | UK | Yellow Card reports, medical device alerts, FSNs | Free — gov.uk/drug-device-alerts | Searchable database of device alerts and FSNs. Adverse event data access is more limited than MAUDE. |
| BfArM | Germany | Field safety notices, device alerts | Free — bfarm.de | Database in German; search function available for FSCAs. National competent authority form required for incident reporting. |
| ANSM | France | Field safety notices, device alert information | Free — ansm.sante.fr | Website in French. Simple search function showing FSNs and device alerts. |
| Swissmedic | Switzerland | Field safety notices | Free — swissmedic.ch | Excellent search functionality with easy-to-review PDF outputs. |
| DAEN | Australia (TGA) | Adverse event reports | Free — tga.gov.au | Database of Adverse Event Notifications. Search function can be difficult to use; results available as PDF. |
| PMDA | Japan | Adverse event reports, FSCAs | Partially free — pmda.go.jp | JMDN (Japanese Medical Device Nomenclature) database available. Much content in Japanese only. |
| IRIS | Australia (TGA) | Incident reporting | Reporting portal for manufacturers | TGA's Incident Reporting Information System for submitting reports. |
| Total Product Life Cycle (TPLC) | USA (FDA) | Integrates premarket and postmarket data | Free — accessdata.fda.gov/scripts/cdrh/cfdocs/cfTPLC | Combines MAUDE, 510(k), PMA, and recall data into a single searchable interface. Extremely useful for competitive intelligence. |
How to Search the MAUDE Database Effectively
The FDA's MAUDE (Manufacturer and User Facility Device Experience) database is the single most important publicly accessible adverse event database for medical devices. It contains mandatory reports from manufacturers and importers since August 1996, user facility reports since 1991, and voluntary reports since June 1993. The FDA receives over two million medical device reports annually.
Accessing MAUDE: Navigate to accessdata.fda.gov/scripts/cdrh/cfdocs/cfmaude/search.cfm. MAUDE offers two search modes:
Simple Search:
- Enter any keyword, exact phrase (in quotes), or multiple terms connected with "and"
- You can enter a device's UDI to search by unique device identifier
- Results are limited to 500 records per selected year
- Best for quick exploratory searches
Advanced Search:
- Filter by specific fields: device brand name, manufacturer name, product class, event type (death, injury, malfunction), date range
- Returns up to 500 results over the selected timeframe
- For searches exceeding 500 results, download the MDR Data Files for bulk analysis
OpenFDA API:
- For programmatic access and large-scale analysis, use the OpenFDA API (open.fda.gov)
- Enables custom queries, trend analysis over time, and integration with statistical tools
- Particularly useful for building automated PMS monitoring scripts in R or Python
- Key fields: `device.brand_name`, `device.generic_name`, `event_type`, `mdr_text.text` (narrative), `product_problems`, `date_received`
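As a sketch, a query URL for the openFDA device-event endpoint can be assembled like this. The endpoint, field names, and `+AND+` search syntax are openFDA's; the function name and the brand name are illustrative.

```python
from urllib.parse import urlencode

OPENFDA_DEVICE_EVENT = "https://api.fda.gov/device/event.json"

def maude_query(brand_name: str, date_from: str, date_to: str,
                limit: int = 100) -> str:
    """Build an openFDA device-event query URL for a brand name
    and a date range (dates in openFDA's YYYYMMDD format)."""
    search = (f'device.brand_name:"{brand_name}"'
              f'+AND+date_received:[{date_from}+TO+{date_to}]')
    # urlencode would escape the '+' and ':' that openFDA's search
    # syntax expects, so only the limit parameter is encoded.
    return f"{OPENFDA_DEVICE_EVENT}?search={search}&{urlencode({'limit': limit})}"

url = maude_query("ExampleStent", "20240101", "20241231")
```

Fetching the URL (with `urllib.request` or `requests`) returns JSON whose `results` array can be fed directly into a screening spreadsheet or trend script; for result sets larger than one page, page through with openFDA's `skip` parameter.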
Key MAUDE limitations to understand:
- Data quality is highly variable — many reports lack detail or contain errors
- Duplicate entries exist; FDA explicitly warns against using MAUDE to calculate event frequencies
- Voluntary reports are inherently underreported
- Coding inconsistencies mean similar events may be classified differently
- Reports older than 10 years are only available via MDR Data Files downloads, not the online search
- MAUDE was not designed for AI-enabled device failures and may not capture algorithm-specific adverse events
Best practices for PMS literature searches using MAUDE:
- Search for your own device by brand name and manufacturer name
- Search for predicate devices and equivalent devices identified in your clinical evaluation
- Search by FDA product classification code (the 3-letter product code) to capture competitor devices
- Combine MAUDE searches with the TPLC database to correlate adverse events with 510(k)/PMA data and recall history
- Document every search: date, search parameters, number of results, screening criteria, and conclusions
- Establish a search frequency proportionate to risk — monthly or quarterly for high-risk and newly launched devices; semi-annually for established, lower-risk devices
Signal Detection and Analytics
- JMP or Minitab — Statistical software suitable for control charting and trend analysis.
- R or Python — For custom signal detection algorithms, CUSUM analysis, and automated reporting. R packages specifically developed for medical device signal detection are available (see academic references in the signal detection section below).
- Business intelligence tools (Power BI, Tableau) — Useful for creating PMS dashboards that visualize complaint trends, report status, and KPIs.
- AI and machine learning tools — Emerging NLP-based tools for screening large volumes of adverse event narratives, semantic similarity search across MAUDE entries, and predictive analytics for risk assessment. Not a replacement for human review, but increasingly valuable for reducing screening burden.
Social Media and Web Monitoring
- Google Alerts — Free, basic monitoring for device name mentions.
- Brand monitoring tools (Mention, Brandwatch) — More sophisticated monitoring across social media, forums, and news.
- Patient forum monitoring — Some regulatory consultancies offer structured monitoring of patient forums relevant to specific device types.
PSUR Template Structure per MDCG 2022-21
MDCG 2022-21 (published December 2022) provides the definitive template for PSURs under EU MDR. Notified Bodies expect manufacturers to follow this structure as closely as possible. The following is the chapter structure from Annex I of the guidance, with practical notes on what each section should contain.
PSUR Chapter Structure
| Chapter | Title | Content Requirements |
|---|---|---|
| Cover Page | Administrative information | Manufacturer information, device(s) covered, Notified Body name and organization number, PSUR reference number, version number, data collection period, table of contents. For PSURs submitted to EUDAMED, the PSUR Web Form (Annex V of MDCG 2022-21) serves as the cover page. |
| Executive Summary | Overview and conclusion | Brief overview of PSUR content, overall conclusion on benefit-risk determination, summary of any significant safety signals or corrective actions, status of actions taken since last PSUR. |
| 1. Device Description | Device identification and scope | Basic UDI-DI, device name, classification, intended purpose, device group justification (if multiple devices in one PSUR), description of any device variants or configurations covered. |
| 2. Grouping Rationale | Justification for device grouping | If the PSUR covers a device group rather than a single device, provide justification based on common design, intended purpose, and risk profile. The grouping must be consistent with the clinical evaluation scope. |
| 3. Volume of Sales | Sales data by region over time | Present data year-by-year, split by EEA+TR+XI and Worldwide (worldwide should include EU data). Include number of devices placed on market, put into service, or implanted. For reusable devices, include usage frequency. MDCG 2022-21 Annex II Table 1 provides the recommended format. |
| 4. Population Characteristics | Estimated patient population | Estimated number of patients/users exposed to the device, their characteristics (age, sex, profession where relevant), geographic distribution. Method of estimation must be transparent. |
| 5. PMS Data — Vigilance | Serious incidents and FSCAs | Number and type of serious incidents reported during the data collection period, comparison to previous periods, summary of FSCA activities, CAPA initiated as a result of vigilance data. Include device-specific analysis even if the PSUR covers a device group. |
| 6. PMS Data — Non-Serious Incidents | Non-serious incidents and expected side effects | Summary of non-serious incidents, expected undesirable side effects, complaint data analysis, comparison with established baseline rates. |
| 7. PMS Data — Trend Reporting | Trend analysis | Statistical analysis of incident frequency and severity trends, comparison to baseline and previous periods, description of statistical methods used, identification of any statistically significant increases per Article 88 MDR. |
| 8. PMS Data — Literature and Databases | External data review | Summary of systematic literature review findings, adverse event database search results (MAUDE, national databases), registry data, standards updates, state-of-the-art changes. |
| 9. PMS Data — Feedback and Complaints | User and market feedback | Summary of customer feedback, distributor/importer feedback, social media monitoring results, user survey outcomes, proactively gathered feedback from conferences and client visits. |
| 10. PMCF Summary | Main findings of PMCF | Summary of PMCF activities conducted during the period, key clinical findings, cross-reference to PMCF Evaluation Report and CER. |
| 11. Benefit-Risk Determination | Updated benefit-risk conclusion | Substantive analysis (not a rote statement) of whether the benefit-risk profile remains acceptable. Must integrate all PMS data sources. Reference residual risks from the risk management file. Address any new risks identified. |
| 12. Actions Taken | Corrective and preventive actions | Description of all CAPAs initiated during the period, their rationale, current status, and effectiveness assessment. Include any updates to IFU, labeling, design, manufacturing process, or SS(C)P. |
| 13. Conclusions | Overall conclusions and next steps | Specific, actionable conclusions. Planned actions for the next reporting period. Identification of any open questions requiring further investigation. |
Important timing notes from MDCG 2022-21: The PSUR data collection period starts from (a) May 26, 2021 for legacy devices or (b) the date of MDR certification for newly certified devices. Data should be presented year-by-year. There is no legal requirement to include data before May 26, 2021, but historical data can support the evaluation. The PSUR must be generated as a standalone document that can be assessed independently of supporting documentation.
MHRA PSUR note: The UK MHRA published its own standardized PSUR format in July 2025. While similar to the MDCG 2022-21 structure, it has UK-specific requirements. Manufacturers may leverage EU PSURs but must ensure GB-specific requirements are addressed within the report. The MHRA can request PMSR/PSUR at any time, and manufacturers must respond within 3 working days.
Signal Detection Methodology: A Deeper Dive
While the Trend Reporting section above covers the basics of setting up trend monitoring, signal detection in medical device PMS is an evolving discipline that borrows heavily from pharmacovigilance while addressing unique challenges specific to devices. The following provides a more detailed treatment of the statistical and methodological approaches available.
The Signal Management Process
Signal detection in PMS follows a structured process:
- Signal detection -- Identification of a potential safety concern through quantitative or qualitative analysis of PMS data
- Signal validation -- Confirming that the detected signal represents a genuine change rather than noise, reporting artifact, or coding inconsistency
- Signal assessment -- Evaluating the clinical significance, causality, and potential impact on the benefit-risk profile
- Signal prioritization -- Ranking validated signals by urgency and potential patient impact
- Action and communication -- Determining appropriate response (investigation, CAPA, vigilance report, FSCA, risk management update)
Quantitative Signal Detection Methods
The academic literature and regulatory guidance identify several families of statistical methods applicable to medical device PMS:
Disproportionality Analysis:
- Proportional Reporting Ratio (PRR) -- Compares the proportion of a specific adverse event reported for your device against the proportion reported for all other devices. A PRR significantly greater than 1 suggests disproportionate reporting. Originally developed for pharmacovigilance (EudraVigilance), applicable to MAUDE data.
- Reporting Odds Ratio (ROR) -- Similar to PRR but uses odds ratios. More familiar to epidemiologists.
- Empirical Bayes Geometric Mean (EBGM) -- A Bayesian approach that adjusts for multiple comparisons and small sample sizes. Used by FDA's Multi-item Gamma Poisson Shrinker (MGPS) algorithm.
- Information Component (IC) -- Bayesian measure used by the WHO Uppsala Monitoring Centre. Provides a log-scale measure of disproportionality.
Limitation for devices: Disproportionality methods were developed for drug safety surveillance where products are more homogeneous. Medical devices present unique challenges: greater product diversity, iterative design changes, learning curves, operator-dependent performance, and shorter product lifecycles. Published research suggests that further development of statistical methods specific to medical device vigilance is needed, distinct from pharmacovigilance approaches.
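A minimal PRR computation from a 2x2 table of report counts might look like the sketch below; the counts are hypothetical. A commonly cited screening heuristic flags a PRR of at least 2 with at least 3 reports of the event, but any statistical flag still needs clinical assessment before it becomes a validated signal.

```python
import math

def prr(a: int, b: int, c: int, d: int):
    """Proportional Reporting Ratio from a 2x2 report-count table.

    a: target event, your device     b: other events, your device
    c: target event, other devices   d: other events, other devices
    Returns (PRR, 95% CI lower, 95% CI upper).
    """
    ratio = (a / (a + b)) / (c / (c + d))
    # 95% CI computed on the log scale (standard approximation)
    se = math.sqrt(1/a - 1/(a + b) + 1/c - 1/(c + d))
    lo = math.exp(math.log(ratio) - 1.96 * se)
    hi = math.exp(math.log(ratio) + 1.96 * se)
    return ratio, lo, hi

# Hypothetical counts: 12 fracture reports out of 200 total for our
# device, vs. 150 out of 10,000 for all other devices in the product code.
ratio, lo, hi = prr(12, 188, 150, 9_850)   # PRR = 4.0
```

A lower confidence bound above 1 strengthens the case that the disproportionality is not a chance finding, subject to all the device-specific caveats noted above.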
Statistical Process Control (SPC):
- Control charts (c-charts, u-charts, p-charts) for monitoring complaint or event rates against established control limits
- Nelson Rules -- A set of rules for detecting non-random patterns on control charts. ISO/TR 20416:2020 and several commentators specifically recommend Rule 3 (six or more consecutive points in a continuous upward trend) for PMS trend detection.
- Western Electric rules -- An alternative set of control chart interpretation rules
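As a sketch, the Nelson Rule 3 check (a run of six or more consecutively increasing, or decreasing, points) can be applied mechanically to a rate series. The monthly rates below are hypothetical.

```python
def nelson_rule_3(points, run_length: int = 6) -> bool:
    """True if the series contains run_length or more consecutive
    points forming a steadily increasing or decreasing run."""
    for direction in (1, -1):   # check upward runs, then downward runs
        run = 1
        for prev, cur in zip(points, points[1:]):
            run = run + 1 if direction * (cur - prev) > 0 else 1
            if run >= run_length:
                return True
    return False

# Monthly complaint rates per 1,000 units (hypothetical)
rates = [1.1, 1.0, 1.2, 1.3, 1.5, 1.6, 1.8, 2.0]
fired = nelson_rule_3(rates)   # six consecutive increases from 1.0 onward
```

For complaint monitoring the upward runs are the ones that matter, but checking both directions costs nothing and can surface reporting drop-offs worth investigating.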
Sequential and Time-Series Methods:
- Cumulative Sum (CUSUM) analysis -- Detects small, sustained shifts in event rates. Particularly effective for detecting gradual increases that might not trigger a control chart alarm.
- Change Point Analysis (CPA) -- Determines the specific time point at which statistical properties of a sequence of observations change. Demonstrated in published research using neurostimulator adverse event data from MAUDE. Both change-in-mean and change-in-variance approaches are applicable.
- Sequential Probability Ratio Test (SPRT) -- The most commonly cited method family in the device signal detection literature (70% of published studies). Allows for ongoing hypothesis testing as data accumulates.
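A minimal one-sided tabular CUSUM can be sketched as follows, with hypothetical monthly rates. The allowance k is commonly set to half the shift you want to detect, and the decision threshold h is tuned to your acceptable false-alarm rate; both must be justified in your trend methodology.

```python
def cusum_upper(xs, target: float, k: float, h: float):
    """One-sided (upper) tabular CUSUM.

    target: expected rate; k: allowance (slack); h: decision threshold.
    Returns the index of the first alarm, or None if no alarm.
    """
    s = 0.0
    for i, x in enumerate(xs):
        s = max(0.0, s + (x - target) - k)   # accumulate only excesses
        if s > h:
            return i
    return None

# Hypothetical monthly complaint rates per 1,000 units: baseline ~2.0,
# with a small sustained shift to ~2.6 starting in month 7.
rates = [2.1, 1.9, 2.0, 2.2, 1.8, 2.0, 2.6, 2.7, 2.5, 2.8, 2.6, 2.7]
alarm = cusum_upper(rates, target=2.0, k=0.25, h=1.0)   # alarms at index 8
```

Note that the individual shifted months (2.5 to 2.8) would not breach a typical control chart limit on their own; the CUSUM alarms because the excess accumulates, which is exactly the small-sustained-shift scenario this method targets.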
Qualitative Methods:
- For devices early in their commercial life or with low event volumes, individual case review may be more appropriate than quantitative methods
- Clinical significance assessment of each reported event
- Causality assessment using structured frameworks (e.g., WHO-UMC system adapted for devices)
Choosing the Right Method
| Scenario | Recommended Approach | Rationale |
|---|---|---|
| Newly launched device, low sales volume | Individual case review + qualitative assessment | Insufficient data for statistical methods; each event is clinically significant |
| Established device, moderate volume | Control charts + CUSUM | Balance of sensitivity and practical implementation |
| Large device portfolio, high complaint volume | Automated disproportionality analysis + SPC | Scale requires automated screening; human review of flagged signals |
| Rare events, high severity | Sequential probability testing (SPRT) | Designed for ongoing testing with accumulating data on rare events |
| Detecting gradual shifts in performance | CUSUM or Change Point Analysis | Specifically designed to detect sustained, small shifts |
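As an illustration of the change-in-mean approach from the table above, a single least-squares change point can be found by exhaustive search over split points. The data is hypothetical, and production analyses typically use dedicated packages (for example R's changepoint package) rather than hand-rolled code.

```python
def change_point(xs):
    """Single change-in-mean detection: return the split index that
    minimizes the total within-segment sum of squared deviations."""
    def sse(seg):
        m = sum(seg) / len(seg)
        return sum((x - m) ** 2 for x in seg)

    best_i, best_cost = None, float("inf")
    for i in range(1, len(xs)):          # candidate boundary before index i
        cost = sse(xs[:i]) + sse(xs[i:])
        if cost < best_cost:
            best_i, best_cost = i, cost
    return best_i

# Hypothetical monthly event rates: stable around 2.0, then around 3.0
rates = [2.0, 2.1, 1.9, 2.0, 2.1, 3.0, 3.1, 2.9, 3.0, 3.2]
idx = change_point(rates)   # 5: the shift begins at index 5
```

Locating *when* the shift occurred is often as valuable as detecting it, because the change point can be correlated with lot numbers, design changes, or supplier switches during the investigation.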
The PMS-Clinical Evaluation Lifecycle Feedback Loop
One of the most significant conceptual shifts under EU MDR is the explicit requirement that PMS, clinical evaluation, and risk management operate as a continuous feedback loop throughout the device's commercial life. This is not merely a good practice recommendation -- it is a regulatory requirement embedded in the MDR's lifecycle approach.
How PMS Data Feeds Into Clinical Evaluation
The Clinical Evaluation Report (CER) must be updated with post-market clinical data. PMS is the mechanism that provides this data. The relationship is bidirectional:
PMS to CER (forward feed):
- PMCF study results update the clinical evidence base
- New literature identified through PMS literature surveillance is incorporated into the CER's systematic review
- Registry data provides long-term outcomes data not available from pre-market studies
- Complaint data may reveal clinical performance issues (e.g., higher-than-expected complication rates)
- Adverse event data may change the understanding of device risks
CER to PMS (backward feed):
- Gaps identified in the clinical evaluation drive PMCF study design
- Uncertainties flagged in the CER define what the PMS plan needs to specifically monitor
- Changes in the state of the art identified during clinical evaluation may require new PMS data sources
- Clinical claims made in the CER must be continuously validated through PMS data
How PMS Data Feeds Into Risk Management
Per ISO 14971:2019 Clause 10, manufacturers must collect post-production information and evaluate whether it impacts the risk management file:
- New hazards identified through PMS must be added to the risk analysis
- Changed probability or severity estimates must trigger re-evaluation of risk acceptability
- If a residual risk is no longer acceptable, the impact on existing risk control measures must be evaluated
- The need for actions regarding devices already on the market must be considered
- All decisions and actions must be recorded in the risk management file
How PMS Feeds Into Other QMS Documents
| PMS Output | Updates To | Trigger |
|---|---|---|
| New safety signal | Risk Management File, CER, IFU, PSUR | Any new hazard or changed risk profile |
| PMCF study results | CER, PMCF Evaluation Report, PSUR, SS(C)P | Completion of study endpoints or interim analysis |
| Literature review findings | CER, State of the Art section of PSUR | New relevant publications identified |
| Trend analysis showing increased event rate | Risk Management File, PSUR, potentially FSCA | Statistically significant increase above baseline |
| FSCA implementation | PSUR, Risk Management File, IFU | Corrective action taken on marketed device |
| Updated benefit-risk conclusion | CER, PSUR, SS(C)P | Any change to benefit-risk determination |
MDCG 2025-10 makes this explicit: The guidance states that the PMS system must describe how PMS data is used to update the SS(C)P (Summary of Safety and Clinical Performance) when applicable, and that such updates should be aligned with the information presented in the CER and PSUR. This creates a three-way alignment requirement: CER, PSUR, and SS(C)P must tell a consistent story.
Real-World FSCA Case Studies
Understanding Field Safety Corrective Actions in practice helps illustrate the complexity and urgency of these situations. The following examples are drawn from publicly available recall and FSCA databases.
Case Study 1: Philips Respironics CPAP Recall (2021)
The issue: Philips issued a recall of specific CPAP, BiLevel PAP, and mechanical ventilator devices due to potential health risks from the polyester-based polyurethane (PE-PUR) sound abatement foam used inside the devices. The foam could degrade into particles and off-gas volatile organic compounds (VOCs), potentially being inhaled or ingested by users.
Scale: Approximately 5.5 million devices globally.
Complexity factors:
- Devices were used in patients' homes, not healthcare facilities -- making direct communication and retrieval extremely difficult
- Many patients depended on these devices for life-sustaining respiratory support -- simply telling them to stop using the device posed its own safety risk
- The recall spanned dozens of countries, requiring Field Safety Notice (FSN) translations for each market
- Patient registries were incomplete, requiring healthcare system coordination to identify affected users
PMS lessons:
- Distribution records must be meticulously maintained, including end-user identification for home-use devices
- FSN templates should be prepared in advance for major markets
- Risk communication must account for the risk of removing a device versus the risk of continued use
- Healthcare systems had to develop their own patient notification processes (e.g., Mayo Clinic developed a structured empathetic communication approach with multiple patient letters)
Case Study 2: Abbott FreeStyle Libre 3 Sensor Recall (2025-2026)
The issue: Abbott Diabetes Care removed certain FreeStyle Libre 3 and FreeStyle Libre 3 Plus continuous glucose monitoring (CGM) sensors from the market due to a manufacturing issue affecting sensor accuracy.
PMS lessons:
- Continuous glucose monitors are used by patients to make real-time treatment decisions -- inaccurate readings can lead to incorrect insulin dosing
- For IVD-like devices, PMS must specifically track diagnostic accuracy metrics (sensitivity, specificity) against established performance specifications
- Rapid detection and response are critical when the device directly influences treatment decisions
- Post-recall surveillance must verify that replacement devices perform as expected
Case Study 3: Software-Related FSCAs
Software corrections are among the fastest-growing categories of FSCAs. Common examples include:
- Infusion pump software bugs causing incorrect dose calculations
- Imaging system software errors leading to incorrect patient identification or image processing artifacts
- Pacemaker/ICD communication software vulnerabilities requiring security patches
- AI/ML algorithm drift where diagnostic accuracy degrades as real-world data distributions diverge from training data
PMS lesson for SaMD: Post-market monitoring of software-based devices must include performance metrics specific to the algorithm (e.g., sensitivity, specificity, false positive/negative rates) and mechanisms to detect performance degradation over time. Traditional complaint-based PMS may not capture algorithmic failures, as users may not recognize when an AI system is performing suboptimally.
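To make that last point concrete, a windowed check on labeled post-market cases can surface degradation that complaint channels miss. This is a minimal sketch: the window size of 100 cases and the 0.90 sensitivity threshold are illustrative assumptions, not regulatory values.

```python
# Sketch: rolling performance monitoring for a SaMD classifier.
# Window size and threshold are illustrative assumptions, not
# regulatory requirements.

def rolling_sensitivity(results, window=100, threshold=0.90):
    """results: list of (predicted_positive, truly_positive) booleans.
    Returns (window_start, sensitivity) for each window where
    sensitivity fell below the threshold."""
    alerts = []
    for start in range(0, len(results) - window + 1, window):
        chunk = results[start:start + window]
        tp = sum(1 for pred, truth in chunk if pred and truth)
        fn = sum(1 for pred, truth in chunk if not pred and truth)
        if tp + fn == 0:
            continue  # no confirmed positive cases in this window
        sens = tp / (tp + fn)
        if sens < threshold:
            alerts.append((start, sens))
    return alerts
```

A real implementation would also need a defined source of ground-truth labels (e.g., clinical adjudication or registry linkage), which is precisely what traditional complaint-based PMS does not provide.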
Common PMS Audit Findings and How to Avoid Them
Notified Body audits of PMS systems have become significantly more rigorous under EU MDR. Understanding the most common findings allows manufacturers to proactively address gaps before they become nonconformities.
Finding 1: Generic or Incomplete PMS Plans
What auditors flag: PMS plans that are essentially a copy of the QMS procedure, lacking device-specific data sources, analysis methods, and acceptance criteria. Plans that say "we review complaints and literature" without specifying which databases, search terms, frequency, or analytical thresholds.
How to avoid it: Align your PMS plan explicitly with MDR Annex III requirements. Include well-defined roles, specific data collection methods with named databases and search strategies, measurable indicators with defined thresholds, and documented review cycles. Every PMS plan must be specific to the device or device group.
Finding 2: Inadequate PMCF Justification
What auditors flag: Manufacturers who state "no PMCF study is needed" without providing a substantive rationale grounded in the clinical evaluation. Or PMCF plans that are disconnected from the gaps and uncertainties identified in the CER.
How to avoid it: The PMCF plan (or the rationale for not having active PMCF studies) must flow directly from the clinical evaluation. If the CER identifies residual clinical questions, the PMCF plan must address them. If no PMCF study is needed, explain specifically why existing post-market data (literature, complaints, registry data) is sufficient to update the clinical evaluation. Document this linkage explicitly.
Finding 3: No Defined Trend Analysis Methodology
What auditors flag: PMS systems that collect data but have no defined statistical or analytical methodology for detecting trends. Reports that state "no trends were identified" without explaining what methods were used to look for them.
How to avoid it: Define specific statistical methods in the PMS plan (control charts, CUSUM, etc.), establish baseline rates, define trigger thresholds, and document the analysis in every PSUR/PMS report -- including when no trend is detected (show your work).
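As an illustration of what "defined statistical methods" can mean in practice, a one-sided CUSUM over periodic complaint rates fits in a few lines. The baseline, slack value `k`, and decision threshold `h` below are illustrative assumptions that a real PMS plan would need to justify statistically.

```python
# Sketch of a one-sided (upper) CUSUM for periodic complaint rates.
# k (slack) and h (decision threshold) are illustrative assumptions.

def cusum_upper(rates, baseline, k=0.5, h=4.0):
    """Return the index of the first period at which the cumulative
    positive deviation from (baseline + k) exceeds h; None if never."""
    s = 0.0
    for i, r in enumerate(rates):
        s = max(0.0, s + (r - baseline - k))
        if s > h:
            return i
    return None
```

Documenting a run of this kind in every PSUR, including the case where it returns no signal, is exactly the "show your work" evidence auditors look for.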
Finding 4: Disconnect Between PMS and Risk Management
What auditors flag: PMS data that is collected and reported but never feeds back into the risk management file. Updated PSURs that identify new hazards or changing risk profiles, but the risk management file remains unchanged.
How to avoid it: Establish explicit procedural links between PMS outputs and risk management inputs. When a PSUR identifies a new hazard, a risk management file update should be triggered automatically. Document the decision trail: "PSUR identified X, risk management review concluded Y, action Z was taken (or no action was needed because...)."
Finding 5: Missing or Inadequate State-of-the-Art Review
What auditors flag: PSURs that do not address changes in the state of the art -- new clinical evidence, updated standards, new competitor devices, changes in clinical practice -- that could affect the device's benefit-risk profile.
How to avoid it: Include a dedicated "State of the Art" section in every PSUR. Monitor relevant standards updates (ISO, IEC, ASTM), clinical guidelines, published HTAs, and competitor landscape changes. Document your analysis even if the conclusion is "no change."
Finding 6: Poor Record-Keeping and Traceability
What auditors flag: Inconsistent documentation, complaint records that cannot be traced through the system, missing investigation records, or PMS data stored across multiple disconnected systems without clear audit trails.
How to avoid it: Implement centralized PMS record management (ideally within your eQMS). Ensure every complaint has a traceable path from intake through assessment, investigation (if applicable), reportability decision, trend analysis, and closure. All records must be easily retrievable during an audit.
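One way to enforce that traceable path is to make the record structure itself refuse to close without a reportability decision. A minimal sketch, with illustrative field names rather than a prescribed schema:

```python
from dataclasses import dataclass, field
from datetime import date
from typing import Optional

# Sketch of a traceable complaint record; field names are
# illustrative assumptions, not a prescribed schema.

@dataclass
class ComplaintRecord:
    complaint_id: str
    received: date
    description: str
    awareness_date: date                  # starts the vigilance clock
    reportable: Optional[bool] = None     # None until assessed
    investigation_notes: list = field(default_factory=list)
    closed: bool = False

    def close(self):
        # A complaint cannot be closed before the reportability
        # assessment is documented.
        if self.reportable is None:
            raise ValueError("assess reportability before closure")
        self.closed = True
```

The same gate can be applied at each stage (intake, assessment, investigation, trend analysis, closure) so the audit trail is complete by construction rather than by discipline alone.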
Finding 7: PSUR Structure Not Aligned with MDCG 2022-21
What auditors flag: PSURs that do not follow the template structure in MDCG 2022-21 Annex I, miss required elements (e.g., no volume of sales data, no population estimate, no benefit-risk conclusion), or present data in a way that is difficult to assess.
How to avoid it: Use the MDCG 2022-21 Annex I template as your starting framework. Include every section, even if only to state that a particular element is not applicable (with justification). Present data in tabular format per Annex II. The PSUR must be a standalone document, assessable independently of the supporting documentation.
PMS Audit Readiness Checklist
Use this checklist to assess your PMS system before a Notified Body audit:
- Device-specific PMS plan in place, aligned with MDR Annex III
- PMS plan includes specific data sources, collection methods, frequencies, and responsible persons
- Statistical methods for trend detection are defined with trigger thresholds
- PMCF plan (or documented rationale for no PMCF study) is linked to CER gaps
- Complaint coding uses standardized vocabulary (FDA Product Problem Codes, MedDRA terms)
- Baseline complaint rates are established and documented
- PSUR follows MDCG 2022-21 template with all required sections
- Benefit-risk determination is substantive (not a rote statement)
- PMS findings feed back into risk management file with documented decision trail
- CER is updated with post-market clinical data at required intervals
- SS(C)P is updated when PMS data changes the safety/performance profile
- Vigilance reporting timelines are met with documentation of awareness dates
- Distribution records enable identification of affected devices/customers for FSCA
- Literature search strategy is documented (databases, search terms, screening criteria)
- All PMS records are centrally stored and retrievable
Setting Up PMS from Scratch: A Guide for Startups
If you are a startup preparing to launch your first device, here is a practical, phased approach to building a PMS system that meets regulatory requirements without overwhelming your team.
Phase 1: Before Market Launch
Build the foundation. You need four things in place before your device reaches a single patient:
Complaint handling procedure. Define how complaints are received, documented, evaluated, investigated, and closed. Include reportability assessment criteria. This is non-negotiable — it must be in place before commercial launch.
Vigilance reporting procedure. Define the decision tree for determining whether a complaint is a reportable event. Include timelines, responsible persons, and reporting forms for each market. Have draft MedWatch forms (FDA) and vigilance report forms (EU competent authorities) ready.
PMS plan. Write the device-specific PMS plan. At this stage, it will be based largely on pre-market data (clinical trial results, bench testing, risk analysis, predicate device data). That is fine — the plan will be updated as post-market data accumulates.
Distribution records system. You cannot execute a recall or FSCA if you do not know who has your devices. Establish traceability from day one.
Phase 2: First 12 Months Post-Launch
Collect, learn, and calibrate.
- Process every complaint meticulously. In the early months, volume will be low enough that you can give each complaint thorough attention.
- Establish baseline complaint rates. After 6–12 months, you should have enough data to set meaningful baselines.
- Conduct your first systematic literature review. Document it thoroughly — this will form the basis for ongoing literature surveillance.
- Review MAUDE data for your device type and predicate devices. Identify relevant competitor events.
- Begin PMCF activities if applicable. For Class III devices under EU MDR, this should already be underway from the clinical evaluation plan.
- Write your first PSUR or PMS report if the reporting cycle falls within this period.
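Establishing a baseline can be as simple as a rate per 10,000 units distributed, plus a trigger threshold for later periods. The sketch below uses a 3-sigma normal approximation to the Poisson, which is an illustrative assumption; very small complaint counts may warrant an exact Poisson limit instead.

```python
import math

# Sketch: baseline complaint rate and a simple 3-sigma trigger
# threshold for later trend monitoring. The normal approximation
# to the Poisson is an illustrative assumption.

def baseline_rate(complaints, units_distributed, per=10_000):
    """Observed complaints per `per` units distributed."""
    return complaints * per / units_distributed

def trigger_threshold(complaints, units_distributed, per=10_000, sigmas=3):
    """Upper rate (same units as baseline_rate) above which a future
    period's complaint rate would trigger investigation."""
    rate = baseline_rate(complaints, units_distributed, per)
    # Poisson std dev of the count is sqrt(count); rescale to the rate.
    sd = math.sqrt(complaints) * per / units_distributed
    return rate + sigmas * sd
```

Recording the baseline, the threshold, and the rationale for both in the PMS plan turns an otherwise subjective "no trend identified" statement into a documented, repeatable analysis.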
Phase 3: Ongoing Operations
Systematize and scale.
- Implement trend monitoring with the baseline rates established in Phase 2.
- Automate what you can: literature alert subscriptions, MAUDE search scripts, complaint tracking dashboards.
- Update the PMS plan based on the first year of commercial experience. Are the data sources you identified actually yielding useful information? Do you need to add new sources? Adjust collection frequencies?
- Feed PMS findings back into risk management. Update the risk management file with any new hazards, changed probabilities, or revised severity assessments.
- Review and update the clinical evaluation report with post-market clinical data.
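On the automation point above: MAUDE data is queryable through the public openFDA device/event API, so a periodic search can be scripted. The sketch below only constructs the query URL; the device name and date range are placeholder assumptions.

```python
from urllib.parse import urlencode

# Sketch: building an openFDA MAUDE query URL for periodic review.
# The endpoint and field names come from the public openFDA
# device/event API; the search values are placeholders.

OPENFDA_DEVICE_EVENT = "https://api.fda.gov/device/event.json"

def maude_query_url(generic_name, date_from, date_to, limit=100):
    """URL fetching adverse event reports for a device type and window.
    Multi-word names would need percent-encoding before insertion."""
    search = (
        f'device.generic_name:"{generic_name}"'
        f"+AND+date_received:[{date_from}+TO+{date_to}]"
    )
    # urlencode would escape the + operators openFDA expects, so the
    # search expression is appended manually.
    params = urlencode({"limit": limit})
    return f"{OPENFDA_DEVICE_EVENT}?search={search}&{params}"
```

Running a script like this on a fixed schedule, and archiving both the query and the results, also satisfies the documented-search-strategy expectation from the audit checklist above.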
Startup-Specific Tips
- Do not build a Rolls-Royce system when a Honda will do. A spreadsheet-based complaint tracking system with a disciplined process is better than an expensive eQMS that nobody uses properly. Scale your tools to your volume.
- Assign clear ownership. In a startup, PMS often falls between RA, Quality, and Clinical. Assign one person as the PMS process owner, even if they wear other hats.
- Budget for PMS from the start. Literature database subscriptions, eQMS software, regulatory intelligence services, and personnel time for PMS activities are ongoing costs. Include them in your post-market budget.
- Join industry associations. Groups like RAPS, AAMI, and MedTech Europe provide access to guidance documents, training, and peer networks that can accelerate your learning curve.
- Document everything. Even if you only receive two complaints in your first year, document the analysis, the trend evaluation (even if there is no statistically meaningful trend yet), and the conclusions. Regulators want to see that the system is active, not that it generated a lot of data.
Key Takeaways
Post-market surveillance is not a compliance exercise you bolt on after launch. It is an integrated, continuous system that determines whether your device remains safe and effective throughout its commercial life.
The regulatory environment — particularly under EU MDR and IVDR — now demands a level of PMS sophistication that goes far beyond complaint collection. Manufacturers must proactively collect data from multiple sources, apply rigorous analytical methods, generate periodic safety reports, and feed findings back into risk management and clinical evaluation.
The manufacturers who do this well gain a genuine competitive advantage: faster identification of emerging issues, more defensible regulatory submissions, stronger clinical evidence for marketing claims, and — most importantly — better outcomes for patients.
Start with the PMS plan. Make it device-specific, make it actionable, and make it the central hub for all post-market activities. Everything else follows from there.