MedDeviceGuide

EU AI Act Omnibus Amendment 2026: What the May 7 Deal Means for Medical Device Manufacturers

On May 7, 2026, the EU Council and Parliament reached a provisional deal to simplify AI Act rules. This guide breaks down what changed, what the product safety exemption means for medical devices, the new compliance timelines for AI-enabled SaMD and IVDs, and what manufacturers must do now.

Ran Chen
Global MedTech Expert | 10× MedTech Global Access
2026-05-07 · 10 min read

Breaking: EU Reaches Deal on AI Act Simplification

On May 7, 2026, EU member states and the European Parliament reached a provisional agreement to simplify and streamline the EU Artificial Intelligence Act as part of a broader Digital Omnibus package. The deal, announced by the Council of the European Union, modifies key compliance deadlines and — most critically for the medical device industry — addresses the contentious question of whether AI embedded in CE-marked regulated products should be governed by the AI Act or by existing product safety legislation.

According to Arba Kokalari, the European Parliament's rapporteur for the Internal Market committee, the agreement does not weaken safety standards: "We are not weakening any safety rules; we are clarifying the rules for companies in Europe."

For medical device manufacturers, this deal has immediate implications for compliance planning, technical documentation, and regulatory strategy. Here is what you need to know.

Background: What the Digital Omnibus on AI Proposed

The Digital Omnibus on AI was introduced by the European Commission in November 2025 as a push to simplify compliance with the EU AI Act and boost European competitiveness in the AI sector. The original AI Act (Regulation (EU) 2024/1689), which entered into force on August 1, 2024, established the world's first comprehensive horizontal legal framework for AI systems, built on a tiered risk approach.

The problem? Many of the harmonized standards, guidelines, and common specifications needed to operationalize the AI Act's high-risk obligations were still under development. The first harmonized standard relevant to the Act — prEN 18286, covering quality management systems — entered public enquiry on October 30, 2025, eight months behind the April 2025 target. This created a compliance gap: manufacturers were expected to meet requirements without the technical standards to guide them.

The Omnibus proposed to address this by linking compliance deadlines to the availability of support tools rather than fixed dates.

What Changed: Key Amendments Affecting Medical Devices

1. Extended Compliance Timelines

The most immediately impactful change is to the compliance schedule. Under the original AI Act:

  • August 2, 2026 — High-risk AI obligations for Annex III systems (standalone use cases) were to enter into application
  • August 2, 2027 — High-risk obligations for AI systems that are medical devices or safety components of medical devices (Article 6(1), Annex I) were to apply

Under the Omnibus deal, the timeline is now conditional:

| System Type | Original Deadline | New Deadline Under Omnibus |
| --- | --- | --- |
| Annex III standalone high-risk AI (hospital workflow tools, patient triage algorithms not CE-marked as medical devices) | August 2, 2026 | December 2, 2027 (or 6 months after the Commission confirms support measures are available, whichever is earlier) |
| AI that is a medical device or integrated into a medical device (SaMD, AI-enabled IVDs, AI imaging systems) | August 2, 2027 | August 2, 2028 (or 12 months after the Commission confirms support measures are available, whichever is earlier) |

This is a significant extension. Manufacturers of AI-enabled medical devices now have until at least August 2028 to meet the AI Act's high-risk obligations, instead of August 2027. However, the "long-stop" dates are firm — even if support measures are never confirmed, compliance is required by these deadlines.

2. The Product Safety Exemption Debate

The most contested issue in the trilogue negotiations was whether AI embedded in CE-marked regulated products — including medical devices — should be governed primarily by the AI Act's high-risk obligations or by existing product safety legislation (MDR, IVDR).

Germany and allied member states pushed for a product safety exemption: a carve-out that would allow AI in CE-marked medical devices to be governed primarily by MDR/IVDR requirements rather than the AI Act's Annex III high-risk obligations. Their argument: medical device manufacturers already undergo rigorous conformity assessment by Notified Bodies under MDR/IVDR, and duplicative oversight creates unnecessary burden.

The European Parliament resisted broad exemptions, arguing that the AI Act provides AI-specific requirements (data governance, transparency, human oversight) that go beyond what MDR/IVDR currently covers. Removing medical devices from AI Act scope would create a gap in AI-specific safeguards.

The compromise: The final deal maintains that AI-based medical devices are automatically classified as high-risk under the AI Act, but introduces practical simplifications:

  • A single conformity assessment procedure can be used across both the AI Act and MDR/IVDR, avoiding duplicate assessments
  • Notified Bodies conducting MDR/IVDR assessments can simultaneously evaluate AI Act compliance
  • Technical documentation can be integrated rather than duplicated

3. Legacy Device Provisions

The deal clarifies the treatment of legacy AI systems. Where at least one unit of an AI-enabled medical device has been lawfully placed on the EU market before the relevant cut-off date, additional units of the same type and model may continue to be placed on the market without new conformity assessment, provided the design remains unchanged. Any significant design modification triggers full AI Act compliance requirements.

This is important for manufacturers with existing AI-enabled devices on the EU market — it provides continuity for unchanged products while requiring full compliance for updated versions.

4. Simplified Documentation Requirements

The Omnibus streamlines post-market monitoring documentation for high-risk AI systems. Rather than creating entirely separate AI Act monitoring reports, manufacturers can integrate AI-specific monitoring into their existing MDR/IVDR post-market surveillance activities (PMS plans, PSURs, PMCF evaluations).

5. Transparency and Registration

The European Data Protection Board (EDPB) and European Data Protection Supervisor (EDPS) had raised concerns that the Omnibus provisions allowing providers to self-assess their systems as "not high-risk" could reduce accountability. The deal retains a registration obligation: even if a provider determines its AI system is not high-risk, it must document the assessment and provide it to national competent authorities upon request.


What This Means for Different Device Categories

AI-Enabled SaMD (Software as a Medical Device)

If your SaMD uses machine learning, deep learning, or other AI techniques and is classified as Class IIa, IIb, or III under EU MDR, it is automatically high-risk under the AI Act. The key compliance dates:

  • MDR/IVDR conformity assessment: continues as normal through your Notified Body
  • AI Act high-risk obligations: now due by August 2, 2028 at the latest
  • Integrated assessment: plan to combine AI Act technical documentation with your MDR technical file

New requirements you will need to address beyond MDR:

  • AI-specific risk management (complementing ISO 14971)
  • Data governance documentation for training, validation, and testing datasets
  • Transparency obligations (clear information to users about AI capabilities and limitations)
  • Human oversight mechanisms
  • Accuracy, robustness, and cybersecurity requirements specific to AI
  • Post-market monitoring for AI performance drift

AI-Enabled IVDs

Class C and D IVDs under IVDR that incorporate AI are similarly high-risk. The same extended timeline applies. Manufacturers should integrate AI Act requirements into their IVDR performance evaluation and post-market surveillance plans.

Hospital Workflow AI Tools

AI systems used in hospitals for patient triage, scheduling, or resource allocation that are not CE-marked as medical devices fall under Annex III as standalone high-risk systems. Their compliance deadline is December 2, 2027. These systems face the AI Act's full requirements without the benefit of MDR/IVDR conformity assessment integration.

Class I Medical Devices with AI

Class I MDR devices that do not require Notified Body conformity assessment are generally outside the AI Act's high-risk scope — unless they perform a safety function within a higher-class device. Manufacturers should assess carefully whether their Class I AI device triggers high-risk classification.

Practical Compliance Strategy: What to Do Now

Step 1: Map Your AI Portfolio

Identify every device in your portfolio that uses AI, machine learning, or adaptive algorithms. For each:

  • Is it a medical device under MDR or an IVD under IVDR?
  • What is its risk classification?
  • Does it use continuous learning or locked algorithms?
  • Has it already been placed on the EU market?
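The four questions above map naturally onto one inventory record per device. A minimal sketch (the device names and field choices are hypothetical examples, not a prescribed schema):

```python
from dataclasses import dataclass

@dataclass
class AIDeviceRecord:
    # One row of the AI portfolio inventory from the checklist above.
    name: str
    regulated_under: str  # "MDR", "IVDR", or "none"
    risk_class: str       # e.g. "IIa", "IIb", "III", "C", "D", "I"
    learning_mode: str    # "locked" or "continuous"
    on_eu_market: bool    # already lawfully placed on the EU market?

portfolio = [
    AIDeviceRecord("CardioAssist triage module", "MDR", "IIb", "locked", True),
    AIDeviceRecord("PathoScreen IVD classifier", "IVDR", "C", "continuous", False),
]

# Continuous learners deserve early attention: adaptive algorithms need
# AI-specific change control and are most exposed to drift monitoring duties.
flagged = [d.name for d in portfolio if d.learning_mode == "continuous"]
print(flagged)  # ['PathoScreen IVD classifier']
```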

Step 2: Assess Dual Compliance Requirements

For each AI-enabled device, map the overlap between:

| MDR/IVDR Requirement | AI Act Requirement | Integration Opportunity |
| --- | --- | --- |
| Risk management (ISO 14971) | AI risk management (Art. 9) | Expand risk file to include AI-specific hazards |
| Clinical evaluation | Data governance (Art. 10) | Document training data provenance in clinical evaluation |
| Technical documentation (Annex II/III MDR) | Technical documentation (Art. 11) | Integrate AI documentation into technical file |
| PMS plan | Post-market monitoring (Art. 72) | Add AI performance monitoring to PMS plan |
| PSUR | AI-specific reporting | Include AI drift data in PSUR cycle |
| IFU / labeling | Transparency (Art. 13) | Add AI transparency information to IFU |
| Design controls (IEC 62304) | Quality management (Art. 17) | Extend SDLC to include AI governance |
| UDI / EUDAMED | EU database registration | Register AI system in AI Act database |

Step 3: Prepare Your Notified Body

Notified Bodies will need to assess AI Act compliance alongside MDR/IVDR conformity. Engage your Notified Body early to understand their readiness and whether they will conduct integrated assessments or require separate submissions.

Step 4: Build AI Governance into Your QMS

Update your quality management system to include:

  • An AI policy defining how AI systems are developed, validated, and monitored
  • Data governance procedures for training, validation, and testing datasets
  • Bias detection and mitigation processes
  • Human oversight mechanisms for AI-assisted clinical decisions
  • AI-specific change control procedures (especially for adaptive algorithms)
  • Post-market monitoring for AI performance degradation or drift

Step 5: Monitor the Final Text

The May 7 agreement is a provisional deal between the Council and Parliament. It still needs formal adoption by both institutions. While the broad outlines are unlikely to change, the final legal text may include modifications. Monitor:

  • The European Commission's confirmation of when support measures (harmonized standards, common specifications, guidance) are available
  • Final publication in the Official Journal of the European Union
  • National transposition guidance from your target EU member states

What the Critics Say

The deal is not universally welcomed. Team-NB (the European association of Notified Bodies for medical devices) raised concerns that the MDR/IVDR revisions and AI Act simplification could compromise patient safety. Patient advocacy groups, including the Greek Patients' Association, have warned that regulatory delays — even when motivated by simplification — can push products off the EU market and force patients to wait years for treatments available elsewhere.

On the other side, industry groups including MedTech Europe have argued that the original AI Act timelines were unrealistic without supporting standards and that the extension is necessary for meaningful compliance.

The truth is somewhere in between: the extension provides breathing room, but manufacturers should not use it as an excuse to delay. Companies that build AI governance into their QMS now will be ahead when the 2028 deadline arrives.


Timeline Summary

| Date | Milestone |
| --- | --- |
| August 1, 2024 | AI Act entered into force |
| February 2, 2025 | Prohibited AI practices became enforceable |
| August 2, 2025 | GPAI model obligations began |
| November 19, 2025 | Digital Omnibus proposal introduced |
| May 7, 2026 | Council and Parliament reach provisional deal on Omnibus amendments |
| August 2, 2026 | Original Annex III deadline (may be superseded by Omnibus if adopted in time) |
| December 2, 2027 | New long-stop deadline for Annex III standalone high-risk AI |
| August 2, 2028 | New long-stop deadline for AI in medical devices (Annex I) |

The EU AI Act remains the strictest comprehensive AI law in the world, even after these amendments. Medical device manufacturers who treat this as a compliance extension rather than a reprieve will be best positioned when enforcement begins.