Use Case Classification Guide

Is Clinical Decision Support AI High-Risk Under EU AI Act?

Definitive classification guide for CDSS. Most clinical decision support systems are high-risk. Learn the regulatory requirements, MDR interaction, and compliance pathway.

Joe Braidwood
CEO, GLACIS
12 min read · 2,200 words

Quick Answer: Yes, Most CDSS Are High-Risk

Clinical Decision Support Systems (CDSS) are classified as high-risk under the EU AI Act in virtually all clinical scenarios. This applies to diagnostic support, treatment recommendations, drug interaction checking, and clinical alerting systems.

The classification derives from Article 6(1) and Annex I: AI systems that are medical devices under the Medical Device Regulation (MDR) are automatically high-risk. Since CDSS that influence clinical decisions qualify as medical devices, they inherit high-risk status and must comply with Articles 9-15 of the AI Act.

The Litigation Has Begun: Sharp HealthCare (2025)

In November 2025, a proposed class action was filed against Sharp HealthCare. The lawsuit alleges that the health system's ambient AI scribe recorded an estimated 100,000+ patients without proper consent and that false consent statements appeared in medical records.[1] The same category of documentation and consent failures applies to clinical decision support systems.

The question isn't whether your CDSS vendor has policies. It's whether you can prove those policies executed when the plaintiff's attorney asks for evidence during discovery.

  • Classification: High-Risk
  • MDR AI Deadline: Aug 2027
  • Core Requirements: Art. 9-15
  • Log Retention: 10+ Years

What is Clinical Decision Support?

Clinical Decision Support Systems (CDSS) are AI-powered tools that assist healthcare providers in making diagnostic, treatment, and care decisions. These systems analyze patient data, medical literature, and clinical guidelines to provide recommendations, alerts, or insights at the point of care.

Types of Clinical Decision Support

Diagnostic Support

AI systems that analyze symptoms, lab results, imaging, or patient history to suggest potential diagnoses. Examples include radiology AI, pathology analysis, and differential diagnosis engines.

Treatment Recommendations

Systems that suggest treatment protocols, medication dosing, or therapeutic interventions based on patient characteristics, clinical guidelines, and outcomes data.

Drug Interaction Checking

AI-powered alerts for contraindications, drug-drug interactions, allergy warnings, and dosing errors. Critical safety systems in electronic health records and pharmacy systems.

Risk Stratification

Predictive models that identify high-risk patients for conditions like sepsis, deterioration, readmission, or adverse events. Used for early intervention and resource allocation.
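
To make the risk-stratification category concrete, the sketch below shows how a threshold-based deterioration alert might be wired. This is a minimal Python illustration: the score function, coefficients, and threshold are invented and stand in for a clinically validated prediction model.

```python
import math
from datetime import datetime, timezone

# Toy threshold and coefficients for illustration only; a real CDSS model
# is clinically validated and regulated under the MDR.
SEPSIS_ALERT_THRESHOLD = 0.7

def sepsis_risk_score(heart_rate: float, resp_rate: float, wbc_count: float) -> float:
    """Logistic-style score standing in for a validated prediction model."""
    raw = 0.02 * heart_rate + 0.05 * resp_rate + 0.03 * wbc_count - 4.0
    return 1.0 / (1.0 + math.exp(-raw))

def evaluate_patient(patient_id: str, vitals: dict) -> dict:
    """Score one patient and decide whether to trigger an escalation alert."""
    score = sepsis_risk_score(vitals["heart_rate"], vitals["resp_rate"],
                              vitals["wbc_count"])
    return {
        "patient_id": patient_id,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "risk_score": round(score, 3),
        "alert": score >= SEPSIS_ALERT_THRESHOLD,  # drives early intervention
    }

print(evaluate_patient("pt-001", {"heart_rate": 118, "resp_rate": 26, "wbc_count": 15.2}))
```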

Classification Pathway Analysis

The EU AI Act classifies AI systems as high-risk through two pathways: products covered by EU harmonization legislation listed in Annex I (Article 6(1)), and the standalone use cases listed in Annex III (Article 6(2)). For CDSS, the primary classification derives from Article 6(1) combined with Annex I, which covers medical devices.

Primary Classification Pathway: Medical Devices (Annex I)

Under Article 6(1), an AI system is high-risk if both of the following conditions are met:

  • The AI system is intended to be used as a safety component of a product, or is itself a product, covered by the EU harmonization legislation listed in Annex I; and
  • That product is required to undergo a third-party conformity assessment under the Annex I legislation.

The Medical Device Regulation (EU) 2017/745 and In Vitro Diagnostic Regulation (EU) 2017/746 are listed in Annex I. CDSS that qualify as medical devices under these regulations, and that require notified body assessment, are therefore automatically classified as high-risk AI systems.

Key Determination

If your CDSS meets the MDR definition of a medical device (software intended for diagnosis, treatment, or monitoring of disease) and requires notified body conformity assessment, it is a high-risk AI system under the EU AI Act. Under MDR Rule 11, software that informs diagnostic or therapeutic decisions is generally Class IIa or higher, so this condition is met for nearly all CDSS. No separate assessment is needed: the MDR classification triggers AI Act high-risk status.

When Does CDSS Qualify as a Medical Device?

Under MDR Article 2(1), software qualifies as a medical device if it is intended by the manufacturer for medical purposes including:

  • diagnosis, prevention, monitoring, prediction, prognosis, treatment, or alleviation of disease;
  • diagnosis, monitoring, treatment, alleviation of, or compensation for an injury or disability;
  • investigation, replacement, or modification of the anatomy or of a physiological or pathological process or state.

Most CDSS fall squarely within these definitions. The critical factor is the intended purpose as defined by the manufacturer—if marketing materials, documentation, or deployment context indicate clinical decision-making use, MDR applies.
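
Where the boundary matters in practice, the two-condition Article 6(1) test can be expressed compactly. Below is a minimal Python sketch of that logic; the function and parameter names are ours, and this is an illustration of the classification rule discussed above, not legal advice.

```python
def eu_ai_act_high_risk(is_mdr_medical_device: bool,
                        requires_third_party_assessment: bool) -> bool:
    """Simplified Article 6(1) test: high-risk if the AI system is (or is a
    safety component of) a product covered by Annex I legislation such as
    the MDR, AND that product needs third-party conformity assessment."""
    return is_mdr_medical_device and requires_third_party_assessment

# Under MDR Rule 11, software informing diagnostic or therapeutic decisions
# is generally Class IIa or higher, which requires a notified body, so most
# CDSS satisfy both conditions.
assert eu_ai_act_high_risk(True, True)
assert not eu_ai_act_high_risk(True, False)  # e.g., a self-certified Class I device
```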

When CDSS Are High-Risk

The following CDSS scenarios are unambiguously high-risk under the EU AI Act:

Diagnostic AI in Radiology, Pathology, or Dermatology

AI that analyzes medical images to detect, classify, or characterize disease. Examples: chest X-ray analysis, mammography screening, skin lesion classification, histopathology analysis.

Treatment Protocol Recommendation Engines

AI systems that suggest treatment plans, chemotherapy regimens, surgical approaches, or therapy protocols based on patient characteristics and outcomes data.

Medication Safety and Drug Interaction Systems

AI-powered alerts for contraindications, drug-drug or drug-gene interactions, dosing recommendations, and allergy warnings integrated into prescribing workflows.

Early Warning and Deterioration Prediction

Sepsis prediction, patient deterioration scores, ICU mortality risk, and other predictive models used to trigger clinical interventions or escalation of care.

Genomic and Precision Medicine AI

AI that interprets genetic data to guide treatment selection, predict drug response, or identify hereditary disease risk for clinical action.

Limited Exemption Scenarios

Very few CDSS scenarios escape high-risk classification. Potential exemptions are narrow:

Potentially Lower-Risk CDSS Scenarios

  • Pure research tools: AI used exclusively for clinical research without influencing patient care decisions may not require MDR classification, potentially avoiding high-risk status.
  • Administrative and operational AI: Scheduling optimization, resource allocation, or workflow management that doesn’t influence clinical decisions.
  • Simple data retrieval: Systems that only retrieve and display existing information without analysis, inference, or recommendations (rare for true AI systems).
  • Wellness applications: Consumer fitness apps or general wellness tools not intended for medical purposes (though misuse can trigger reclassification).

Critical warning: These exemptions are narrowly construed. Regulators will examine actual use, not just intended purpose. A "research tool" used in clinical practice becomes a medical device. An "administrative tool" that influences patient prioritization may affect care decisions.

High-Risk CDSS Requirements (Articles 9-15)

High-risk CDSS must comply with the core requirements in Articles 9-15 of the EU AI Act. These requirements layer on top of existing MDR obligations.

  • Article 9 (Risk Management System): Continuous identification and mitigation of AI-specific risks throughout the CDSS lifecycle, including diagnostic errors, recommendation failures, and edge cases.
  • Article 10 (Data Governance): Training, validation, and testing data must be relevant, representative, and free from errors. Critical for CDSS to avoid bias across patient populations.
  • Article 11 (Technical Documentation): Comprehensive documentation of AI system design, development, capabilities, limitations, and performance. Must enable regulatory assessment.
  • Article 12 (Automatic Logging): Logs must enable traceability of CDSS operation, inputs, outputs, and human oversight actions. Retention for the device lifecycle or a minimum of 10 years.
  • Article 13 (Transparency): Instructions for use must enable clinicians to interpret outputs, understand limitations, and exercise appropriate oversight.
  • Article 14 (Human Oversight): CDSS must be designed to allow effective human oversight. Clinicians must be able to understand, intervene, and override AI recommendations.
  • Article 15 (Accuracy, Robustness, Cybersecurity): CDSS must achieve appropriate accuracy levels, be robust against errors, and include cybersecurity protections against adversarial manipulation.
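
To illustrate the Article 14 design intent, here is a minimal Python sketch of an oversight gate in which no AI recommendation reaches the record without an explicit clinician decision; the function and action names are illustrative assumptions, not a prescribed interface.

```python
from typing import Literal

OversightAction = Literal["accepted", "rejected", "modified"]

def apply_recommendation(recommendation: str,
                         clinician_action: OversightAction,
                         modified_value: str | None = None) -> str | None:
    """Article 14-style gate: the AI output never enters the chart without
    an explicit clinician decision, and the clinician's override wins."""
    if clinician_action == "accepted":
        return recommendation
    if clinician_action == "modified":
        return modified_value
    return None  # rejected: the AI output is discarded

# The clinician overrides the suggested dose with their own value.
final = apply_recommendation("warfarin 5mg daily", "modified", "warfarin 2.5mg daily")
print(final)
```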

Article 12 Logging: Core GLACIS Relevance

Article 12 of the EU AI Act mandates automatic logging capabilities for high-risk AI systems. For CDSS, this requirement is particularly critical given patient safety implications and the need for post-incident investigation.

What Must Be Logged

  • System operation periods: When the CDSS was active and processing clinical data
  • Input data reference: The clinical data inputs that generated each recommendation (with patient privacy protections)
  • AI outputs: The specific recommendations, alerts, or decisions generated by the system
  • Human oversight actions: Clinician acceptance, rejection, or modification of AI recommendations
  • System anomalies: Any errors, failures, or unusual behavior that may affect reliability
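
As a concrete illustration of what one such record might look like, here is a minimal Python sketch of an append-only JSON log entry covering the items above. The field names and FHIR-style input reference are our own assumptions, not a schema prescribed by Article 12.

```python
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class CdssLogRecord:
    """One traceability record per CDSS recommendation (illustrative fields)."""
    event_id: str
    timestamp: str
    model_version: str
    input_ref: str         # pointer to clinical inputs, not raw PHI
    output: str            # recommendation, alert, or score produced
    oversight_action: str  # "accepted" | "rejected" | "modified" | "pending"
    clinician_id: str

record = CdssLogRecord(
    event_id="evt-0001",
    timestamp=datetime.now(timezone.utc).isoformat(),
    model_version="cdss-2.3.1",
    input_ref="fhir://Observation/abc123",
    output="drug-interaction-alert:warfarin+fluconazole",
    oversight_action="accepted",
    clinician_id="dr-456",
)
print(json.dumps(asdict(record)))  # emitted as append-only JSON lines
```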

Retention Requirements

Logs must be retained for a period appropriate to the intended purpose. For CDSS, this typically means:

  • retention for the expected lifecycle of the device, with a minimum of 10 years consistent with MDR technical documentation retention;
  • integrity protection so logs remain tamper-evident throughout the retention period;
  • the ability to produce logs on request for conformity assessments, regulatory audits, and incident investigations.
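
Integrity over a 10-year horizon matters as much as capture. One common tamper-evidence pattern is hash chaining, sketched below in Python; this is a generic illustration, not a description of GLACIS's attestation mechanism.

```python
import hashlib
import json

GENESIS = "0" * 64  # seed hash for the first entry

def append_with_chain(log_path: str, record: dict, prev_hash: str) -> str:
    """Append a log entry whose hash covers the previous entry's hash, so
    any later edit or deletion breaks the chain and is detectable at audit."""
    payload = json.dumps(record, sort_keys=True)
    entry_hash = hashlib.sha256((prev_hash + payload).encode()).hexdigest()
    with open(log_path, "a") as f:
        f.write(json.dumps({"hash": entry_hash, "prev": prev_hash,
                            "record": record}) + "\n")
    return entry_hash

# Usage: thread each returned hash into the next append.
h = append_with_chain("cdss_audit.log", {"event_id": "evt-0001"}, GENESIS)
h = append_with_chain("cdss_audit.log", {"event_id": "evt-0002"}, h)
```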

GLACIS provides cryptographic attestation of CDSS logging compliance—continuous evidence that your logging infrastructure captures required data, maintains integrity, and satisfies retention requirements. This evidence is essential for conformity assessments and regulatory audits.

Interaction with Medical Device Regulation

For AI-enabled medical devices, the EU AI Act and MDR work in tandem. Organizations must comply with both regulatory frameworks, though the AI Act explicitly avoids duplicating MDR requirements where possible.

Complementary Requirements

MDR Provides

  • Clinical safety and performance requirements
  • Quality management system (ISO 13485)
  • Clinical evaluation and post-market surveillance
  • Notified body conformity assessment

AI Act Adds

  • AI-specific risk management (Article 9)
  • Data governance for training data (Article 10)
  • Automatic logging requirements (Article 12)
  • Explicit human oversight provisions (Article 14)

Conformity Assessment

For AI-enabled medical devices, the MDR notified body assessment can incorporate AI Act requirements. Article 43(3) of the AI Act provides that conformity assessment under MDR satisfies AI Act requirements when the notified body verifies AI-specific compliance. This avoids duplicate assessments but requires notified bodies to have AI competence.

US FDA SaMD Comparison

Organizations operating in both EU and US markets must navigate parallel regulatory frameworks. While philosophically similar, important differences exist.

  • Risk Classification: the EU AI Act + MDR treats medical AI as automatically high-risk; the FDA uses a risk-based approach (Class I, II, III) with IMDRF SaMD categories.
  • Logging Requirements: the EU mandates automatic logging (Article 12); the FDA relies on Good Machine Learning Practice guidance, which is less prescriptive.
  • Human Oversight: the EU imposes explicit requirements (Article 14); the FDA considers oversight in labeling, with less formalization.
  • Update Pathway: in the EU, substantial modification triggers reassessment; the FDA offers a Predetermined Change Control Plan (PCCP) for certain AI updates.
  • Transparency: the EU requires detailed instructions for use; the FDA has labeling requirements and proposed transparency rules.

The EU AI Act is currently more prescriptive on AI-specific requirements like logging and human oversight. Organizations building for both markets should design to the higher standard (typically EU) and document how they satisfy each framework’s requirements.

Implementation Checklist

Use this checklist to assess and plan your CDSS compliance program:

CDSS EU AI Act Compliance Checklist

Classification and Assessment

  • Confirm CDSS qualifies as medical device under MDR
  • Document high-risk classification rationale
  • Identify applicable MDR class and notified body requirements

Risk Management (Article 9)

  • Establish AI-specific risk management process
  • Identify and document CDSS-specific risks (diagnostic errors, bias, edge cases)
  • Implement and document risk mitigation measures

Data Governance (Article 10)

  • Document training data sources and characteristics
  • Assess and document data representativeness and bias testing
  • Establish validation and testing data governance

Logging Infrastructure (Article 12)

  • Implement automatic logging of CDSS operations
  • Capture inputs, outputs, and human oversight actions
  • Establish 10+ year retention with integrity protection

Human Oversight (Article 14)

  • Design CDSS for effective clinician oversight
  • Ensure outputs are interpretable and overridable
  • Document human oversight procedures in instructions for use

Conformity Assessment

  • Prepare technical documentation per Article 11
  • Engage notified body with AI assessment capability
  • Schedule assessment allowing 6-12 months before deadline
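
Teams that track this program as data can compute progress per area and attach evidence to each item. The sketch below is one hypothetical way to encode the checklist in Python; the structure and names are ours, not a required format.

```python
from dataclasses import dataclass, field

@dataclass
class ChecklistItem:
    description: str
    done: bool = False
    evidence: str = ""  # pointer to the document or log proving completion

@dataclass
class ComplianceArea:
    name: str
    items: list = field(default_factory=list)

    def progress(self) -> float:
        """Fraction of items in this area that are complete."""
        return sum(item.done for item in self.items) / len(self.items)

logging_area = ComplianceArea("Logging Infrastructure (Article 12)", [
    ChecklistItem("Automatic logging of CDSS operations implemented"),
    ChecklistItem("Inputs, outputs, and human oversight actions captured"),
    ChecklistItem("10+ year retention with integrity protection established"),
])
logging_area.items[0].done = True
print(f"{logging_area.name}: {logging_area.progress():.0%} complete")
```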

Frequently Asked Questions

Is clinical decision support software high-risk under the EU AI Act?

Yes, most Clinical Decision Support Systems (CDSS) are classified as high-risk under the EU AI Act. CDSS that qualify as medical devices under the Medical Device Regulation (MDR) are automatically high-risk per Article 6(1) and Annex I. This includes diagnostic support, treatment recommendations, and drug interaction checking systems that influence clinical decisions.

What EU AI Act requirements apply to high-risk CDSS?

High-risk CDSS must comply with Articles 9-15 of the EU AI Act, including: risk management systems (Article 9), data governance (Article 10), technical documentation (Article 11), automatic logging (Article 12), transparency and instructions for use (Article 13), human oversight provisions (Article 14), and accuracy, robustness, and cybersecurity requirements (Article 15).

How does the EU AI Act interact with the Medical Device Regulation for CDSS?

The EU AI Act and MDR work together for AI-enabled medical devices. AI systems that are medical devices must comply with both regulations. The AI Act adds specific AI-related requirements (logging, human oversight, AI-specific risk management) on top of existing MDR requirements. Notified body assessment under MDR can satisfy AI Act conformity assessment requirements.

Are there any exemptions for clinical decision support under the EU AI Act?

Limited exemptions exist. CDSS used purely for research purposes without clinical application may be exempt. Systems that only provide information without recommendations (simple data retrieval) may not qualify as AI systems. However, any CDSS that provides diagnostic suggestions, treatment recommendations, or clinical alerts will almost certainly be classified as high-risk.

What logging requirements apply to clinical decision support AI?

Article 12 of the EU AI Act requires high-risk CDSS to maintain automatic logs of system operation. Logs must enable traceability of AI decisions, record inputs and outputs, capture human oversight interventions, and be retained for the device’s lifecycle or at least 10 years. This is critical for patient safety investigations and regulatory audits.

How does US FDA regulation of Software as Medical Device compare to EU AI Act?

The FDA regulates clinical decision support through the Software as Medical Device (SaMD) framework, which uses a risk-based approach similar to the EU. However, the EU AI Act adds explicit AI-specific requirements including mandatory logging, human oversight provisions, and transparency obligations that go beyond current FDA guidance. Organizations operating in both markets must comply with both frameworks.

When must clinical decision support systems comply with the EU AI Act?

Most Annex III high-risk AI systems must comply by August 2, 2026, but CDSS that are high-risk via the medical device pathway (Article 6(1) and Annex I) have an extended timeline until August 2, 2027. Organizations should begin compliance efforts now, as building conformity infrastructure (risk management, logging, documentation) requires 6-12 months minimum.

References

  [1] European Union. "Regulation (EU) 2024/1689 of the European Parliament and of the Council." Official Journal of the European Union, July 12, 2024. EUR-Lex 32024R1689
  [2] European Union. "Regulation (EU) 2017/745 on Medical Devices (MDR)." Official Journal of the European Union, May 5, 2017. EUR-Lex 32017R0745
  [3] FDA. "Software as a Medical Device (SaMD): Clinical Evaluation." Guidance Document, December 2017. fda.gov
  [4] European Commission. "Questions and Answers: Artificial Intelligence Act." March 13, 2024. europa.eu
  [5] IMDRF. "Software as a Medical Device: Possible Framework for Risk Categorization and Corresponding Considerations." IMDRF/SaMD WG/N12, 2014. imdrf.org
  [6] FDA. "Artificial Intelligence and Machine Learning in Software as a Medical Device." Action Plan, January 2021. fda.gov

CDSS Compliance Evidence in Days, Not Months

GLACIS generates cryptographic proof that your clinical decision support controls work as intended. Article 12 logging compliance, human oversight verification, and audit-ready documentation for your CDSS.

Assess Your CDSS Compliance
