Framework Comparison Overview
Before diving into the control-by-control mapping, it’s essential to understand the fundamental nature of each framework. While both address AI governance, they serve different purposes and carry different weight in the compliance landscape.
| Attribute | ISO/IEC 42001:2023 | EU AI Act (Reg. 2024/1689) |
|---|---|---|
| Type | Voluntary international standard | Mandatory EU regulation |
| Issuing Body | ISO/IEC Joint Technical Committee | European Parliament and Council |
| Published | December 2023 | July 2024 (in force August 2024) |
| Scope | AI management system (organizational) | AI systems placed on EU market (product/service) |
| Geographic Reach | Global (voluntary adoption) | EU + extraterritorial (mandatory) |
| Certification | Third-party certification available | Conformity assessment (self or notified body) |
| Enforcement | Market-driven (customer requirements) | Regulatory (fines up to €35M / 7% turnover) |
| Focus | Management processes and continuous improvement | Product safety and fundamental rights |
The table above reveals the core distinction: ISO 42001 asks "Do you have good processes for managing AI?" while the EU AI Act asks "Is this specific AI system safe and compliant?" Organizations need both perspectives—robust internal processes (ISO 42001) and demonstrable product compliance (EU AI Act).
Detailed Control Mapping by Category
The following control mapping compares specific requirements from ISO 42001 clauses with corresponding EU AI Act articles. We’ve assessed each mapping as High Alignment, Partial Alignment, or Gap.
1. Risk Management
ISO 42001:
- Identify AI-related risks and opportunities
- Assess risks to individuals and organizations
- Determine risk treatment options
- Implement controls proportionate to risk
- Maintain a risk register and review it periodically

EU AI Act (Article 9):
- Establish a risk management system covering the entire lifecycle
- Identify and analyze known and foreseeable risks
- Estimate and evaluate risks from the intended use
- Adopt suitable risk management measures
- Test to identify appropriate measures
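To make the pairing concrete, here is a minimal sketch of a single risk-register entry intended to serve both sets of requirements. The field names, enum levels, and example values are illustrative assumptions on our part; neither framework prescribes a register schema.

```python
from dataclasses import dataclass, field
from datetime import date
from enum import Enum


class Level(Enum):
    LOW = 1
    MEDIUM = 2
    HIGH = 3


@dataclass
class RiskEntry:
    """One row of an AI risk register usable for both frameworks (illustrative schema)."""
    risk_id: str
    description: str                        # known or foreseeable risk
    affected_parties: list[str]             # individuals and organizations
    lifecycle_phase: str                    # design, development, deployment, retirement
    severity: Level
    likelihood: Level
    treatment: str                          # avoid, mitigate, transfer, accept
    controls: list[str] = field(default_factory=list)       # controls proportionate to risk
    test_evidence: list[str] = field(default_factory=list)  # testing that informed the measures
    next_review: date | None = None         # periodic review date


# Example entry for a hypothetical CV-screening system
entry = RiskEntry(
    risk_id="R-001",
    description="Model disadvantages applicants from under-represented regions",
    affected_parties=["job applicants"],
    lifecycle_phase="development",
    severity=Level.HIGH,
    likelihood=Level.MEDIUM,
    treatment="mitigate",
    controls=["bias testing before each release"],
    next_review=date(2026, 8, 1),
)
```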
2. Data Governance
ISO 42001:
- Data quality management processes
- Data provenance and lineage tracking
- Bias assessment in training data
- Privacy and data protection considerations

EU AI Act (Article 10):
- Training, validation, and testing data governance
- Relevance, representativeness, and error-free criteria
- Bias detection and correction
- Examination of statistical properties
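As a small illustration of the bias and representativeness items above, the sketch below flags under-represented groups in a training set. It assumes a pandas DataFrame with a hypothetical protected-attribute column, and the threshold is illustrative; real bias assessment under either framework requires domain-specific analysis.

```python
import pandas as pd


def under_represented_groups(df: pd.DataFrame, group_col: str,
                             min_share: float = 0.05) -> dict:
    """Return groups whose share of the training data falls below min_share.

    A crude representativeness signal only; it does not replace a full
    bias assessment of the data against the intended use.
    """
    shares = df[group_col].value_counts(normalize=True)
    return {group: round(share, 3) for group, share in shares.items()
            if share < min_share}


# Example on a toy dataset with a hypothetical "region" attribute
data = pd.DataFrame({"region": ["EU"] * 6 + ["US"] * 3 + ["APAC"] * 1})
print(under_represented_groups(data, "region", min_share=0.2))  # {'APAC': 0.1}
```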
3. Documentation
ISO 42001:
- Documented information required by the standard
- AI system lifecycle documentation
- Control and version management
- Retention and disposition requirements

EU AI Act (Article 11, Annex IV):
- Detailed technical documentation per Annex IV
- System description, design, and development
- Monitoring, functioning, and control
- Maintained for 10 years after market placement
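A documentation completeness check is one way to operationalize the Annex IV requirement. The heading names below are an abbreviated paraphrase and an assumption on our part; verify the list against Annex IV itself before relying on it.

```python
# Abbreviated, paraphrased Annex IV headings used as a completeness checklist;
# the authoritative wording and sub-items are in Annex IV itself.
ANNEX_IV_SECTIONS = [
    "general_description_of_the_ai_system",
    "description_of_elements_and_development_process",
    "monitoring_functioning_and_control",
    "risk_management_system",
    "lifecycle_changes",
    "harmonised_standards_applied",
    "eu_declaration_of_conformity",
    "post_market_monitoring_plan",
]


def missing_sections(prepared: set[str]) -> list[str]:
    """Return the checklist headings not yet covered by existing documentation."""
    return [s for s in ANNEX_IV_SECTIONS if s not in prepared]


# Example: an ISO 42001 documentation set that has not yet produced a
# declaration of conformity or a post-market monitoring plan
print(missing_sections({
    "general_description_of_the_ai_system",
    "description_of_elements_and_development_process",
    "monitoring_functioning_and_control",
    "risk_management_system",
    "lifecycle_changes",
    "harmonised_standards_applied",
}))
```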
4. Logging and Traceability
ISO 42001:
- Monitoring, measurement, analysis, and evaluation
- Determine what needs monitoring
- Methods that produce valid results
- Retain documented information as evidence

EU AI Act (Article 12):
- Automatic recording of events (logs)
- Enable tracing of AI system functioning
- Retention period appropriate to the intended purpose
- Logs accessible to deployers and authorities
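Neither framework prescribes a log format, so the following is only a minimal sketch of automatic, tamper-evident event recording in the spirit of Article 12. The hash-chaining scheme, class name, and field names are our own illustrative choices, not a specification from either framework or from any particular product.

```python
import hashlib
import json
import time


class AuditLog:
    """Append-only event log; each record carries the hash of the previous
    one, so any after-the-fact edit breaks the chain (tamper evidence)."""

    def __init__(self) -> None:
        self._records: list[dict] = []
        self._last_hash = "0" * 64  # genesis value

    def record(self, system_id: str, event: str, payload: dict) -> dict:
        """Automatically append one event; payload must be JSON-serializable."""
        entry = {
            "ts": time.time(),
            "system_id": system_id,
            "event": event,          # e.g. "inference", "override", "model_update"
            "payload": payload,
            "prev_hash": self._last_hash,
        }
        entry_hash = hashlib.sha256(
            json.dumps(entry, sort_keys=True).encode()).hexdigest()
        entry["hash"] = entry_hash
        self._records.append(entry)
        self._last_hash = entry_hash
        return entry

    def verify(self) -> bool:
        """Recompute the chain and confirm no record was altered or removed."""
        prev = "0" * 64
        for rec in self._records:
            body = {k: v for k, v in rec.items() if k != "hash"}
            if rec["prev_hash"] != prev:
                return False
            if hashlib.sha256(
                    json.dumps(body, sort_keys=True).encode()).hexdigest() != rec["hash"]:
                return False
            prev = rec["hash"]
        return True


# Usage: every inference or configuration change records an event, and
# verify() can be run during an audit to show the history is intact.
log = AuditLog()
log.record("cv-screening", "inference", {"input_ref": "doc-123", "score": 0.82})
assert log.verify()
```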
5. Transparency
ISO 42001:
- Explainability of AI system outputs
- Communication with stakeholders
- AI system capabilities and limitations
- Disclosure of AI use to affected parties

EU AI Act (Article 13):
- Design for sufficient transparency
- Instructions for use provided to deployers
- Capabilities, limitations, and intended purpose
- Human oversight measures
6. Human Oversight
ISO 42001:
- Human oversight requirements
- Appropriate level of human control
- Override and intervention capabilities
- Competency of oversight personnel

EU AI Act (Article 14):
- Design for effective human oversight
- Ability to understand system capabilities
- Ability to correctly interpret outputs
- Ability to decide not to use, or to override, the system
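The oversight abilities above can be illustrated with a simple human-in-the-loop gate. The confidence threshold, function names, and routing logic are illustrative assumptions, not anything specified by ISO 42001 or the EU AI Act.

```python
from dataclasses import dataclass
from typing import Callable, Optional


@dataclass
class Decision:
    output: str
    confidence: float


def gated_decision(ai_decision: Decision,
                   reviewer: Callable[[Decision], Optional[Decision]],
                   threshold: float = 0.9) -> Optional[Decision]:
    """Route low-confidence outputs to a human reviewer.

    The reviewer may accept the output, replace it, or return None,
    i.e. decide not to use the AI system's output at all.
    """
    if ai_decision.confidence >= threshold:
        return ai_decision
    return reviewer(ai_decision)


# Example: a reviewer policy that rejects anything below 50% confidence outright
def reviewer(decision: Decision) -> Optional[Decision]:
    return None if decision.confidence < 0.5 else decision


print(gated_decision(Decision("approve", 0.42), reviewer))  # None: output not used
```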
Key Differences
Despite significant overlap, several fundamental differences distinguish these frameworks:
1. Legal Force
ISO 42001 is voluntary—market-driven adoption with no legal penalties. The EU AI Act is mandatory with enforcement mechanisms: €35M or 7% of global turnover for prohibited practices, €15M or 3% for high-risk violations. Non-compliance isn’t a strategic choice; it’s a legal violation.
2. Conformity Assessment
ISO 42001 certification is management-system focused—auditors verify processes exist and function. EU AI Act conformity assessment evaluates whether specific AI systems meet technical requirements. Some high-risk systems require third-party notified body assessment, which is entirely different from ISO certification.
3. Registration Requirements
ISO 42001 has no registration requirements. The EU AI Act mandates registration in the EU database for high-risk AI systems before market placement or putting into service (Article 71). This public registration includes provider information, system description, and conformity documentation.
4. Post-Market Surveillance
ISO 42001 covers monitoring within the management system context. The EU AI Act requires specific post-market monitoring systems (Article 72), serious incident reporting to national authorities within 15 days (Article 73), and corrective action procedures. These regulatory obligations go beyond ISO 42001 requirements.
Significant Overlaps
The good news: substantial overlap exists between the frameworks, making ISO 42001 an excellent foundation for EU AI Act compliance.
- Risk-based approach: Both frameworks center on identifying, assessing, and mitigating AI-related risks. ISO 42001’s risk management infrastructure directly supports EU AI Act Article 9 requirements.
- Lifecycle thinking: Both require consideration of AI systems across their entire lifecycle—from design through deployment to decommissioning.
- Human oversight: Both emphasize the importance of human control over AI systems, including the ability to override or intervene.
- Data quality: Both address training data governance, bias assessment, and data quality management.
- Transparency: Both require organizations to be transparent about AI capabilities, limitations, and appropriate use.
- Continuous improvement: Both frameworks assume ongoing monitoring and improvement rather than point-in-time compliance.
Gap Analysis
What ISO 42001 Doesn’t Cover
Organizations with ISO 42001 certification will need to address these gaps for EU AI Act compliance:
- Conformity assessment procedures: Specific procedures for self-assessment or notified body assessment per EU AI Act requirements
- CE marking requirements: Affixing CE marking to compliant AI systems (Article 48)
- EU database registration: Registration of high-risk AI systems in the EU public database
- Annex IV technical documentation: Specific documentation format and content requirements
- Automatic logging with retention: Technical implementation of automatic event logging with specified retention periods (a retention-policy sketch follows this list)
- Incident reporting: 15-day serious incident reporting to national competent authorities
- EU authorized representative: Designation of EU-based representative for non-EU providers
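The logging-and-retention gap above ultimately reduces to a retention schedule the organization must define and enforce. The sketch below shows one way to express it; only the 10-year technical-documentation period comes from the Act as described above, while the other durations are placeholders to be replaced by your own legal determination.

```python
from datetime import timedelta

# Only the 10-year technical-documentation period is stated in the Act;
# the other durations here are illustrative placeholders.
RETENTION_POLICY = {
    "technical_documentation": timedelta(days=365 * 10),  # 10 years after market placement
    "operational_logs": timedelta(days=365),               # "appropriate to intended purpose"
    "incident_reports": timedelta(days=365 * 5),
}


def retention_satisfied(age: timedelta, artifact_type: str) -> bool:
    """True once an artifact has been kept at least as long as required."""
    return age >= RETENTION_POLICY[artifact_type]


print(retention_satisfied(timedelta(days=400), "operational_logs"))  # True
```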
What EU AI Act Doesn’t Cover
Conversely, ISO 42001 provides organizational capabilities beyond EU AI Act scope:
- Management system structure: Organizational framework for AI governance, including leadership, planning, and improvement
- Internal audit processes: Systematic internal audit and management review cycles
- Competency management: Formal processes for ensuring personnel competency in AI roles
- Third-party AI management: Structured approach to managing AI components from suppliers
- Continuous improvement: PDCA cycle for systematic enhancement of AI practices
Evidence Requirements Comparison
Both frameworks require documented evidence, but the nature and specificity differ:
| Evidence Type | ISO 42001 | EU AI Act |
|---|---|---|
| Risk Assessment | Risk register, treatment plans, review records | Article 9 compliant risk management documentation |
| Technical Documentation | System documentation per organizational needs | Annex IV format: 10+ specific sections, 10-year retention |
| Logging | Monitoring records, measurement results | Automatic logs enabling traceability, authority-accessible |
| Testing | Validation and verification records | Pre-market testing per harmonized standards |
| Conformity | Certification audit reports | EU Declaration of Conformity, notified body assessment (if required) |
| Incidents | Incident records, corrective actions | Serious incident reports to authorities within 15 days |
Compliance Strategies by Scenario
Scenario 1: ISO 42001 Certified, Pursuing EU AI Act
If you already have ISO 42001 certification:
1. Classify your AI systems under EU AI Act risk categories. Identify which are high-risk per Annex III (a rough triage sketch follows this list).
2. Map existing documentation to Annex IV requirements. Identify gaps in format and content.
3. Implement automatic logging if not already in place. This is typically the largest technical gap.
4. Establish conformity assessment procedures appropriate to your AI systems' risk classification.
5. Prepare EU database registration materials for high-risk systems.
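For step 1, a rough first-pass triage against Annex III areas can be scripted. The area names below are an abbreviated, paraphrased subset chosen for illustration; classification decisions must rest on the full legal text.

```python
# Abbreviated, illustrative subset of EU AI Act Annex III high-risk areas;
# consult the full Annex for any actual classification decision.
HIGH_RISK_AREAS = {
    "biometric_identification",
    "critical_infrastructure",
    "education_and_vocational_training",
    "employment_and_worker_management",
    "essential_services_and_credit",
    "law_enforcement",
    "migration_and_border_control",
    "justice_and_democratic_processes",
}


def classify(system_name: str, use_case_tags: set[str]) -> str:
    """Rough first-pass triage of an AI system against the area list above."""
    if use_case_tags & HIGH_RISK_AREAS:
        return f"{system_name}: candidate high-risk system, full Annex III review required"
    return f"{system_name}: not matched to listed high-risk areas (verify manually)"


print(classify("cv-screening", {"employment_and_worker_management"}))
```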
Scenario 2: Pursuing Both Simultaneously
If starting fresh with both frameworks:
1. Start with ISO 42001 implementation to establish management system infrastructure.
2. Design with the EU AI Act in mind: build documentation to Annex IV specifications from the start.
3. Implement logging infrastructure early; it supports both frameworks' monitoring requirements.
4. Pursue ISO 42001 certification first; it demonstrates governance maturity and accelerates EU AI Act conformity.
Scenario 3: EU AI Act Priority, ISO 42001 Later
If regulatory compliance is the immediate driver:
1. Focus on high-risk system compliance; prioritize systems facing the August 2026 deadline.
2. Build documentation and logging to EU AI Act requirements specifically.
3. Retroactively align with ISO 42001; much of your EU AI Act work will map to ISO 42001 requirements.
How GLACIS Bridges the Gaps
GLACIS provides the technical infrastructure that addresses the most significant gaps between ISO 42001 and EU AI Act compliance:
GLACIS Capabilities Mapped to Gaps
- Automatic, attested logging: GLACIS continuously and automatically logs AI system events with cryptographic attestation. Logs are tamper-evident and meet EU AI Act traceability requirements.
- Audit-ready documentation: GLACIS generates documentation that maps to both ISO 42001 clauses and EU AI Act Annex IV requirements. One source of truth, multiple framework outputs.
- Continuous control monitoring: Real-time monitoring of AI control execution goes beyond ISO 42001's periodic review requirements, providing the continuous risk assessment the EU AI Act envisions.
- Retention management: Automatic retention policies aligned with EU AI Act requirements: 10 years for technical documentation, purpose-appropriate periods for operational logs.
Frequently Asked Questions
Does ISO 42001 certification satisfy EU AI Act requirements?
No, but it provides strong alignment. ISO 42001 covers approximately 70-80% of high-risk AI system requirements under Articles 9-15. Organizations still need conformity assessment procedures, EU database registration, specific Annex IV documentation, and post-market surveillance mechanisms not covered by ISO 42001.
Should I pursue ISO 42001 first or EU AI Act compliance first?
For most organizations, ISO 42001 first is advisable. It establishes management system infrastructure, risk assessment processes, and documentation practices that EU AI Act requires. Organizations with ISO 42001 achieve EU AI Act compliance 30-40% faster than starting from scratch. However, if you face immediate regulatory pressure (e.g., August 2026 deadline for high-risk systems), prioritize EU AI Act compliance.
What is the biggest gap between the frameworks?
Automatic logging is typically the largest gap. ISO 42001 requires monitoring and measurement but allows organizational flexibility in implementation. EU AI Act Article 12 mandates automatic recording of events enabling traceability during AI system operation, with logs accessible to deployers and authorities. This requires technical infrastructure many organizations lack.
Can one certification body handle both assessments?
ISO 42001 certification and EU AI Act conformity assessment are fundamentally different processes. ISO certification audits management systems; EU AI Act conformity (for systems requiring it) assesses specific AI products against technical requirements. Some notified bodies may offer both services, but they remain separate assessments with different criteria.
How does GDPR fit into this picture?
GDPR adds a third layer for AI systems processing personal data. Both ISO 42001 and EU AI Act reference data protection requirements. Organizations need to ensure their AI governance addresses all three: ISO 42001 for management systems, EU AI Act for AI-specific requirements, and GDPR for personal data processing. GLACIS evidence generation can map to all three frameworks simultaneously.
References
- ISO/IEC. "ISO/IEC 42001:2023 Information Technology — Artificial Intelligence — Management System." December 2023. iso.org
- European Union. "Regulation (EU) 2024/1689 of the European Parliament and of the Council." Official Journal of the European Union, July 12, 2024. EUR-Lex 32024R1689
- European Commission. "Questions and Answers: Artificial Intelligence Act." March 13, 2024. europa.eu
- ISO. "ISO/IEC 42001 — Artificial Intelligence Management System." Guidance document, 2024. iso.org
- European AI Office. "AI Act Implementation Guidance." European Commission, 2024. ec.europa.eu