Germany Guide | Updated December 2025

EU AI Act Germany Implementation Guide

Complete guide to EU AI Act (KI-Verordnung) compliance in Germany. Bundesnetzagentur authority, works council requirements, automotive and healthcare sectors, and Article 12 logging obligations.

18 min read 3,200+ words
Joe Braidwood
CEO, GLACIS

Executive Summary

Germany, as the EU’s largest economy and home to major automotive, manufacturing, and healthcare sectors, faces significant compliance obligations under the EU AI Act (Regulation 2024/1689). The Bundesnetzagentur (Federal Network Agency) has been designated as Germany’s primary market surveillance authority, with an AI Service Desk operational since July 2025.[1]

German organizations must navigate both EU-level requirements and national specifics including works council (Betriebsrat) co-determination rights under the Works Constitution Act, sectoral authorities for medical devices (BfArM) and financial services (BaFin), and the automotive industry’s intersection with type-approval regulations. The draft KI-Marktüberwachungsgesetz (KI-MIG) provides the national implementation framework.[2][3]

Key finding: German employers deploying high-risk AI systems face dual compliance tracks: EU AI Act deployer obligations (Article 26) and German co-determination requirements. Works councils must be informed before AI introduction, can consult experts, and have veto power over systems monitoring employee performance. Organizations should factor 3-6 additional months for works agreement negotiations into compliance timelines.

Aug 2026: High-Risk Deadline
BNetzA: Lead Authority
Section 87: Works Council Rights
UKIM: Oversight Chamber


Germany’s Implementation Status

Germany is implementing the EU AI Act through a combination of directly applicable EU regulation and national implementing legislation. As the EU’s largest economy with significant AI deployment across automotive, manufacturing, healthcare, and financial services, Germany’s implementation approach has outsized influence on how the regulation works in practice.

National Implementing Legislation

The German government published the first draft of its national implementation law in late August 2025: the KI-Marktüberwachungsgesetz und Innovationsförderungsgesetz (KI-MIG), the "AI Market Surveillance and Innovation Promotion Act." The draft establishes:[1][2]

- The Bundesnetzagentur (BNetzA) as Germany’s central market surveillance authority for the AI Act
- An Independent Market Surveillance Chamber (UKIM) within BNetzA for particularly sensitive high-risk systems
- Continued competence of sectoral authorities (BfArM, BaFin, KBA, data protection authorities) in their domains
- AI regulatory sandboxes and innovation promotion measures

The legislative process was delayed by Germany’s unscheduled parliamentary elections in early 2025. The new Federal Government is expected to enact the KI-MIG in the latter half of 2025, though the EU AI Act’s requirements are directly applicable regardless of national legislation status.[1]

Direct Applicability Note

Unlike directives requiring national transposition, the EU AI Act (Regulation 2024/1689) is directly applicable across all member states. German organizations must comply with its requirements regardless of national implementing legislation status. The KI-MIG establishes enforcement mechanisms, but doesn’t change substantive obligations.

National Competent Authority

Article 70 of the EU AI Act requires each member state to designate at least one national competent authority to supervise application and implementation. Germany’s approach reflects its federal structure and existing regulatory landscape.

Bundesnetzagentur (Federal Network Agency)

The Bundesnetzagentur (BNetzA) has been designated as Germany’s primary market surveillance authority for the EU AI Act. BNetzA, an independent higher federal authority under the Federal Ministry for Economic Affairs and Climate Action, already regulates telecommunications, postal services, electricity, gas, and railway markets.[1]

BNetzA’s AI Act responsibilities include:

Market Surveillance Coordination

Lead authority for AI Act compliance monitoring, ensuring coordinated supervision across sectors and providing centralized resources for other authorities. Conducts inspections, processes complaints, and coordinates cross-border enforcement.

AI Service Desk

Operational since July 2025, the AI Service Desk provides guidance to organizations on AI Act compliance, risk classification questions, and documentation requirements. Acts as first point of contact for businesses deploying AI in Germany.

AI Lab

Technical testing facility for evaluating AI system compliance, conducting conformity assessments, and providing technical expertise for enforcement actions.

Regulatory Sandboxes

BNetzA will operate AI regulatory sandboxes as required by Article 57 of the AI Act, allowing organizations to develop and test AI systems under controlled conditions with regulatory guidance before full market deployment.

Sectoral Authorities

Germany’s draft implementation maintains a decentralized supervisory structure where existing sectoral regulators retain AI-related market surveillance responsibilities in their domains:[1][2]

German Sectoral AI Authorities

Authority | Domain | AI Act Relevance
BfArM | Medical devices, in vitro diagnostics | Medical AI, diagnostic algorithms, clinical decision support
BaFin | Financial services supervision | Credit scoring AI, algorithmic trading, insurance underwriting
KBA | Motor vehicles and road traffic | Autonomous vehicles, ADAS, vehicle type-approval
State DPAs | Data protection | GDPR/AI Act intersection, biometric AI, employee monitoring
Länder Authorities | Product safety | Consumer AI products, general market surveillance

Independent Market Surveillance Chamber (UKIM)

The draft KI-MIG establishes an Unabhängige Kammer für die Marktüberwachung (UKIM), an Independent Market Surveillance Chamber, within BNetzA to oversee particularly sensitive high-risk AI systems. UKIM has exclusive oversight of AI used in:[1]

- Law enforcement
- Migration, asylum, and border control management
- Administration of justice

UKIM reports annually to the Bundestag (German Parliament) on AI deployment in these sensitive areas, providing democratic oversight of government AI use.

Implementation Timeline

The EU AI Act’s staggered implementation timeline applies uniformly across all member states, including Germany. Key deadlines for German organizations:

EU AI Act Timeline for Germany

Date | Milestone | Impact for Germany | Status
Aug 1, 2024 | Entry into Force | AI Act legally effective across EU | COMPLETE
Feb 2, 2025 | Prohibited AI Ban | Social scoring, manipulative AI, untargeted biometric scraping banned | ACTIVE
Aug 2, 2025 | GPAI Compliance | Foundation model obligations, AI Service Desk operational | ACTIVE
Aug 2, 2026 | High-Risk Systems | Full conformity for Annex III systems; notified bodies operational | 8 MONTHS
Aug 2, 2026 | AI Regulatory Sandboxes | BNetzA sandboxes must be operational | 8 MONTHS
Aug 2, 2027 | Medical AI Extended | Extended timeline for medical device AI safety components | 20 MONTHS

High-Risk AI Sectors in Germany

Germany’s industrial structure means certain high-risk AI categories have outsized relevance. The following sectors face significant compliance obligations under the Act’s high-risk rules (Annex I safety components and Annex III use cases):

Automotive and Manufacturing

Germany’s automotive industry—home to Volkswagen, BMW, Daimler, and Bosch—faces substantial AI Act implications. Most AI systems in autonomous vehicles and Advanced Driver Assistance Systems (ADAS) are classified as high-risk when used as safety components.[4][5]

Automotive AI Classification

AI systems in vehicles may be classified as high-risk under two pathways:

1. Annex I (Article 6(1)): AI as safety component of products requiring third-party conformity assessment (vehicle type-approval)
2. Annex III Category 2: AI managing critical infrastructure including road traffic

However, the Type-Approval Framework Regulation (EU 2018/858) serves as lex specialis for vehicle-related AI safety components, with AI Act requirements supplementary rather than primary. The German Association of the Automotive Industry (VDA) established the KI-Absicherung project to develop AI assurance verification methods for in-vehicle systems.[4]

Key automotive AI applications requiring compliance attention include:

- Automated and autonomous driving functions
- Advanced Driver Assistance Systems (ADAS) used as safety components
- AI components subject to vehicle type-approval under Regulation (EU) 2018/858

Healthcare and Medical Devices

Germany’s substantial healthcare sector and medical device industry (including Siemens Healthineers, Fresenius, B. Braun) face high-risk AI requirements. Medical AI is addressed through both the AI Act and the Medical Device Regulation (EU 2017/745).

The BfArM (Federal Institute for Drugs and Medical Devices) retains supervisory responsibility for AI-powered medical devices. Most clinical decision support systems, diagnostic AI, and treatment recommendation engines are classified as high-risk, requiring:[6]

- Conformity assessment coordinated with Medical Device Regulation (EU 2017/745) procedures
- Annex IV technical documentation and a quality management system
- Article 12 logging and post-market monitoring
- Registration in the EU AI database and CE marking

Financial Services

German financial institutions using AI for creditworthiness assessment, insurance underwriting, or algorithmic trading face high-risk classification under Annex III Category 5 (essential services). BaFin maintains supervisory authority, with AI Act requirements complementing existing financial regulation.

Article 12 Logging Requirements

Article 12 of the EU AI Act establishes mandatory logging requirements for high-risk AI systems—a requirement with particular implications in Germany due to intersection with data protection law and works council rights.

Core Logging Obligations

High-risk AI systems must be designed with automatic recording of events (logs) throughout operation, including:

Traceability Requirements

For remote biometric identification systems (Annex III, point 1(a)), Article 12(3) specifies that logs must capture at a minimum:

- Recording of the period of each use (start date/time, end date/time)
- Reference database against which input data was checked
- Input data for which the search led to a match
- Identity of natural persons involved in verifying results

Technical Requirements

- Logging capabilities ensuring traceability throughout system lifecycle
- Logging level appropriate to intended purpose of high-risk system
- Protection by appropriate security measures (tamper-evidence)
- Retention for period appropriate to intended purpose
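One way to meet the tamper-evidence and retention points above is an append-only, hash-chained event log. The sketch below is a minimal illustration, assuming hypothetical field names (event_type, payload, prev_hash) and a placeholder six-year retention period; it is not a prescribed Article 12 schema.

```python
# Minimal sketch of tamper-evident, append-only event logging for a high-risk
# AI system. Field names, event types, and the retention period are
# illustrative assumptions, not values prescribed by Article 12.
import hashlib
import json
from datetime import datetime, timedelta, timezone

RETENTION = timedelta(days=365 * 6)  # assumption; sector rules may dictate otherwise


class Article12Log:
    def __init__(self):
        self.entries = []           # production systems would use WORM/append-only storage
        self.last_hash = "0" * 64   # genesis value anchoring the hash chain

    def record(self, event_type: str, payload: dict) -> dict:
        """Append one event; payload must be JSON-serializable."""
        entry = {
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "event_type": event_type,   # e.g. "use_start", "use_end", "human_verification"
            "payload": payload,
            "prev_hash": self.last_hash,
        }
        entry["hash"] = hashlib.sha256(
            json.dumps(entry, sort_keys=True).encode()
        ).hexdigest()
        self.last_hash = entry["hash"]
        self.entries.append(entry)
        return entry

    def verify_chain(self) -> bool:
        """Recompute every hash to detect tampering anywhere in the log."""
        prev = "0" * 64
        for entry in self.entries:
            body = {k: v for k, v in entry.items() if k != "hash"}
            if body["prev_hash"] != prev:
                return False
            recomputed = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()
            ).hexdigest()
            if recomputed != entry["hash"]:
                return False
            prev = entry["hash"]
        return True

    def expired_entries(self) -> list:
        """Entries older than the configured retention period (review before deletion)."""
        cutoff = datetime.now(timezone.utc) - RETENTION
        return [e for e in self.entries
                if datetime.fromisoformat(e["timestamp"]) < cutoff]
```

In practice, record() would be called at the start and end of each use and verify_chain() run before exporting logs to BNetzA, a sectoral authority, or the works council; real deployments would back this with WORM storage or an external timestamping service rather than an in-memory list.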

German-Specific Considerations

Article 12 logging intersects with several German legal requirements:

- GDPR/DSGVO: logs containing personal data need a lawful basis, purpose limitation, and defined retention limits
- Works council information rights (Section 80 BetrVG): works councils can request information about AI systems that log employee data
- Sector-specific retention rules: medical device and financial services regulations impose their own record-keeping periods

Sector-Specific Considerations

Employment and Works Councils (Betriebsrat)

German employers deploying AI in employment contexts face dual compliance requirements: EU AI Act obligations and national works council co-determination rights under the Works Constitution Act (Betriebsverfassungsgesetz, BetrVG).[3][7]

Employment AI is explicitly classified as high-risk under Annex III Category 4, covering:

- AI used for recruitment and selection, including targeted job advertising, application screening, and candidate evaluation
- AI used for decisions affecting work-related relationships, such as promotion, termination, and task allocation
- AI used to monitor or evaluate employee performance and behavior

Works Council Rights (BetrVG)

The 2021 Works Council Modernization Act (Betriebsrätemodernisierungsgesetz) added AI-specific provisions to BetrVG:[3][7]

Works Council AI Rights under BetrVG

Section | Right | Practical Implication
Section 80(3) | Expert consultation | Works council may engage external AI experts at employer expense
Section 87(1) No. 6 | Co-determination on monitoring | Veto power over AI systems capable of monitoring employee behavior/performance
Section 90(1) No. 3 | Information before introduction | Employer must inform works council in good time before deploying AI
Section 95(2a) | Personnel selection guidelines | Works council involvement in AI-based personnel selection criteria

Critical Planning Factor

German organizations should factor 3-6 months additional timeline for works council negotiations when deploying high-risk employment AI. Works agreements (Betriebsvereinbarungen) covering AI use, data handling, and employee protections are often required before deployment. Failure to secure works council agreement can result in injunctions blocking AI system use.

Healthcare Sector

Healthcare AI in Germany must satisfy both AI Act requirements and medical device regulations. Key considerations:

- Dual conformity pathways under the AI Act and the Medical Device Regulation (EU 2017/745), with BfArM as the competent sectoral authority
- The extended August 2027 deadline for AI safety components of medical devices
- Notified body involvement for most AI-powered medical devices

Financial Services

German financial institutions under BaFin supervision using AI for high-risk applications must address:

- High-risk classification of creditworthiness assessment and insurance underwriting AI under Annex III Category 5
- Alignment of AI Act documentation and Article 12 logging with existing BaFin supervisory expectations
- Coordination between BNetzA and BaFin on market surveillance in the financial sector

Conformity Assessment Pathway

German organizations with high-risk AI systems must complete conformity assessment before August 2, 2026. Two pathways exist, depending on system classification:

Internal Control (Most High-Risk)

Self-assessment by provider based on:

- Technical documentation (Annex IV)
- Quality management system
- Post-market monitoring plan
- EU declaration of conformity

Timeline: 3-6 months | Cost: Internal resources

Notified Body Assessment

Third-party assessment required for:

- Biometric identification systems
- AI medical devices
- Products under Annex I requiring third-party conformity assessment

Timeline: 3-12 months | Cost: 10,000-100,000 EUR
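As a rough pre-screen, the split between the two pathways above can be expressed as a simple check. This is a simplification of Article 43, which contains nuances (for instance, some biometric systems may use internal control where harmonised standards are fully applied), so treat it as orientation only.

```python
# Simplified orientation sketch of the two conformity assessment pathways
# described above; Article 43 of the AI Act contains further nuances.
def conformity_pathway(is_biometric_identification: bool,
                       is_ai_medical_device: bool,
                       is_annex_i_third_party_product: bool) -> str:
    if is_biometric_identification or is_ai_medical_device or is_annex_i_third_party_product:
        # Third-party route: typically 3-12 months, 10,000-100,000 EUR
        return "notified body assessment"
    # Self-assessment route: Annex IV documentation, quality management system,
    # post-market monitoring plan, EU declaration of conformity
    return "internal control (self-assessment)"


print(conformity_pathway(False, True, False))   # AI medical device -> notified body
print(conformity_pathway(False, False, False))  # most other high-risk -> internal control
```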

German Notified Bodies

Germany’s notified bodies for AI Act conformity assessment are being designated. Organizations should engage early given limited capacity and extended assessment timelines. Key German notified bodies with relevant technical competence include TÜV SÜD, TÜV Rheinland, DEKRA, and sector-specific bodies designated under existing EU regulations.

Enforcement and Penalties

The EU AI Act’s penalty structure applies uniformly across Germany, with BNetzA and sectoral authorities empowered to impose fines:

Penalty Structure in Germany

Violation Type | Maximum Fine | Enforcing Authority
Prohibited AI practices | 35M EUR or 7% global revenue | BNetzA, UKIM (sensitive areas)
High-risk non-compliance | 15M EUR or 3% global revenue | BNetzA, sectoral authorities
GPAI obligations | 15M EUR or 3% global revenue | EU AI Office (direct enforcement)
Incorrect information | 7.5M EUR or 1% global revenue | BNetzA, sectoral authorities
Transparency violations | 7.5M EUR or 1% global revenue | BNetzA, sectoral authorities
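Note that the "EUR or %" figures above are alternative caps: under Article 99, the applicable maximum for most undertakings is whichever of the two is higher, while for SMEs and start-ups it is whichever is lower. A quick illustrative calculation with hypothetical turnover figures:

```python
# Illustrative fine-cap calculation under Article 99 of the AI Act.
# For most undertakings the cap is the HIGHER of the fixed amount and the
# turnover-based amount; for SMEs and start-ups it is the LOWER of the two.
def max_fine_eur(fixed_cap: float, turnover_pct: float,
                 global_turnover: float, is_sme: bool = False) -> float:
    turnover_cap = turnover_pct * global_turnover
    return min(fixed_cap, turnover_cap) if is_sme else max(fixed_cap, turnover_cap)


# Prohibited-practice violation, hypothetical 2 bn EUR global turnover:
print(max_fine_eur(35_000_000, 0.07, 2_000_000_000))            # cap: 140 million EUR
# The same violation by an SME with 50 m EUR turnover:
print(max_fine_eur(35_000_000, 0.07, 50_000_000, is_sme=True))  # cap: 3.5 million EUR
```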

Enforcement Powers

German authorities have extensive investigatory powers under Article 74, including:

- Full access to technical documentation and to training, validation, and testing data
- Access to source code upon reasoned request where necessary to assess conformity
- Power to conduct or commission technical evaluations of AI systems
- Authority to require corrective measures and to restrict, withdraw, or recall non-compliant systems

Compliance Roadmap for German Organizations

German organizations should implement a phased approach accounting for both EU deadlines and national specifics including works council processes:

GLACIS Framework

Germany EU AI Act Compliance Roadmap

1. AI System Inventory & Classification (Month 1)

Catalog all AI systems. Classify per Annex III risk categories. Identify systems requiring works council involvement (Section 87 BetrVG). Map to existing sector-specific requirements (BfArM, BaFin, KBA). Document intended purpose and affected populations.
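A lightweight way to structure such an inventory is one record per system. The field names below are illustrative assumptions (not a GLACIS or regulatory schema), and the example entry is hypothetical.

```python
# Illustrative AI system inventory record for step 1 (Python 3.10+).
# Field names and the example system are hypothetical.
from dataclasses import dataclass, field


@dataclass
class AISystemRecord:
    name: str
    intended_purpose: str
    risk_class: str                   # "prohibited" | "high-risk" | "limited" | "minimal"
    annex_iii_category: str | None    # e.g. "4 - Employment", or None for Annex I pathway
    role: str                         # "provider" or "deployer" (Article 26 obligations)
    sectoral_authority: str | None    # e.g. "BfArM", "BaFin", "KBA"
    betrvg_87_relevant: bool          # could the system monitor employee behavior/performance?
    affected_populations: list[str] = field(default_factory=list)


inventory = [
    AISystemRecord(
        name="CV screening assistant",
        intended_purpose="Pre-rank incoming job applications",
        risk_class="high-risk",
        annex_iii_category="4 - Employment",
        role="deployer",
        sectoral_authority=None,
        betrvg_87_relevant=True,     # triggers works council co-determination (step 2)
        affected_populations=["job applicants", "employees"],
    ),
]
```

Filtering this list on betrvg_87_relevant and annex_iii_category surfaces the systems that need works agreements and Annex III conformity work first.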

2. Works Council Engagement (Month 1-4)

Inform works council per Section 90 BetrVG. Prepare for co-determination negotiations. Draft Betriebsvereinbarung (works agreement) covering AI use, data handling, and employee protections. Allow 3-6 months for negotiation and expert consultation.

3. Risk Management & Documentation (Month 2-5)

Implement Article 9 risk management system. Prepare Annex IV technical documentation. Integrate with existing frameworks (ISO 42001, sector requirements). Document risk mitigation measures and residual risks.

4. Article 12 Logging Implementation (Month 3-6)

Deploy Article 12-compliant logging infrastructure. Ensure GDPR/DSGVO compliance for logged personal data. Implement tamper-evident storage. Configure retention periods per sector requirements. Prepare for works council and regulator access requests.

5. Conformity Assessment (Month 4-8)

Complete internal control assessment or engage German notified body. Prepare EU declaration of conformity. Register in EU AI database per Article 71. Affix CE marking. For medical AI, coordinate with BfArM and MDR requirements.

6. Post-Market Monitoring & Continuous Compliance (Ongoing)

Implement Article 72 post-market monitoring. Establish Article 73 serious incident reporting to BNetzA/sectoral authorities. Conduct periodic reviews. Update documentation as systems evolve. Prepare for market surveillance inspections.

Critical insight: German organizations face tighter effective timelines due to works council requirements. A notified body assessment starting January 2026 may not complete before the August deadline. Start now.

Frequently Asked Questions

Who is the competent authority for the EU AI Act in Germany?

The Bundesnetzagentur (Federal Network Agency, BNetzA) is designated as Germany’s primary market surveillance authority. BNetzA coordinates AI Act supervision, operates an AI lab and service desk, and manages regulatory sandboxes. Sectoral authorities like BfArM (medical devices), BaFin (financial services), and KBA (vehicles) retain responsibility in their domains. An Independent Market Surveillance Chamber (UKIM) oversees sensitive high-risk systems in law enforcement, migration, and justice.

What is the KI-Verordnung and when does it take effect?

KI-Verordnung is the German term for the EU AI Act (Regulation 2024/1689). Germany is implementing it through national legislation called the KI-Marktüberwachungsgesetz und Innovationsförderungsgesetz (KI-MIG). The EU AI Act is directly applicable, with prohibited practices banned since February 2025, GPAI requirements effective August 2025, and high-risk system compliance required by August 2026.

Do German works councils have rights regarding AI systems?

Yes, extensive rights. Under the Works Constitution Act (BetrVG), employers must inform works councils before introducing AI (Section 90), works councils can consult external AI experts (Section 80), they have co-determination rights over systems that could monitor employees (Section 87), and must be involved in AI-based personnel selection guidelines (Section 95). These rights apply in addition to EU AI Act deployer obligations and typically require negotiated works agreements before AI deployment.

How does the EU AI Act affect German automotive companies?

Most AI in autonomous vehicles and ADAS is classified as high-risk when used as safety components. However, vehicle-related AI safety components are primarily regulated through the Type-Approval Framework Regulation (EU 2018/858), with AI Act requirements supplementary. German automakers must complete conformity assessments by August 2026. The VDA’s KI-Absicherung project develops AI assurance verification methods specific to in-vehicle systems.

What are the penalties for EU AI Act violations in Germany?

Penalties mirror EU maximums: up to 35 million euros or 7% of global annual turnover for prohibited AI practices, up to 15 million euros or 3% for high-risk system non-compliance, and up to 7.5 million euros or 1% for providing incorrect information. BNetzA and sectoral authorities enforce penalties, with UKIM overseeing sensitive high-risk systems.

What is Article 12 logging and why does it matter in Germany?

Article 12 requires high-risk AI systems to automatically log events throughout operation, ensuring traceability of inputs, outputs, and decisions. In Germany, this intersects with GDPR data protection requirements, works council information rights (Section 80 BetrVG), and sector-specific retention rules. Logs must be tamper-evident, retained appropriately, and available to supervisory authorities. GLACIS provides Article 12-compliant logging infrastructure with automatic retention management.

Are there AI regulatory sandboxes in Germany?

Yes. Article 57 requires member states to establish at least one AI regulatory sandbox by August 2026. Germany’s draft KI-MIG provides for sandboxes operated by BNetzA. These controlled environments allow organizations to develop and test AI systems under regulatory supervision, receiving compliance guidance and reducing uncertainty before full market launch.

References

  1. [1] Technology’s Legal Edge. "State of the Act: EU AI Act implementation in key Member States." November 2025. technologyslegaledge.com
  2. [2] Pinsent Masons. "AI Act: Germany consults on implementation law." 2025. pinsentmasons.com
  3. [3] Hogan Lovells. "AI in German Employment - Navigating the AI Act, GDPR, and National Legislation." 2024. hoganlovells.com
  4. [4] VDA - German Association of the Automotive Industry. "Position: AI Act." 2023. vda.de
  5. [5] Taylor Wessing. "AI Act and the Automotive Industry - Where does the road lead?" March 2025. taylorwessing.com
  6. [6] European Union. "Regulation (EU) 2024/1689 of the European Parliament and of the Council." Official Journal of the European Union, July 12, 2024. EUR-Lex 32024R1689
  7. [7] Bird & Bird. "First Judgement on the Rights of Works Councils when Employees use AI Systems." 2024. twobirds.com
  8. [8] White & Case. "AI Watch: Global regulatory tracker - Germany." 2025. whitecase.com
  9. [9] Chambers and Partners. "Artificial Intelligence 2025 - Germany." Practice Guide, 2025. chambers.com
  10. [10] DLA Piper. "The German government provides information on its plans for AI and employee protection." 2024. dlapiper.com

EU AI Act Compliance for German Organizations

GLACIS generates cryptographic evidence that your AI controls execute correctly—mapped to EU AI Act Articles 9-15, ready for BNetzA inspection, and structured for works council transparency.
