Poland Implementation • December 2025

EU AI Act Poland Implementation Guide

Complete guide to Rozporządzenie AI implementation in Poland. National authority (KRiBSI), compliance timeline, Article 12 logging requirements, and enforcement framework.

Joe Braidwood, CEO, GLACIS
18 min read • 3,200+ words

Executive Summary

Poland is implementing the EU AI Act (Rozporządzenie 2024/1689) through the Ustawa o systemach sztucznej inteligencji (Act on AI Systems), establishing the Commission for AI Development and Security (KRiBSI) as its national competent authority. The draft legislation, first published October 2024 with revisions in February 2025 and June 2025, positions Poland among the first EU member states to create a dedicated AI regulatory body.[1]

With prohibited AI practices already banned (February 2, 2025) and GPAI model obligations effective August 2, 2025, Polish organizations face immediate compliance requirements. High-risk AI systems must achieve full conformity by August 2, 2026. Poland’s strong IT sector—projected to reach $10.35 billion by end of 2025—faces particular exposure as many Polish companies serve as AI system providers to EU customers.[2][3]

Key finding: Polish organizations deploying AI must prioritize Article 12 logging, the requirement for automatic, tamper-evident event recording throughout an AI system’s lifetime. This infrastructure typically requires 6-12 months to implement properly and is a prerequisite for demonstrating compliance with all other high-risk requirements.

At a glance:

  • €35M maximum penalty[1]
  • August 2026 high-risk compliance deadline[1]
  • 47% manufacturing AI adoption[4]
  • $10.35B Poland IT market[3]


Poland’s EU AI Act Implementation Status

Poland is among the EU member states actively developing national implementing legislation for the AI Act. The Ustawa o systemach sztucznej inteligencji (Act on Artificial Intelligence Systems) has progressed through multiple drafts, with the Ministry of Digital Affairs (Ministerstwo Cyfryzacji) leading the legislative process.[1]

Legislative Timeline

The draft legislation focuses on three core areas: establishing the national oversight architecture, setting conformity assessment and registration rules for high-risk AI systems, and defining enforcement powers and penalties at the national level. Poland and Spain are notably among the member states that have chosen to create entirely new regulatory bodies rather than delegating authority to existing agencies.[1]

Current Status Assessment

As of late 2025, Poland’s legislative process remains ongoing. The EU Commission’s August 2, 2025 deadline for designating market surveillance authorities found many member states—including Poland—still finalizing their governance frameworks. However, Poland’s proactive approach in publishing draft legislation demonstrates commitment to timely implementation.[5]

National Competent Authority: KRiBSI

Poland’s Act on AI Systems establishes the Komisja Rozwoju i Bezpieczeństwa Sztucznej Inteligencji (Commission for AI Development and Security, abbreviated KRiBSI) as the primary national competent authority for AI Act enforcement.[1]

KRiBSI’s Role and Responsibilities

KRiBSI Functions Under Poland’s AI Act

Function | Description | EU AI Act Reference
Market Surveillance Authority | Monitors AI systems in the Polish market, investigates complaints, conducts inspections | Article 70
Single Point of Contact | Coordinates with EU AI Office, other member states, and stakeholders | Article 70(2)
Compliance Oversight | Ensures AI systems meet requirements, enforces corrective actions | Articles 74-76
Innovation Support | Promotes AI development, supports regulatory sandboxes, monitors competitiveness | Articles 57-63
Penalty Enforcement | Investigates violations, imposes administrative fines per penalty framework | Article 99

The Minister for Digital Affairs serves as the notifying authority, responsible for designating and overseeing conformity assessment bodies (notified bodies) that will conduct third-party assessments of high-risk AI systems.[1]

UODO’s Limited Role

A notable feature of Poland’s implementation architecture is the limited role assigned to the Urząd Ochrony Danych Osobowych (UODO, the Polish Data Protection Authority). Despite the AI Act’s explicit provisions mandating DPA involvement in AI governance, Poland’s draft legislation positions UODO as a "cooperating authority" without voting rights in KRiBSI decisions.[6]

In July 2025, UODO sharply criticized this arrangement, arguing that enforcement of AI Act provisions related to personal data requires "meaningful participation in decision-making, not merely cooperation." UODO emphasized that relegation to advisory capacity is inadequate given the significant overlap between AI regulation and data protection law.[6]

Fundamental Rights Protection Authorities

Under Article 77 of the AI Act, Poland has designated several authorities to protect fundamental rights in AI contexts, including UODO, the Patient Rights Ombudsman (Rzecznik Praw Pacjenta), and the National Labour Inspectorate (Państwowa Inspekcja Pracy).

Implementation Timeline

Polish organizations must comply with the EU AI Act’s phased implementation schedule. While Poland’s national law adds domestic enforcement mechanisms, the underlying obligations follow EU-wide deadlines.

EU AI Act Implementation Timeline for Poland

Date | Milestone | Impact
August 1, 2024 | AI Act enters into force | Regulation text final; 24-month countdown to general application begins
February 2, 2025 | Prohibited AI practices banned | Social scoring, manipulative AI, real-time biometric ID (with exceptions); now in force
August 2, 2025 | GPAI model obligations apply | Foundation model providers must comply; member states designate authorities
August 2, 2026 | High-risk AI full compliance | All Annex III high-risk systems must meet Articles 9-15 requirements
August 2, 2027 | Regulated-product AI deadline | High-risk AI embedded in products covered by Annex I harmonisation legislation, including medical devices

Critical Path for Polish Organizations

The August 2026 high-risk compliance deadline is approximately eight months away. Building the required infrastructure—risk management systems, technical documentation, quality management systems, and logging capabilities—typically requires 6-12 months. Organizations that haven’t started compliance work face significant timeline risk.[7]

Poland’s Digital Strategy Context

Poland’s AI Act implementation occurs within a broader national digital transformation strategy. Understanding this context helps organizations align compliance efforts with government priorities and available support mechanisms.

Policy for AI Development in Poland

In 2020, the Council of Ministers adopted the "Policy for the Development of Artificial Intelligence in Poland from 2020" (Polityka Rozwoju Sztucznej Inteligencji w Polsce), establishing strategic directions across six key areas: society, innovative companies, science, education, international cooperation, and the public sector.[8]

The policy was updated in 2024, with an emphasis on developing administrative competence and on ethical AI implementation.

Baltic AI Gigafactory Initiative

The Ministry of Digital Affairs has announced plans for the Baltic AI Gigafactory, a major infrastructure project submitted to the European Commission. The initiative targets key sectors including financial services and banking, healthcare and pharmaceuticals, manufacturing automation, and transport/logistics—all areas with significant high-risk AI exposure under the AI Act.[9]

AI Literacy Requirements

From February 2, 2025, the AI Act requires organizations to ensure staff operating AI systems have "sufficient AI literacy." Polish data shows only 34% of employees currently have access to approved AI tools (compared to 46% globally), with training levels at 35% versus 46% globally. This gap presents both compliance risk and opportunity for competitive differentiation.[4]

High-Risk AI Sectors in Poland

Poland’s economy includes several sectors with significant exposure to the AI Act’s high-risk requirements. Understanding sector-specific considerations helps organizations prioritize compliance efforts.

Manufacturing

Poland’s manufacturing sector has embraced AI adoption, with 47% of manufacturing companies implementing AI solutions for quality control, customer service, and process automation. Key compliance considerations include:

Critical Infrastructure AI

AI systems managing safety components in:

  • Power grid management
  • Water/gas/heating systems
  • Industrial automation safety

High-risk per Annex III Category 2

Employment AI

AI systems used for:

  • Worker performance monitoring
  • Task allocation algorithms
  • Recruitment/promotion decisions

High-risk per Annex III Category 4

IT Services and Software Development

Poland ranks among the top five countries globally for software development talent, with an IT market projected to reach $10.35 billion by the end of 2025. Polish IT companies face dual exposure: as providers building AI systems and tools for EU clients, and as deployers of AI in their own operations.[3]

Poland’s position as a major IT outsourcing destination means compliance failures can have cascading effects across EU supply chains.

Financial Services

40% of Polish financial services firms use AI for financial analysis, predictive analytics, and HR processes. High-risk categories include credit scoring and creditworthiness assessment, and risk assessment and pricing for life and health insurance (Annex III point 5).[4]

Public Sector

Polish government agencies increasingly deploy AI for citizen services. UODO has already intervened in cases involving AI-based scoring in public aid decisions. High-risk exposure includes AI systems that evaluate eligibility for public assistance benefits and services (Annex III point 5) and other automated decision-making affecting citizens.

Article 12: Logging Requirements

Article 12 of the EU AI Act establishes one of the most technically demanding requirements for high-risk AI systems: automatic, comprehensive event logging throughout the system’s lifetime. For Polish organizations, implementing robust logging infrastructure is a prerequisite for demonstrating compliance with all other high-risk requirements.[10]

Core Article 12 Requirements

High-risk AI systems must technically allow for the automatic recording of events (logs) that enable identifying situations in which the system may present a risk, facilitating post-market monitoring, and monitoring the system’s operation.

What Must Be Logged

Regulators require logs that "reconstruct the who, what, when, and why for every decision and exception." Specifically:[10]

Article 12 Logging Elements

Element | Description | Example
Who | Identity of users, operators, and verifiers | User ID, operator credentials, verification sign-off
What | Actions taken, data processed, outputs generated | Input data hash, model version, decision output
When | Timestamps for all logged events | ISO 8601 timestamps with timezone
Why | Context for decisions, overrides, exceptions | Rule triggered, threshold exceeded, manual override reason
Database references | Reference data checked against | Database version, record IDs matched
Verification | Human oversight actions and results | Reviewer ID, approval/rejection, modification details
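As an illustration of the elements in the table above, the sketch below shows one way to capture who, what, when, and why in a single record and to chain records with SHA-256 hashes for tamper evidence. It is a minimal sketch in Python; the AuditLog class and its field names are assumptions for illustration, not the GLACIS product or a schema mandated by Article 12.

```python
# Illustrative only: a minimal who/what/when/why event record with a SHA-256
# hash chain for tamper evidence. The AuditLog class and field names are
# assumptions for this sketch, not the GLACIS product or an Article 12 schema.
import hashlib
import json
from datetime import datetime, timezone


class AuditLog:
    def __init__(self):
        self.entries = []
        self._last_hash = "0" * 64  # genesis value for the hash chain

    def record(self, who: str, what: str, why: str, **context) -> dict:
        entry = {
            "who": who,                                      # user/operator identity
            "what": what,                                    # action, model version, output
            "when": datetime.now(timezone.utc).isoformat(),  # ISO 8601 timestamp, UTC
            "why": why,                                      # rule triggered, override reason
            "context": context,                              # e.g. database version, record IDs
            "prev_hash": self._last_hash,                    # links this record to the previous one
        }
        payload = json.dumps(entry, sort_keys=True).encode()
        entry["hash"] = hashlib.sha256(payload).hexdigest()
        self._last_hash = entry["hash"]
        self.entries.append(entry)
        return entry


log = AuditLog()
log.record(
    who="operator:jkowalski",
    what="credit_scoring_model_v3 -> declined",
    why="threshold_exceeded:debt_to_income",
    database_version="ref-db-2025-12-01",
)
```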

Technical Implementation Standards

The draft standard ISO/IEC DIS 24970:2025 (Artificial Intelligence — AI System Logging) provides guidance for Article 12 implementation.[10]

Provider Obligations for Log Retention

Per Article 19, providers of high-risk AI systems must retain automatically generated logs "to the extent such logs are under their control." This creates ongoing obligations for log storage, protection, and accessibility that extend throughout the AI system’s operational lifetime and beyond.[10]
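As a minimal sketch of what that obligation can look like in practice, the helper below re-verifies a retained, hash-chained log such as the one sketched earlier. The verify_chain function is hypothetical and assumes the record structure from the previous example.

```python
# Illustrative only: re-verifying a retained, hash-chained log such as the one
# sketched above. verify_chain is a hypothetical helper; it assumes the record
# structure produced by the earlier AuditLog example.
import hashlib
import json


def verify_chain(entries: list[dict]) -> bool:
    """Return True if no retained entry has been altered, removed, or reordered."""
    prev_hash = "0" * 64
    for entry in entries:
        body = {k: v for k, v in entry.items() if k != "hash"}
        if body.get("prev_hash") != prev_hash:
            return False  # chain broken: an entry is missing or out of order
        recomputed = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if recomputed != entry.get("hash"):
            return False  # contents were modified after recording
        prev_hash = entry["hash"]
    return True
```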

Conformity Assessment Pathway

Before placing high-risk AI systems on the Polish market, providers must undergo conformity assessment to demonstrate compliance. The pathway depends on the AI system category and applicable EU harmonization legislation.

Internal Control Assessment

Most high-risk AI systems can use self-assessment (internal control) per Article 43 and Annex VI. Providers must verify that their quality management system complies with Article 17, examine the technical documentation against the Articles 9-15 requirements, and confirm that the design and development process and post-market monitoring are consistent with that documentation, before drawing up the EU declaration of conformity and affixing the CE marking.

Notified Body Assessment

Third-party assessment by a notified body is required for biometric AI systems under Annex III point 1 where harmonised standards have not been applied in full, and for AI systems that are safety components of products already subject to third-party conformity assessment under EU harmonisation legislation (for example, medical devices).

In Poland, the Minister for Digital Affairs designates notified bodies, which must be accredited per ISO/IEC 17065. Typical assessment costs range from €10,000 to €100,000, with timelines of 3-12 months.[7]

Enforcement & Penalties

The EU AI Act establishes substantial penalties for non-compliance, and Poland’s implementing legislation incorporates these into national enforcement mechanisms administered by KRiBSI.

Penalty Structure

EU AI Act Penalties Applicable in Poland

Violation | Maximum Penalty | Status
Prohibited AI practices | €35M or 7% of global turnover | Enforceable now
High-risk AI non-compliance | €15M or 3% of global turnover | From August 2026
GPAI model non-compliance | €15M or 3% of global turnover | From August 2025
Incorrect information to authorities | €7.5M or 1% of global turnover | Varies by provision
Transparency obligation (Article 50) violations | €15M or 3% of global turnover | From August 2026

Critical note: "global annual turnover" means worldwide revenue, so for multinational corporations operating in Poland, 7% of global turnover could reach billions of euros. For most operators, whichever amount is higher applies; for SMEs and start-ups, Article 99(6) caps the fine at the lower of the two amounts.[7]
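A short worked example of that cap arithmetic, under the reading of Article 99 described above (higher of the two amounts for most operators, lower of the two for SMEs and start-ups); max_fine_eur is a hypothetical helper, not an official calculator.

```python
# Illustrative arithmetic only, based on the penalty table above: the higher of
# the fixed amount and the turnover-based amount for most operators, the lower
# of the two for SMEs and start-ups. max_fine_eur is a hypothetical helper.
def max_fine_eur(global_turnover_eur: float, fixed_cap_eur: float,
                 turnover_pct: float, is_sme: bool = False) -> float:
    turnover_based = global_turnover_eur * turnover_pct
    if is_sme:
        return min(fixed_cap_eur, turnover_based)
    return max(fixed_cap_eur, turnover_based)


# Prohibited-practice tier (EUR 35M or 7%) for EUR 2B global turnover:
print(max_fine_eur(2_000_000_000, 35_000_000, 0.07))            # 140000000.0
# Same tier for a start-up with EUR 20M turnover (lower amount applies):
print(max_fine_eur(20_000_000, 35_000_000, 0.07, is_sme=True))  # 1400000.0
```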

KRiBSI Enforcement Powers

Poland’s national competent authority will have extensive investigatory powers, including the ability to request technical documentation, logs, and, where necessary, access to data and source code; to conduct inspections and investigate complaints; to order corrective actions or the withdrawal or recall of non-compliant AI systems; and to impose administrative fines under the penalty framework above.

Compliance Roadmap for Polish Organizations

With eight months until the high-risk AI compliance deadline, Polish organizations need a structured approach to EU AI Act readiness. This roadmap prioritizes actions by deadline and impact.

Immediate Actions (Q1 2026)

1. AI System Inventory

Catalog all AI systems in use. Classify each against Annex III high-risk categories. Identify prohibited practices requiring immediate cessation. Document providers, deployer status, and data flows.
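As a sketch of what a single inventory entry might look like once captured in code, the snippet below uses a simple dataclass; the field names and the subset of Annex III categories shown are illustrative assumptions, not a prescribed register format.

```python
# Hypothetical sketch of one inventory entry. The AISystemRecord fields and the
# Annex III categories shown are illustrative choices, not a prescribed format
# or an exhaustive list of high-risk categories.
from dataclasses import dataclass, field
from enum import Enum


class AnnexIIICategory(Enum):
    NOT_HIGH_RISK = "not high-risk"
    CRITICAL_INFRASTRUCTURE = "Annex III point 2"
    EMPLOYMENT = "Annex III point 4"
    ESSENTIAL_SERVICES = "Annex III point 5"


@dataclass
class AISystemRecord:
    name: str
    provider: str
    role: str                           # "provider" or "deployer"
    category: AnnexIIICategory
    data_flows: list[str] = field(default_factory=list)
    prohibited_practice: bool = False   # flag for immediate cessation


inventory = [
    AISystemRecord(
        name="worker-shift-allocator",
        provider="internal",
        role="deployer",
        category=AnnexIIICategory.EMPLOYMENT,
        data_flows=["HR database", "shift scheduling system"],
    ),
]
```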

2. Logging Infrastructure Assessment

Evaluate current logging capabilities against Article 12 requirements. Identify gaps in event capture, retention, and tamper-evidence. Plan infrastructure upgrades—this typically requires 3-6 months.
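One simple way to start that gap assessment is to compare a sample log record against the Article 12 elements listed earlier, as in the sketch below; the field names and the article12_gaps helper are assumptions about a typical application log, not a required schema.

```python
# Illustrative gap check against the six Article 12 elements listed earlier.
# The field names and the article12_gaps helper are assumptions about a typical
# application log, not a required schema.
REQUIRED_ELEMENTS = {"who", "what", "when", "why", "reference_data", "verification"}


def article12_gaps(sample_record: dict) -> set[str]:
    """Return the Article 12 elements missing from a sample log record."""
    return REQUIRED_ELEMENTS - set(sample_record)


# A typical pre-compliance application log often captures only who/what/when:
legacy_record = {
    "who": "svc-account",
    "what": "model_inference",
    "when": "2025-12-01T10:02:33+00:00",
}
print(article12_gaps(legacy_record))  # e.g. {'why', 'reference_data', 'verification'}
```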

3. Risk Management System Design

Begin designing risk management systems per Article 9. Establish risk identification, assessment, and mitigation processes. Define acceptable risk thresholds and escalation procedures.
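A minimal sketch of a risk register entry with an acceptance threshold and an escalation rule is shown below; the 1-5 scoring scale, the threshold value, and the RiskEntry class are illustrative choices, not requirements of Article 9.

```python
# Minimal sketch of a risk register entry with an acceptance threshold and an
# escalation rule. The 1-5 scales and the threshold of 12 are illustrative
# choices, not values prescribed by Article 9.
from dataclasses import dataclass


@dataclass
class RiskEntry:
    description: str
    likelihood: int   # 1 (rare) .. 5 (almost certain)
    severity: int     # 1 (negligible) .. 5 (critical)
    mitigation: str

    @property
    def score(self) -> int:
        return self.likelihood * self.severity

    def requires_escalation(self, threshold: int = 12) -> bool:
        # Residual risks above the acceptance threshold go to a defined escalation path.
        return self.score > threshold


risk = RiskEntry(
    description="Model drift degrades scoring accuracy for under-represented applicants",
    likelihood=3,
    severity=5,
    mitigation="Quarterly accuracy and bias re-validation; human review of adverse decisions",
)
print(risk.score, risk.requires_escalation())  # 15 True
```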

Q2 2026: Implementation Phase

Build out the controls scoped in Q1: deploy Article 12 logging infrastructure, complete technical documentation, stand up the Article 17 quality management system, and implement human oversight measures for each high-risk system.

Q3 2026: Validation and Conformity

Complete the applicable conformity assessment (internal control or notified body), draw up the EU declaration of conformity, register high-risk systems in the EU database, and retain evidence that each control operates as documented by the August 2, 2026 deadline.

How GLACIS Supports Article 12 Compliance

GLACIS provides the logging infrastructure Polish organizations need to meet Article 12 requirements.

Frequently Asked Questions

Who is the national competent authority for the EU AI Act in Poland?

Poland has established the Commission for AI Development and Security (Komisja Rozwoju i Bezpieczeństwa Sztucznej Inteligencji, KRiBSI) as its national competent authority. KRiBSI serves as both the market surveillance authority and single point of contact. The Minister for Digital Affairs (Ministerstwo Cyfryzacji) acts as the notifying authority for conformity assessment bodies.

When must Polish organizations comply with the EU AI Act?

Polish organizations must follow the EU AI Act’s phased timeline: prohibited AI practices were banned February 2, 2025; GPAI model obligations apply from August 2, 2025; high-risk AI systems must achieve full compliance by August 2, 2026; and certain medical AI devices have until August 2, 2027. Poland’s national implementing law (Ustawa o systemach sztucznej inteligencji) adds domestic enforcement mechanisms but was still in draft as of late 2025.

What are Article 12 logging requirements under the EU AI Act?

Article 12 requires high-risk AI systems to automatically record events (logs) throughout the system’s lifetime. Logs must enable traceability for identifying situations where the AI may present risk, supporting post-market monitoring, and tracking system operation. Logs must capture who used the system, what data was processed, when operations occurred, and who verified results. Records must be tamper-evident and instantly retrievable.

What are the penalties for EU AI Act non-compliance in Poland?

Penalties follow EU-wide maximums: deploying prohibited AI practices can result in fines up to €35 million or 7% of global annual turnover; non-compliance with high-risk AI requirements incurs fines up to €15 million or 3% of turnover; providing incorrect information to authorities can result in fines up to €7.5 million or 1% of turnover. For most organizations, whichever amount is higher applies; for SMEs and start-ups, the lower of the two is the cap.

Which industries in Poland are most affected by the EU AI Act?

Key sectors include: financial services (credit scoring, insurance underwriting, fraud detection), manufacturing (quality control AI, predictive maintenance, worker monitoring), IT services and software development (AI tooling, foundation model deployment), healthcare (diagnostic AI, patient management systems), and public sector (automated decision-making, citizen services). Poland’s strong IT outsourcing sector faces particular compliance obligations as AI system providers to EU customers.

How does the Polish Data Protection Authority (UODO) relate to AI Act enforcement?

Under Poland’s draft implementing law, UODO is designated as a "cooperating authority" rather than a primary enforcement body. UODO has criticized this limited role, arguing that AI Act provisions related to data protection require meaningful participation in decision-making. UODO is listed among Poland’s fundamental rights protection authorities under Article 77, alongside the Patient Rights Ombudsman and National Labour Inspectorate.

References

  [1] Polish Ministry of Digital Affairs. "Draft Act on Artificial Intelligence Systems (Projekt ustawy o systemach sztucznej inteligencji)." October 2024, revised February 2025 and June 2025. regulations.ai
  [2] European Commission. "AI Act: Governance and Enforcement." Shaping Europe’s Digital Future, 2024. ec.europa.eu
  [3] DevsData. "Software Development In Poland: Market Overview For 2025." 2025. devsdata.com
  [4] Amazon EU News. "AI adoption in Poland grew by 36% over the past year." 2025. aboutamazon.eu
  [5] Cullen International. "EU AI Act implementation: Only a few countries have designated AI Act enforcement authorities." September 2025. cullen-international.com
  [6] RAILS Blog. "Unfinished Architecture? Poland’s Draft Act on AI Systems – and the Struggle for Supervisory Clarity." 2025. ai-laws.org
  [7] European Union. "Regulation (EU) 2024/1689 of the European Parliament and of the Council." Official Journal of the European Union, July 12, 2024. EUR-Lex 32024R1689
  [8] Polish Council of Ministers. "Policy for the Development of Artificial Intelligence in Poland from 2020." 2020, updated 2024. gov.pl
  [9] Polish Ministry of Digital Affairs. "Baltic AI Gigafactory Initiative." 2025. trade.gov.pl
  [10] EU AI Act Service Desk. "Article 12: Record-keeping." European Commission, 2024. ai-act-service-desk.ec.europa.eu
  [11] Chambers and Partners. "Artificial Intelligence 2025 - Poland: Trends and Developments." 2025. chambers.com
  [12] Global Legal Insights. "AI, Machine Learning & Big Data Laws 2025 | Poland." 2025. globallegalinsights.com

EU AI Act Compliance for Polish Organizations

GLACIS generates cryptographic evidence that your AI controls execute correctly—mapped to EU AI Act Articles 9-15, ISO 42001, and NIST AI RMF. Get audit-ready documentation before KRiBSI comes knocking.

Start Your Compliance Sprint
