What is California ADMT?
California’s Automated Decision-Making Technology (ADMT) regulations represent the state’s comprehensive framework for governing AI systems that make or substantially influence significant decisions about consumers. Finalized by the California Privacy Protection Agency (CPPA) in September 2025 and effective January 1, 2027, these regulations establish detailed requirements for transparency, risk management, and consumer control over automated systems.[1]
The ADMT regulations emerge from California’s broader privacy framework—the California Consumer Privacy Act (CCPA) as amended by the California Privacy Rights Act (CPRA). While the CCPA/CPRA established foundational privacy rights, the ADMT regulations specifically address how businesses must handle AI-driven decision-making that affects consumers’ access to healthcare, employment, financial services, housing, education, and other consequential domains.[2]
Regulatory Timeline
- September 2025: CPPA finalizes ADMT regulations
- January 1, 2027: ADMT requirements take effect[1]
- April 2028: First risk assessment attestations due[2]
- Ongoing: Documentation retention for the duration of ADMT processing or 5 years after assessment completion, whichever is longer[2]
The regulations define Automated Decision-Making Technology broadly to include any technology that processes personal information to make decisions about consumers or provides outputs that serve as the primary basis for human decisions. This encompasses machine learning systems, AI models, algorithmic scoring tools, and automated profiling systems.[3]
Scope & Applicability
The California ADMT regulations apply to businesses subject to the CCPA/CPRA that use automated decision-making technology for significant decisions. This includes any for-profit entity that does business in California and meets one or more of the following thresholds:[3]
- Annual gross revenues exceeding $25 million
- Annually buys, sells, or shares personal information of 100,000 or more California consumers
- Derives 50% or more of annual revenues from selling or sharing consumer personal information
What Constitutes a Significant Decision
The regulations focus on significant decisions—those with material legal or similarly significant effects on consumers. Unlike general AI governance frameworks, ADMT specifically targets decisions that can substantially affect a person’s life circumstances:
Significant Decision Domains
| Domain | Examples | Impact |
|---|---|---|
| Healthcare | Diagnosis support, treatment recommendations, coverage decisions | Health outcomes and access to care |
| Employment | Resume screening, interview scoring, performance evaluation | Livelihood and career opportunities |
| Financial Services | Credit decisions, loan approvals, insurance underwriting | Access to capital and financial products |
| Housing | Tenant screening, rental applications, mortgage decisions | Access to housing |
| Education | Admissions scoring, financial aid, academic placement | Educational opportunities |
| Insurance | Risk assessment, claims processing, premium pricing | Coverage availability and costs |
Who Must Comply
Organizations affected by ADMT regulations include:
Direct Deployers
Businesses that use ADMT to make significant decisions about California consumers. This includes healthcare providers, employers, lenders, insurers, landlords, and educational institutions using AI systems.
AI Vendors
Service providers supplying ADMT to covered businesses have contractual and practical obligations to enable their customers’ compliance, and should expect customers to demand risk assessment documentation and transparency information.
Key Requirements
The California ADMT regulations establish four core compliance obligations:[2]
1. Pre-Use Notices
Before using ADMT to make significant decisions, businesses must provide consumers with clear and conspicuous notice that automated processing will occur. The pre-use notice must include:
- Description of the ADMT and its purpose
- Categories of personal information the ADMT will process
- Consumer rights including opt-out and access rights
- How to exercise those rights
Pre-use notices must be provided at or before the point of data collection—not buried in lengthy privacy policies. Organizations should design notices that are accessible, understandable, and actionable for average consumers.
2. Risk Assessments
Businesses must conduct risk assessments before deploying ADMT for significant decisions. Risk assessments must be updated when material changes occur (within 45 days) and reviewed at least every three years. Details are covered in the Risk Assessments section below.
3. Consumer Opt-Out
Consumers have the right to opt out of ADMT processing for significant decisions. When a consumer opts out, businesses must:
- Provide an alternative human decision-making process
- Not discriminate against consumers who exercise opt-out rights
- Process opt-out requests promptly
- Document opt-out requests and responses
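The documentation obligation above lends itself to a structured record. The sketch below is a minimal illustration, not a prescribed format; the `OptOutRecord` type, its field names, and the `handle_opt_out` helper are all hypothetical:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class OptOutRecord:
    """Documents a consumer opt-out request and the business's response.

    Hypothetical structure; the regulation requires documentation but does
    not prescribe a schema.
    """
    consumer_id: str
    received_at: datetime
    routed_to_human: bool = False          # alternative human decision process
    response_notes: list[str] = field(default_factory=list)

def handle_opt_out(record: OptOutRecord) -> OptOutRecord:
    """Honor an opt-out: reroute to human review and log the response."""
    record.routed_to_human = True
    record.response_notes.append(
        f"Opt-out honored at {datetime.now(timezone.utc).isoformat()}; "
        "decision rerouted to human reviewer."
    )
    return record
```

In practice, the record would also capture how promptly the request was processed, since timeliness itself must be demonstrable.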
4. ADMT Information Access
Consumers have the right to access information about ADMT processing, including:
- Logic: A meaningful explanation of how the ADMT reaches decisions
- Inputs: Categories of personal information used in the decision
- Outputs: The decision or recommendation produced
- Process: Whether human review occurred before the decision was finalized
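As an illustration, the four disclosure elements can be captured in a single response record; the `AdmtAccessResponse` type and field names below are hypothetical, not drawn from the regulation text:

```python
from dataclasses import dataclass, asdict

@dataclass
class AdmtAccessResponse:
    """One consumer-facing access response covering all four required elements."""
    logic_explanation: str       # meaningful description, not the algorithm itself
    input_categories: list[str]  # categories of personal information used
    output: str                  # decision or recommendation produced
    human_reviewed: bool         # whether a human reviewed before finalization

def to_consumer_payload(resp: AdmtAccessResponse) -> dict:
    """Serialize the disclosure for delivery to the requesting consumer."""
    return asdict(resp)
```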
Healthcare Relevance
For healthcare organizations, ADMT information access requirements create obligations analogous to explaining clinical decision support recommendations. Patients have the right to understand how AI influenced their diagnosis, treatment recommendation, or coverage decision—and what data drove that output.
Risk Assessments
Risk assessments are the cornerstone of California ADMT compliance. Businesses must complete assessments before deploying ADMT for significant decisions, update them when material changes occur, and submit attestations to the CPPA beginning April 2028.[2]
Risk Assessment Components
A compliant risk assessment must document:
Required Risk Assessment Elements
| Element | Description |
|---|---|
| Purpose & Use Cases | Description of the ADMT, its intended purpose, and specific deployment context |
| Data Processing | Categories of personal information processed and data sources |
| Potential Harms | Identified risks to consumers from ADMT processing, including bias and discrimination |
| Safeguards | Technical and organizational measures to mitigate identified risks |
| Benefit Analysis | Assessment of whether benefits outweigh potential harms |
| Human Oversight | Description of human review processes and escalation procedures |
| Monitoring Plan | Ongoing monitoring for accuracy, bias, and unintended consequences |
Attestation Requirements
Beginning April 2028, businesses must submit attestations to the CPPA confirming they have completed required risk assessments. Attestations don’t require submitting the full assessment—but the CPPA may request assessments during investigations or audits.[2]
Documentation Retention
Risk assessment documentation must be retained for the duration of ADMT processing or five years after assessment completion, whichever is longer. This creates a substantial documentation burden—organizations deploying ADMT in 2027 must maintain records potentially through 2032 and beyond. Evidence must be:[2]
- Complete and accurate
- Readily accessible for regulatory inspection
- Sufficient to demonstrate compliance
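The "whichever is longer" rule above is easy to get wrong in retention tooling. A minimal sketch of the date arithmetic, assuming the organization tracks a processing end date and an assessment completion date:

```python
from datetime import date

def add_years(d: date, years: int) -> date:
    """Shift a date forward by whole years, handling Feb 29 edge cases."""
    try:
        return d.replace(year=d.year + years)
    except ValueError:  # Feb 29 in a non-leap target year
        return d.replace(year=d.year + years, day=28)

def retention_end(processing_end: date, assessment_completed: date) -> date:
    """Return the later of: end of ADMT processing, or assessment + 5 years.

    Mirrors the 'whichever is longer' retention rule described above.
    """
    return max(processing_end, add_years(assessment_completed, 5))
```

For example, an assessment completed in January 2027 for a system retired in mid-2030 must still be retained into 2032.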
Consumer Rights
The California ADMT regulations establish robust consumer rights that go beyond typical privacy frameworks. These rights become enforceable January 1, 2027.[2]
Right to Opt Out
Consumers may refuse ADMT processing for significant decisions. When a consumer exercises opt-out rights:
- The business must provide a human alternative for the decision
- The consumer cannot be penalized or discriminated against for opting out
- The business must document the opt-out request and its response
This creates operational challenges for organizations with fully automated decision pipelines. Businesses must maintain human decision-making capacity as a backstop for consumers who opt out.
Right to Access ADMT Information
Consumers may request information about how ADMT processed their personal information. Businesses must provide:
- Logic explanation: A meaningful description of how the ADMT reaches decisions—not the underlying algorithm, but an understandable explanation of the decision-making process
- Input disclosure: What categories of personal information were used
- Output disclosure: What decision or recommendation the ADMT produced
- Human review status: Whether a human reviewed the decision before it was finalized
Right to Correct Data
Consumers retain CCPA/CPRA rights to correct inaccurate personal information—including data used by ADMT. When data is corrected, businesses should:
- Reprocess the decision if the corrected data would materially affect the outcome
- Notify the consumer of any changes to the decision
- Document the correction and reprocessing
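One way to operationalize the "materially affect the outcome" test is to re-run the decision with the corrected data and compare results. The sketch below is illustrative only; `decide` stands in for the deployer's actual ADMT scoring function, which is hypothetical here:

```python
from typing import Any, Callable, Mapping

def reprocess_if_material(
    decide: Callable[[Mapping[str, Any]], str],
    original_data: Mapping[str, Any],
    corrected_data: Mapping[str, Any],
) -> tuple[bool, str]:
    """Re-run the decision with corrected data and report whether it changed.

    Returns (changed, final_decision); a True `changed` flag triggers
    consumer notification and documentation of the reprocessing.
    """
    original = decide(original_data)
    corrected = decide(corrected_data)
    return (original != corrected, corrected)
```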
Healthcare Implications
Healthcare is explicitly covered under California ADMT regulations as a domain where significant decisions occur. This creates direct obligations for healthcare providers, payers, and AI vendors serving California patients.[2]
Covered Healthcare AI Uses
Healthcare ADMT applications subject to these regulations include:
- Clinical decision support: AI systems that recommend diagnoses, treatments, or referrals
- Coverage decisions: Automated prior authorization, claims processing, or coverage determinations
- Risk stratification: Patient risk scoring for care management or resource allocation
- Operational decisions: Appointment scheduling, triage, or capacity management affecting patient access
HIPAA Intersection
California ADMT regulations operate alongside HIPAA, not as a replacement. Healthcare organizations must comply with both frameworks:
- HIPAA governs the privacy and security of protected health information (PHI)
- California ADMT governs transparency and consumer rights regarding automated decision-making
- Both require risk assessments, documentation, and consumer/patient access rights
For healthcare organizations, this means AI governance programs must address both HIPAA Security Rule requirements and ADMT obligations—ideally through an integrated framework that satisfies both.
Implications for Healthcare AI Vendors
Vendors providing AI systems to California healthcare organizations should expect:
- Documentation demands: Customers will require model cards, risk assessment inputs, and transparency documentation
- Contractual requirements: Business associate agreements may expand to include ADMT compliance provisions
- Evidence obligations: Healthcare buyers will ask vendors to demonstrate—not just assert—that AI controls work
Comparison to Colorado AI Act
California ADMT and the Colorado AI Act represent two distinct but complementary approaches to state-level AI governance. Organizations operating in both states should understand how these frameworks align and differ:
California ADMT vs. Colorado AI Act
| Feature | California ADMT | Colorado AI Act |
|---|---|---|
| Effective Date | January 1, 2027 | June 30, 2026 |
| Regulatory Authority | California Privacy Protection Agency (CPPA) | Colorado Attorney General |
| Primary Focus | Consumer transparency and opt-out rights | Preventing algorithmic discrimination |
| Risk Assessments | Required before deployment; attestations due April 2028 | Required every three years; available upon AG request |
| Consumer Opt-Out | Explicit right to opt out of ADMT processing | Right to appeal adverse decisions; human review |
| Documentation Retention | Processing duration or 5 years, whichever is longer | Not specified in statute |
| Framework Safe Harbor | Not specified | NIST AI RMF / ISO 42001 creates presumption of reasonable care |
| Covered Domains | Significant decisions (healthcare, employment, financial, housing, education, insurance) | 8 high-risk domains including legal services |
Key Similarities
- Risk-based approach: Both target high-stakes AI decisions affecting consumers
- Risk assessments: Both require systematic evaluation before deployment
- Transparency: Both mandate disclosure to affected consumers
- Healthcare coverage: Both explicitly include healthcare as a regulated domain
Key Differences
- Opt-out emphasis: California provides stronger consumer opt-out rights; Colorado focuses on appeal rights after adverse decisions
- Framework alignment: Colorado explicitly recognizes NIST AI RMF and ISO 42001 as safe harbors; California hasn’t specified framework preferences
- Attestation model: California requires proactive attestations to the CPPA; Colorado requires documentation available upon AG request
- Retention requirements: California’s 5-year retention rule creates explicit documentation obligations
Compliance Checklist
Use this checklist to track your organization’s California ADMT compliance progress:
California ADMT Readiness
ADMT Inventory
- ☐ Catalog all AI/ML systems making or influencing decisions about consumers
- ☐ Classify each system by decision domain (healthcare, employment, financial, etc.)
- ☐ Identify which systems make “significant decisions” under ADMT definitions
Pre-Use Notice Preparation
- ☐ Draft consumer notices for each ADMT system
- ☐ Document personal information categories processed
- ☐ Create delivery mechanisms (website, application, point-of-collection)
Risk Assessment Development
- ☐ Complete risk assessments for each ADMT system before deployment
- ☐ Document potential harms and safeguards
- ☐ Establish review schedule (at least every three years; within 45 days of material changes)
- ☐ Prepare for April 2028 attestation deadline
Consumer Rights Infrastructure
- ☐ Implement opt-out request handling workflow
- ☐ Establish human decision-making alternatives for opt-out consumers
- ☐ Create ADMT information access response process
- ☐ Train customer service on consumer rights handling
Documentation & Retention
- ☐ Implement 5-year retention policy for risk assessments
- ☐ Document consumer notices, opt-out requests, and responses
- ☐ Establish evidence generation for compliance verification
Vendor Management
- ☐ Review contracts with AI vendors for ADMT compliance provisions
- ☐ Request model documentation and risk assessment inputs from vendors
- ☐ Establish ongoing vendor monitoring for ADMT compliance
Frequently Asked Questions
Does California ADMT apply to companies headquartered outside California?
Yes. If your business meets CCPA/CPRA thresholds and uses ADMT for significant decisions about California consumers, you must comply—regardless of where your company is headquartered. The regulations apply based on consumer location, not business location.
How does consumer opt-out work in practice?
When a consumer opts out of ADMT processing for significant decisions, you must provide a human decision-making alternative. You can’t refuse service or penalize the consumer for opting out. This creates operational requirements—you must maintain human decision capacity even for highly automated processes.
What if I’m already complying with HIPAA for healthcare AI?
HIPAA and California ADMT are complementary, not duplicative. HIPAA governs PHI privacy and security; ADMT governs consumer rights regarding automated decisions. You must comply with both. However, organizations with mature HIPAA programs will find overlap in risk assessment and documentation requirements.
Do I need to comply with both California ADMT and Colorado AI Act?
If you serve consumers in both states, yes. The good news: significant overlap exists. Organizations implementing comprehensive AI governance aligned with NIST AI RMF or ISO 42001 will satisfy many requirements of both frameworks, though some state-specific provisions require additional attention.
What are the penalties for non-compliance?
ADMT violations are enforced through CCPA/CPRA mechanisms. The CPPA can impose civil penalties of $2,500 per unintentional violation and $7,500 per intentional violation. Given that violations can accumulate per consumer affected, exposure can scale rapidly for organizations with large California customer bases.
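The per-violation, per-consumer structure makes exposure a simple multiplication, which is worth sketching because the totals surprise many organizations. The helper name below is hypothetical:

```python
def penalty_exposure(unintentional: int, intentional: int) -> int:
    """Upper-bound civil penalty exposure under CCPA/CPRA enforcement:
    $2,500 per unintentional violation, $7,500 per intentional violation,
    counted per affected consumer."""
    return unintentional * 2_500 + intentional * 7_500
```

For instance, a single unintentional defect affecting 10,000 California consumers yields a theoretical ceiling of $25 million.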
How should I prepare for the April 2028 attestation deadline?
Start risk assessments now. The April 2028 deadline applies to systems already in use—you can’t wait until 2028 to begin documentation. Complete risk assessments before January 2027 deployment, establish annual review cycles, and build evidence that your assessments were conducted properly. The attestation is a confirmation that work has been done, not the start of compliance.
What information must I provide when consumers request ADMT access?
You must provide a meaningful explanation of the ADMT logic (how it reaches decisions), the categories of personal information used as inputs, the output (decision or recommendation), and whether human review occurred. You’re not required to disclose proprietary algorithms, but you must explain the decision-making process in terms consumers can understand.
How do I document compliance for the 5-year retention requirement?
Retain complete risk assessments, pre-use notices provided to consumers, opt-out requests and responses, ADMT information access requests and responses, and evidence of safeguards implemented. Documentation must be readily accessible for regulatory inspection—not just archived, but retrievable. Consider evidence-grade compliance platforms that generate verifiable proof of controls.
References
- [1] California Privacy Protection Agency. "Automated Decision-Making Technology Regulations." Finalized September 2025. cppa.ca.gov
- [2] Shannon, Jennifer MD. "The Proof Gap in Healthcare AI." GLACIS Technologies White Paper. December 2025.
- [3] California Consumer Privacy Act (CCPA) as amended by the California Privacy Rights Act (CPRA). Cal. Civ. Code § 1798.100 et seq.
- [4] Colorado General Assembly. "SB24-205 Consumer Protections for Artificial Intelligence." leg.colorado.gov/bills/sb24-205
- [5] European Union. "Regulation (EU) 2024/1689 on Artificial Intelligence (AI Act)." Official Journal of the European Union, May 2024.