Alert
July 31, 2025

California’s New Privacy and Cybersecurity Regulations on Risk Assessments, Automated Decision-Making, and Cybersecurity Audits: What Businesses Need to Know

During a board meeting on July 24, 2025, the California Privacy Protection Agency (CPPA) unanimously approved the long-awaited final text of its second rulemaking package, implementing a broad swath of new requirements regarding risk assessments, automated decision-making technology (ADMT), and cybersecurity audits. The regulations, under the California Consumer Privacy Act (CCPA), also amended various provisions of the initial CCPA regulations. While not using — and, in fact, removing from previous drafts — the words “artificial intelligence” (AI), the regulations very much impact AI, through risk assessment and ADMT rules, and require companies to enhance their data privacy and cybersecurity programs, including undergoing an annual evidence-based cybersecurity audit.

The regulations adopt rigorous privacy and cybersecurity standards that will expand the scope of requirements for most businesses subject to the CCPA and are likely to become the benchmark for US privacy and cybersecurity compliance. Nonetheless, they step back considerably from earlier drafts, which Goodwin has covered in depth in “California Forges a New Path on Automated Decision-Making Technology, Risk Assessments, and Cybersecurity Audits” (published February 2025). Key changes from prior drafts, such as narrowed applicability thresholds, increased flexibility in reporting timelines, and reduced burden on AI and algorithmic decision-making, echo the new regulatory tone in Washington that has reprioritized technological innovation.

California’s Office of Administrative Law (OAL) now has 30 working days (plus an optional 60-calendar-day extension) to complete its review. Barring any challenges or pushback from the OAL, which did not impede previous CPPA rulemakings, the regulations will become effective January 1, 2026, with certain sections coming into force in 2027.

This alert analyzes three key aspects of the regulations: (1) risk assessment requirements for certain practices involving personal information; (2) notice, opt-out, and access rights for ADMT; and (3) mandated cybersecurity audit procedures applicable to certain businesses. 

Mandatory Risk Assessments for High-Risk Data Processing: New Privacy Compliance Requirement

The regulations significantly expand privacy compliance obligations by embedding proactive risk management into the core of data processing activities. Businesses must now evaluate their data practices through the lens of privacy risk — including by developing internal protocols for identifying covered activities, conducting structured risk assessments, and maintaining documentation to demonstrate compliance — before initiating certain operations.  

Accordingly, the new risk assessment requirements signal a shift from reactive to preventative privacy governance, with legal exposure for businesses that fail to perform the required assessments before engaging in high-risk data processing. And while the regulations require businesses to conduct risk assessments when processing personal information in ways that regulators have determined present heightened risks to consumer privacy, even commonplace practices, such as targeting online ads to promote a service (e.g., on social media and search engines), will likely trigger risk assessment requirements. 

When Is a Risk Assessment Required?

Under section 7150(a) of the regulations, any business engaged in processing activities that involve significant privacy risks must conduct a formal risk assessment before initiating that processing. Removing ambiguity about which activities trigger this obligation, the regulations explicitly define what constitutes “significant risk” as: 

  • “Selling or sharing personal information,” including any disclosures of personal information for targeted advertising purposes. Given the broad reading of the terms “sell” and “share” by regulators, a large number of businesses are likely to fall under the risk assessment requirement based on this regulatory hook.  
  • “Processing sensitive personal information,” such as information revealing certain government identifiers, financial account credentials or card details, precise geolocation, health or genetic data, racial or ethnic origin, citizenship, or the contents of private communications. However, the collection of this information only in the employment context does not require a risk assessment when the data is used only for standard employment purposes. 
  • “Using ADMT for a significant decision [involving] a consumer” (discussed further in the next section of this alert). 
  • Using automated processing to analyze “a consumer’s intelligence, ability, aptitude, performance at work, economic situation, health, [location], personal preferences, interests” or other similar characteristics, “based upon a systematic observation of that consumer when they are acting […] as an educational program applicant, job applicant, student, employee, or independent contractor” or on that consumer’s presence in a sensitive location, such as a hospital, religious facility, or political party office. 
  • Using personal information of consumers to train an ADMT that is intended to be used for a “significant decision concerning a consumer” or “a facial-recognition, emotion-recognition, or other technology” designed to identify or profile consumers. 

As previously discussed, because of the broad definitions of “selling” and “sharing” — which often encompass standard online advertising practices — most businesses that are regulated under the CCPA will likely be subject to these risk assessment requirements. In addition, not only does use of personal information to make a significant automated decision require a risk assessment, but a business also must conduct a risk assessment under the regulations if it uses personal information to train certain AI systems, even if those systems are never deployed to evaluate the individuals whose data is used for training purposes. 
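
To make this screening concrete, the sketch below reduces the triggers to a simple checklist function. This is an illustrative simplification under stated assumptions, not a legal test; the flag names and their mapping to the regulatory triggers are ours.

```python
from dataclasses import dataclass

@dataclass
class ActivityProfile:
    """Simplified flags describing a proposed processing activity (illustrative only)."""
    sells_or_shares_pi: bool = False             # incl. disclosures for targeted advertising
    processes_sensitive_pi: bool = False         # e.g., precise geolocation, health, financial credentials
    sensitive_pi_hr_only: bool = False           # sensitive PI used solely for standard employment purposes
    admt_significant_decision: bool = False      # ADMT used for a significant decision about a consumer
    profiles_in_sensitive_context: bool = False  # systematic observation of workers/students or sensitive locations
    trains_covered_admt: bool = False            # PI used to train covered ADMT or identification technology

def risk_assessment_required(activity: ActivityProfile) -> bool:
    """Rough screen for the 'significant risk' triggers; not a substitute for legal review."""
    sensitive_trigger = activity.processes_sensitive_pi and not activity.sensitive_pi_hr_only
    return (
        activity.sells_or_shares_pi
        or sensitive_trigger
        or activity.admt_significant_decision
        or activity.profiles_in_sensitive_context
        or activity.trains_covered_admt
    )

# Routine targeted advertising alone is enough to trigger an assessment.
print(risk_assessment_required(ActivityProfile(sells_or_shares_pi=True)))  # True
```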

Content of a Risk Assessment

The regulations contain detailed requirements regarding the content of risk assessments and the procedures for conducting them. Specifically, risk assessments must:

  • Explain the activity and its purpose in granular detail, including: the specific categories of personal information required (and, in support of data minimization requirements, the minimum categories of personal information necessary to achieve the intended purpose); the method of collecting, using, and sharing personal information; the intended retention period for each category of personal information; and all transparency disclosures provided to consumers. 
  • Identify the purported benefits and potential negative impacts of the proposed activity. Negative impacts may include unauthorized access to or use of personal information, discrimination, loss of control over personal information, coercion, or economic, physical, psychological, or reputational harms. 
  • Describe the safeguards that will be implemented to manage potential negative impacts (e.g., technical safeguards, policies or procedures, notifications or consents). 

Businesses that develop and sell technologies used for ADMT (including AI) to other businesses must provide those businesses with all the information necessary to conduct a risk assessment. 

In a significant change from previous drafts of the regulations, the CPPA removed the prohibition on processing activities in which the privacy risks outweigh the benefits. In its place, the final regulations include a softer formulation, stating that “the goal of a risk assessment is restricting or prohibiting” processing activities with disproportionate risks. 

Procedural Requirements

In addition to specific content requirements, the regulations mandate specific procedures for conducting risk assessments, including:

  • Requiring the involvement of relevant internal stakeholders (e.g., employees who participate in the activity) and allowing (or perhaps suggesting) the involvement of relevant external stakeholders, such as affected consumers. 
  • Requiring businesses to conduct risk assessments before initiating the activity and review the risk assessment at least every three years or, in the event of a material change in the activity, “as soon as feasibly possible, but no later than 45 calendar days from the date” of the change (a simple way to track this cadence is sketched after this list).  
  • Designating a qualified individual (e.g., a business’ highest-ranking executive) to certify the completion of risk assessments. 
  • Submitting to the CPPA, once per year beginning April 1, 2028, certain information about risk assessments that the business has conducted, including the number of assessments, categories of personal information involved, and a written certification. The CPPA may also, at any time, request copies of any risk assessment a business has conducted. 
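
As a minimal sketch of how a compliance team might track the review cadence described above (the three-year cycle and the 45-day material-change window come from the regulations; the helper itself is hypothetical):

```python
from datetime import date, timedelta

REVIEW_CYCLE_YEARS = 3                       # periodic review: at least every three years
MATERIAL_CHANGE_WINDOW = timedelta(days=45)  # material change: no later than 45 calendar days

def next_review_due(last_assessment: date, material_change: date | None = None) -> date:
    """Latest permissible date for the next review of a risk assessment."""
    try:
        periodic_due = last_assessment.replace(year=last_assessment.year + REVIEW_CYCLE_YEARS)
    except ValueError:  # Feb 29 anniversary falling in a non-leap target year
        periodic_due = date(last_assessment.year + REVIEW_CYCLE_YEARS, 2, 28)
    if material_change is not None:
        return min(periodic_due, material_change + MATERIAL_CHANGE_WINDOW)
    return periodic_due

print(next_review_due(date(2026, 3, 1)))                    # 2029-03-01
print(next_review_due(date(2026, 3, 1), date(2026, 9, 1)))  # 2026-10-16
```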

If approved by the OAL, the risk assessment provisions of the regulations will come into effect for any new activities initiated on or after January 1, 2026. Businesses will have a grace period until December 31, 2027, to complete risk assessments for activities already underway before the effective date. 

ADMT Rules Force Businesses to Open the Black Box

The regulations establish new consumer protections for any use of ADMT to make a significant decision regarding a California resident. Beginning on January 1, 2027, businesses that use ADMT will need to provide robust transparency — both before and after an automated decision is made — and, in some cases, permit consumers to opt out by offering alternative decision-making channels or access to human review.

What Is a Significant Decision?

The regulations define “ADMT” as “any technology that processes personal information and uses computation to replace human decisionmaking or substantially replace human decisionmaking.” This definition aligns with the EU’s General Data Protection Regulation and covers decisions made “without human involvement” — an important scaling back from prior drafts of the regulations, which would have extended it beyond full automation to include AI-assisted decisions. 

Moreover, the regulations’ ADMT requirements apply only to significant decisions, which are decisions “that [result] in the provision or denial of financial or lending services, housing, education enrollment or opportunities, employment or independent contracting opportunities or compensation, or healthcare services.” Significant decisions include “profiling” (i.e., automated evaluation of individual characteristics to analyze or predict certain individual characteristics, such as work performance, health, economic situation, and behavior) but, importantly, do not include “advertising to a consumer.”

Pre-Use Notice — Wait, Another CCPA Notice? 

A business that uses ADMT for significant decisions will need to provide consumers with a “pre-use notice” that describes, among other elements, the specific purpose for using ADMT, the consumers’ rights to access and opt out of or appeal the decision, and how the ADMT works to make a significant decision, such as the types of outputs the system generates and how the business uses those outputs. 

The description needs to be provided in plain language and must include granular details about the operation of the system, such as the categories of personal information that affect the ADMT’s output. At the same time, the rules permit businesses to withhold protected trade secrets and any information that would compromise their ability to protect security and integrity, resist malicious, deceptive, or illegal actions, or ensure the safety of individuals. 

Pre-use notices can be stand-alone notices, or they can be rolled into a business’ existing “notice[s] at [c]ollection,” which are already required under the CCPA. Businesses can also consolidate information about multiple decision-making systems into the same pre-use notice, provided the notice gives all the required information for each system. 

Critically, the notice must be “presented prominently and conspicuously […] at or before the point when the business collects the consumer’s personal information that the business plans to process using ADMT.” And, if a business will reuse personal information initially collected for a different purpose, the business must provide the pre-use notice before using ADMT.

ADMT Opt-Outs and Appeals

The regulations grant consumers the right to opt out of the use of ADMT for significant decisions. The stated purpose of the opt-out right is to permit consumers to bypass automated decisions altogether and access alternative decision-making systems. For this reason, the rules require businesses to describe in their pre-use notices the alternative process consumers can undergo — instead of ADMT — if they decide to opt out. Businesses must also make available “two or more designated methods” to submit opt-out requests at or before any significant decision is made. 

However, several exceptions to these requirements appear designed to motivate businesses to adopt safeguards for their use of ADMT rather than offering alternative decision-making channels. For example: 

  • The regulations exempt businesses from pre-ADMT opt-out requirements if they instead permit consumers to appeal and obtain human review after a decision has been executed. To use this exception, the human reviewer must have authority to overturn the automated decision and must know how to interpret and use any outputs of the ADMT. 
  • The regulations also exempt certain classes of decisions from opt-out requirements, provided certain safeguards are in place. For example, certain decisions affecting admission, acceptance, hiring, allocation of work, and compensation are exempt from opt-out requirements as long as the business uses the system only for exempted purposes and “the ADMT works for the business’s purpose and does not unlawfully discriminate based upon protected characteristics.” These safeguards will likely depend on the outcome of a risk assessment concluding that a particular ADMT is fit for purpose and not biased or discriminatory. 

Businesses are not permitted to verify consumers’ identities in connection with opt-out requests unless they have documented reasons to suspect a request is fraudulent. The regulations do not address whether businesses can or must verify consumers’ identities when offering a right to appeal instead of a right to opt out.

A New Right to Explanations? 

In addition to notice and opt-out/appeal rights, the regulations permit consumers to request specific information concerning the use of ADMT to make decisions affecting them. While some of the information consumers can access mirrors the disclosure requirements of the pre-use notice — such as information about the purpose for which the ADMT was used — other elements of this access right will likely require more specific and detailed information. 

Among other things, the regulations require businesses to provide information about the logic of the ADMT in a manner that permits “a consumer to understand how the ADMT processed their personal information to generate an output with respect to them, which may include the parameters that generated the output as well as the specific output” (emphasis added). Businesses must also explain, on request, how they used any of the ADMT’s outputs as well as the roles of any humans in the loop. 

These provisions are likely to present a significant challenge for businesses that use complex or black-box algorithms that do not rely on preprogrammed logic. Businesses will need to design thoughtful processes — with robust collaboration between technical and legal teams — for providing meaningful information about the decisions they make using ADMT, informing consumers not only about the logic generally underlying an ADMT but also about the reasons it reached a particular decision about them.
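
By way of illustration only, and only for models whose parameters are accessible: one common starting point for a consumer-specific explanation is to report each input’s contribution to the specific output. The sketch below assumes a simple linear scoring model with hypothetical feature names and weights; black-box systems would need dedicated explainability tooling instead.

```python
import numpy as np

# Hypothetical linear scoring model; the feature names, weights, and inputs
# below are illustrative assumptions, not any particular ADMT's parameters.
FEATURES = ["years_employed", "income_to_rent_ratio", "prior_defaults"]
WEIGHTS = np.array([0.4, 1.2, -2.0])
BIAS = -0.5

def explain_decision(consumer_inputs: np.ndarray, threshold: float = 0.0) -> dict:
    """Return the specific output plus each feature's contribution to it."""
    contributions = WEIGHTS * consumer_inputs
    score = float(contributions.sum() + BIAS)
    return {
        "output": "approve" if score >= threshold else "deny",
        "score": round(score, 3),
        "contributions": dict(zip(FEATURES, contributions.round(3).tolist())),
    }

print(explain_decision(np.array([3.0, 1.5, 1.0])))
# {'output': 'approve', 'score': 0.5,
#  'contributions': {'years_employed': 1.2, 'income_to_rent_ratio': 1.8, 'prior_defaults': -2.0}}
```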

Unlike the regulations’ opt-out requirements, the right of access is subject to the CCPA’s standard verification procedures. Accordingly, businesses must confirm consumers’ identities with a “reasonably high degree of certainty” before granting access. 

Cybersecurity Audits 

The cybersecurity audit regulations require designated businesses to conduct comprehensive annual audits of their cybersecurity programs. Critically, the regulations are sector-agnostic — a substantial deviation from most US cybersecurity frameworks, which typically apply only to certain regulated sectors, such as financial services and healthcare. Accordingly, businesses in sectors that in the past were not subject to mandatory cybersecurity requirements, including many technology and life sciences companies, will need to review — and, in many cases, significantly expand — their cybersecurity programs.

Scope of Application

The cybersecurity audit requirements in the regulations apply only to a subset of businesses regulated by the CPPA that, in the prior calendar year: 

  • had annual gross revenues above the specified monetary threshold, adjusted for inflation (currently set at $26.62 million), and either processed (a) personal information of at least 250,000 consumers or households or (b) sensitive personal information of 50,000 or more consumers; or 
  • derived at least 50% of annual revenue from selling or sharing the personal information of California residents (these thresholds are illustrated in the sketch after this list). 
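
These thresholds reduce to a simple screening test, sketched below. The $26.62 million figure is the current inflation-adjusted threshold and will change over time, so treat the constants as assumptions to be verified.

```python
REVENUE_THRESHOLD = 26_620_000  # inflation-adjusted; subject to periodic change

def cybersecurity_audit_required(
    gross_revenue: float,
    pi_consumers_or_households: int,
    sensitive_pi_consumers: int,
    pct_revenue_from_selling_or_sharing_pi: float,
) -> bool:
    """Screen prior-calendar-year figures against the audit applicability prongs."""
    size_and_volume_prong = gross_revenue > REVENUE_THRESHOLD and (
        pi_consumers_or_households >= 250_000 or sensitive_pi_consumers >= 50_000
    )
    revenue_from_pi_prong = pct_revenue_from_selling_or_sharing_pi >= 50.0
    return size_and_volume_prong or revenue_from_pi_prong
```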

Audit Requirements

The regulations include a prescriptive set of requirements for the audits, which must assess every applicable component of the business’ cybersecurity program. Covered businesses must evaluate both technical and nontechnical safeguards and describe how those safeguards mitigate specific risks. In particular, the regulations enumerate 18 cyber controls that the audits must assess, thereby providing a glimpse into how California regulators interpret the ever-elusive concept of “reasonable security.” 

These cyber controls include, if applicable, the company’s approach to: 

  • Authentication (including phishing-resistant multi-factor authentication). 
  • Password policies. 
  • Encryption at rest and in transit. 
  • Account and access management, including privileged access controls. 
  • Inventory and classification of personal data and systems. 
  • Monitoring and logging. 
  • Secure configuration. 
  • Vulnerability management and patching. 
  • Incident response and breach procedures. 
  • Disaster recovery and business continuity. 
  • Penetration testing and red team exercises. 
  • Employee training and awareness. 
  • Third-party and service provider oversight. 
  • Change management. 
  • Secure software development, including code review and testing. 
  • Multilayered network and security controls. 
  • Data minimization and retention limits. 
  • Threat intelligence and current threat awareness capabilities. 

The audit must also evaluate how the covered business implements and enforces compliance with the foregoing requirements. In addition, the audit must document any incidents that triggered notification requirements in California in the previous year. 

Importantly, businesses that already undergo a cybersecurity audit or assessment under another framework (such as the requirements of New York’s Department of Financial Services or the National Institute of Standards and Technology Cybersecurity Framework 2.0) may leverage it to demonstrate compliance with the audit requirement under the law, provided that the existing audit meets the regulations’ requirements. 

Audit Reports, Conduct, and Certification

The audit or assessment must be conducted by impartial, qualified internal or external personnel with “knowledge of cybersecurity and how to audit a business’s cybersecurity program.” When using an internal auditor, the regulations require the business to establish procedures governing the auditor’s reporting chain and performance reviews to maintain the auditor’s independence. 

The audit report must document specific gaps and weaknesses of applicable policies, procedures, and safeguards, explain how the audit or assessment’s findings are or will be acted upon, and specify time frames for such actions. The audit’s identification of compliance and/or gaps in controls must be based on, and include, supporting evidence and documentation. The result of the audit must be provided to a member of the business’ executive management team with direct responsibility for its cybersecurity program. 

The regulations also require businesses to maintain mandatory records and certify compliance to the CPPA. Specifically: 

  • Recordkeeping: Covered businesses must document their audits or assessments in writing each year that the audit is required and retain records for at least five years following the completion of each audit. 
  • Certification: Each year that the requirement applies, businesses must submit to the CPPA a certification that they completed the required audit. The certification must be made by a member of the business’ executive management team who is directly responsible for the business’ cybersecurity audit compliance and who has sufficient knowledge of the audit to provide accurate information. The certification does not include a copy of the audit report.  

Compliance Timeline

The timeline for completing cybersecurity audits was also revised in the latest approved draft. The regulations push out the deadline for a business to complete its first cybersecurity audit report to April 1, 2028, if its 2026 gross revenue exceeded $100 million; April 1, 2029, if its gross revenue was between $50 million and $100 million; and April 1, 2030, if its gross revenue was less than $50 million. The regulations state that failure to conduct or document audits as required may be considered a separate violation under the CCPA. 
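
The phased schedule can be expressed as a simple mapping, sketched below; the treatment of revenue falling exactly on the $50 million and $100 million boundaries is our assumption.

```python
from datetime import date

def first_audit_deadline(gross_revenue_2026: float) -> date:
    """Map 2026 gross revenue to the deadline for the first audit report."""
    if gross_revenue_2026 > 100_000_000:
        return date(2028, 4, 1)
    if gross_revenue_2026 >= 50_000_000:  # boundary treatment is an assumption
        return date(2029, 4, 1)
    return date(2030, 4, 1)

print(first_audit_deadline(120_000_000))  # 2028-04-01
```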

What Should Businesses Do Now? 

Businesses should take a number of steps to make sure they are prepared for these new requirements to come into force. 

  1. Risk assessments: Identify any activities that present a significant risk under the regulations, such as engaging in targeted advertising or otherwise “selling” personal information, processing sensitive personal information, making consequential automated decisions about consumers, or profiling consumers in sensitive contexts (e.g., aptitude, work performance, or education) or for biometric identification. Businesses may also start developing risk assessment procedures to ensure that new activities are surfaced to appropriate teams for review. The compliance deadline for this requirement is December 31, 2027. 
  2. ADMT: Identify any processing activities that involve making significant decisions using ADMT, such as decisions affecting access to financial or lending services, housing, education enrollment or opportunities, employment or independent contracting opportunities or compensation, or healthcare services. Businesses that use ADMT should start collecting information about how their systems operate and, if the systems are operated by vendors, work with vendors to obtain information about the logic their systems use to generate outputs and how they manage risks of error, bias, and discrimination. The compliance deadline for this requirement is January 1, 2027. 
  3. Cybersecurity audits: Begin identifying and, if necessary, developing and/or updating cybersecurity policies and procedures to address the prescriptive new requirements set out in the regulations. Businesses outside of highly regulated sectors are likely to find that their current policies — as well as their technical and operational security controls — will need significant upgrades to meet these new standards. The compliance deadline for initial audits ranges from April 1, 2028, to April 1, 2030, depending on the size of the business. 

 

This informational piece, which may be considered advertising under the ethical rules of certain jurisdictions, is provided on the understanding that it does not constitute the rendering of legal advice or other professional advice by Goodwin or its lawyers. Prior results do not guarantee similar outcomes.