As the next piece in this series on the 2026 California Consumer Privacy Act (CCPA) regulatory updates, I’m turning to the new risk assessment requirements. If you’d like additional context, you can look back at our pieces on the broader 2026 CCPA updates and the ADMT requirements.
California’s 2026 CCPA regulations introduce a formal risk assessment requirement for certain data processing activities. At first glance, it can look like another documentation exercise. But the requirement is doing something more fundamental: It asks businesses to pause and evaluate specific data practices before they are deployed or materially changed, particularly where those practices could create meaningful risk to consumers.
Like the ADMT provisions, this is less about producing a document and more about making, documenting, and being able to explain a set of decisions.
What the risk assessment requirement is—and is not
Under the regulations, businesses must conduct and document a risk assessment before initiating processing activities that the regulations treat as presenting a significant risk to consumers’ privacy. Those activities include:
- selling or sharing personal information;
- processing sensitive personal information (subject to a narrow carveout for certain employee and contractor data processed solely for limited employment-administration purposes);
- using ADMT to make a significant decision concerning a consumer;
- using automated processing to infer or extrapolate personal characteristics or behavior about applicants, students, employees, or independent contractors based on systematic observation in those roles;
- using automated processing to infer or extrapolate personal characteristics or behavior based on a consumer’s presence in a sensitive location; and
- intending to process personal information to train certain high-impact technologies, including ADMT used for significant decisions and facial-recognition, emotion-recognition, identity-verification, or other technologies used for physical or biological identification or profiling.
It’s worth noting that, for the last category of activities—particularly training AI models for high-impact use cases—the regulations are more explicit about the trigger for a risk assessment. In those contexts, the obligation attaches where a business “intends to use” personal information, which the regulations define broadly to include not only actual use, but also planning to use, permitting others to use, or even advertising or marketing the use. This framing differs from that of the other in-scope activities, where the requirement is expressed as completing the assessment before initiating the processing.
Understanding the regulatory scope matters. The requirement is not a general obligation to assess every internal data use. It is tied to specific categories of processing that the regulations identify as higher risk, particularly where the activity could meaningfully affect consumer privacy, autonomy, or access to important opportunities. And while the requirement results in a written report, the document itself is not the point. The point is whether the business has actually evaluated the activity, weighed the tradeoffs, and made a deliberate decision about whether and how to proceed.
What the assessment must actually do
The regulations frame the assessment as a balancing exercise. The business must determine whether the risks to consumers’ privacy from the processing outweigh the benefits to the consumer, the business, other stakeholders, and the public from that same processing.
The regulations require the business to be concrete about what it is doing and why.
- The assessment report must identify the business’s purpose for the processing, and the regulations expressly reject generic descriptions like “improving services” or “security purposes.” The business must also identify the categories of personal information involved, including any sensitive personal information, and the minimum personal information necessary to achieve the stated purpose.
- The report must also document key operational details of the processing, including how the information will be collected, used, disclosed, retained, and sourced; how long it will be kept or how retention will be determined; how the business interacts with consumers in connection with the processing; the approximate number of consumers affected; what disclosures will be made; and which service providers, contractors, or third parties will receive the information and for what purpose.
- For certain ADMT uses, the report must also address the logic of the tool, its assumptions or limitations, its output, and how that output will be used in a significant decision.
The regulations are equally specific about benefits and risks. Benefits must be described concretely, not in “generic terms.” Risks must also be identified with specificity, including their sources and causes. The examples given in the regulations go well beyond data breach risk. They include discrimination, loss of consumer control, coercive design, economic harm, physical harm, reputational harm, and psychological harm.
The assessment must then identify any safeguards the business plans to implement to address those risks. The regulations give examples ranging from encryption, access controls, and monitoring to privacy-enhancing technologies, consultation with outside experts, and policies, procedures, and training intended to reduce harmful outcomes. Finally, the report must document whether the business will initiate the processing at all.
Who needs to be involved
The regulations also make clear that preparation of the risk assessment must involve the relevant stakeholders.
Employees whose job duties include participating in the processing activity must be included in the risk assessment process. In practical terms, that means the people who actually understand how the data will be collected, used, disclosed, or operationalized need to be part of the analysis. The regulations also permit businesses to include outside voices in the process, such as service providers, contractors, technical experts, consumer subsets, or advocacy stakeholders.
Just as important, the assessment must be reviewed and approved by someone with authority to participate in deciding whether the business will initiate the processing. That point is easy to miss, but it matters. The regulation is not just asking for input. It is asking for real decision-maker involvement.
When the obligation attaches
The timing requirements should not be overlooked. The regulations became effective on January 1, 2026, and businesses subject to the risk assessment requirement must begin complying as of that date. That means a required risk assessment must be completed before a new in-scope processing activity begins. If there is a material change to the processing activity, the business must update the assessment as soon as feasible, and no later than 45 calendar days after the material change. The regulations also require businesses to review and update risk assessments at least once every three years to keep them accurate.
The submission timeline is phased. The California Privacy Protection Agency (CPPA) has stated that businesses subject to the requirement must begin compliance by January 1, 2026, and by April 1, 2028, must submit an attestation that required risk assessments were completed along with a summary of their risk assessment information. CPPA materials also describe phased timing tied to revenue thresholds, with April 1, 2028, April 1, 2029, and April 1, 2030, deadlines depending on annual gross revenue.
That timing structure matters. The obligation to do the analysis starts now, even though the first submission deadline comes later. In practice, that pushes the work upstream, to the point when the underlying decision is still capable of being shaped rather than defended after the fact.
A practical way to prepare
For many organizations, the hardest part will be building a repeatable way to identify when the requirement applies and who needs to be involved. A practical starting point is to treat the risk assessment requirement as a governance input, not a compliance output.
That usually means doing a few things well.
- Start with a small, cross-functional working group. At the outset, it may be helpful to identify a core leadership group responsible for establishing how the organization will handle risk assessments. This group does not need to own every underlying business decision, but it should own the process for compliance—identifying likely in-scope activities, coordinating the assessment workflow, and making sure the right stakeholders are involved at the right time.
- Define likely in-scope processing activities in advance. Identify the categories of data use within your business that are most likely to trigger the requirement so teams are not revisiting scope from scratch each time.
- Assign decision ownership, not just coordination responsibility. Someone needs to own the call on whether the activity proceeds, not just circulate the document.
- Use a consistent method for evaluating benefits, risks, and safeguards. Even a simple internal rubric makes the analysis more repeatable and more defensible across teams and use cases.
- Build the assessment into intake for new initiatives. Product changes, marketing practices, data-sharing arrangements, and AI-related uses should be surfaced early enough for the analysis to influence the decision.
- Document the tradeoffs honestly. The goal is not to produce a perfect narrative. It is to accurately reflect what the business decided, why, and what mitigation steps support that decision.
Conclusion
The risk assessment requirement is one of the more operationally significant components of the 2026 CCPA regulations. Preparing for it will require businesses to identify where the obligation applies, involve the right stakeholders, and develop a workable process for completing and maintaining assessments over time.
Thompson Coburn’s Cybersecurity, Privacy, & Data Governance practice group is continuing to monitor these developments and is available to help clients prepare for implementation. The final piece in this series, addressing the new cybersecurity audit requirement under the 2026 CCPA regulations, is forthcoming.