California governor signs landmark AI safety law, forcing major tech companies to disclose protocols and protect whistleblowers

By Beatrice Nolan, Tech Reporter

[Photo caption: California Gov. Gavin Newsom signed landmark AI regulation, mandating safety disclosures. Mario Tama—Getty Images]

California has taken a significant step toward regulating artificial intelligence with Gov. Gavin Newsom signing a new state law that will require major AI companies, many of which are headquartered in the state, to publicly disclose how they plan to mitigate the potentially catastrophic risks posed by advanced AI models.

The law also creates mechanisms for reporting critical safety incidents, extends whistleblower protections to AI company employees, and initiates the development of CalCompute, a government consortium tasked with creating a public computing cluster for safe, ethical, and sustainable AI research and innovation. By compelling companies, including OpenAI, Meta, Google DeepMind, and Anthropic, to follow these new rules at home, California may effectively set the standard for AI oversight.

Newsom framed the law as a balance between safeguarding the public and encouraging innovation. In a statement, he wrote: “California has proven that we can establish regulations to protect our communities while also ensuring that the growing AI industry continues to thrive. This legislation strikes that balance.”

The legislation, authored by State Sen. Scott Wiener, follows his failed attempt to pass a similar AI law last year. Wiener said the new law, known as SB 53 (for Senate Bill 53), focuses on transparency rather than liability, a departure from that earlier bill, SB 1047, which Newsom vetoed.

“SB 53’s passage marks a notable win for California and the AI industry as a whole,” said Sunny Gandhi, VP of political affairs at Encode AI, a cosponsor of SB 53. “By establishing transparency and accountability measures for large-scale developers, SB 53 ensures that startups and innovators aren’t saddled with disproportionate burdens, while the most powerful models face appropriate oversight. This balanced approach sets the stage for a competitive, safe, and globally respected AI ecosystem.”

Industry reactions to the new legislation have been divided. Jack Clark, cofounder of AI company Anthropic, which backed SB 53, wrote on X: “We applaud [the California Governor] for signing [Scott Wiener’s] SB 53, establishing transparency requirements for frontier AI companies that will help us all have better data about these systems and the companies building them. Anthropic is proud to have supported this bill.” He emphasized that while federal standards are still important to prevent a patchwork of state rules, California has created a framework that balances public safety with ongoing innovation.

OpenAI, which did not endorse the bill, told news outlets it was “pleased to see that California has created a critical path toward harmonization with the federal government—the most effective approach to AI safety,” adding that if implemented correctly, the law would enable cooperation between federal and state governments on AI deployment. Meta spokesperson Christopher Sgro similarly told media the company “supports balanced AI regulation,” calling SB 53 “a positive step in that direction,” and said Meta looks forward to working with lawmakers to protect consumers while fostering innovation.

Despite being a state-level law, the California legislation will have a global reach, since 32 of the world’s top 50 AI companies are based in the state. The bill requires AI firms to report incidents to California’s Office of Emergency Services and protects whistleblowers, allowing engineers and other employees to raise safety concerns without risking their careers. SB 53 also includes civil penalties for noncompliance, enforceable by the state attorney general, though AI policy experts such as Miles Brundage note these penalties are relatively weak, even compared with those under the EU’s AI Act.

Brundage, who was formerly the head of policy research at OpenAI, said in an X post that while SB 53 represented “a step forward,” there was a need for “actual transparency” in reporting, stronger minimum risk thresholds, and technically robust third-party evaluations.

Collin McCune, head of government affairs at Andreessen Horowitz, also warned the law “risks squeezing out startups, slowing innovation, and entrenching the biggest players,” and said it sets a dangerous precedent for state-by-state regulation that could create “a patchwork of 50 compliance regimes that startups don’t have the resources to navigate.” Several AI companies that lobbied against the bill also made similar arguments.

With its requirements for public disclosure and incident reporting, California is aiming to promote transparency and accountability in the AI sector. Critics like McCune, however, argue that the law could make compliance challenging for smaller firms and entrench Big Tech’s AI dominance.

Thomas Woodside, a cofounder of Secure AI Project, a cosponsor of the law, called the concerns about startups “overblown.”

“This bill is only applying to companies that are training AI models with a huge amount of compute that costs hundreds of millions of dollars, something that a tiny startup can’t do,” he told Fortune. “Reporting very serious things that go wrong, and whistleblower protections, is a very basic level of transparency, and the obligations don’t even apply to companies that have less than $500 million in annual revenue.”
