The California state Senate recently passed a new AI safety bill, SB 53, sending it to Governor Gavin Newsom to sign or veto.
If this sounds familiar, it’s because Newsom already vetoed another AI safety bill also written by state Senator Scott Wiener. But SB 53 is narrower than Wiener’s previous bill, SB 1047, focusing on large AI companies with more than $500 million in annual revenue.
I had the opportunity to discuss SB 53 with my colleagues Max Zeff and Kirsten Korosec on Equity, TechCrunch’s flagship podcast. Max believes Wiener’s new bill has a better chance of becoming law, partly because it has won the endorsement of the AI company Anthropic.
Read a preview of our conversation about AI safety and state-level legislation below. (I’ve edited the transcript for length and clarity, and to make us sound slightly smarter.)
Max: Why should you care about AI safety legislation in the California legislature? We’re entering an era where AI companies are becoming the most powerful companies in the world, and this is probably one of the few checks on their power.
This is much narrower than SB 1047, which got a lot of pushback last year. But I think SB 53 still puts some meaningful regulations on AI labs. It requires them to publish safety reports for their models. If there’s an incident, it basically forces them to report it to the government. And for employees at these labs, if they have concerns, it gives them a channel to report to the government without facing retaliation from the company, even if many of them have signed NDAs.
To me, it feels like a potentially meaningful check on the power of tech companies that we really haven’t had in the past few decades.
Kirsten: To your point, it’s important to consider the fact that this is California. Almost every major AI company is based here or has a major footprint in this state. Not that other states don’t matter (I don’t want to get emails from people in Colorado), but it’s especially important for California because it’s really the hub of AI activity.
Anthony: Max, my question is that there seem to be a lot of exceptions and carve-outs. Is it narrower, but more complicated than the previous [bill]?
Max: In some ways, yes. I’d say the main carve-out in this bill is that it really tries not to apply to small startups. That was basically one of the main controversies around the last legislative effort from Sen. Scott Wiener, who wrote the bill: many said it could damage the startup ecosystem, which is a problem because it’s a booming part of the California economy.
This bill applies specifically to AI developers [generating] more than $500 million [from] their AI models. It really tries to target OpenAI, Google DeepMind, these big companies, not your average startup.
Anthony: As I understand it, if you’re a smaller startup, you still have to share some safety information, just not as much.
It’s [also] worth talking about the wider landscape of AI regulation, and one of the big changes between last year and this year is that we now have a new president. The federal government has taken a more deregulatory stance, arguing that companies should be able to do what they want, and they actually tried to include [language] in a funding bill saying that states cannot have their own AI regulations.
I don’t think any of that has passed so far, but they may try again in the future. So this could be another front in the fight between the Trump administration and blue states.
Equity is TechCrunch’s flagship podcast, produced by Theresa Loconsolo, and posts every Wednesday and Friday.
Subscribe to us on Apple Podcasts, Overcast, Spotify and all the casts. You can also follow Equity on X and Threads at @EquityPod.