Newsom had big plans to regulate AI. Trump said: Hold my beer

The governor’s vision differs wildly from the president’s. The issue could help determine who wins the White House in 2028.

Gov. Gavin Newsom intends to take a tougher stance than President Donald Trump on regulating AI technology. | Source: Tayfun Coskun/Anadolu Agency/Getty Images

Gov. Gavin Newsom has spent the last year positioning himself as the adult in the room when it comes to regulating AI technology. He has signed bills to crack down on sexually explicit deepfake videos and deceptive election-related content while vetoing legislation he deemed an overreach.

These efforts could be seen as a savvy political compromise — and a blueprint for Democrats to follow on a controversial issue before the 2026 midterms. Newsom has managed to appease AI skeptics while avoiding the wrath of tech execs — ever-quick to bristle over intrusions into the sandbox — who have in recent years increased their political donations and will likely spend big in the 2028 presidential campaign.

“Newsom is running for president, so let’s just do away with the fiction of ‘might be running for president,’” said Jim Ross, a consultant who ran Newsom’s first San Francisco mayoral campaign in 2003. “He needs to raise money — $40 million, $50 million, $100 million — to run for president, and to do that, he needs to get all these San Francisco-oriented, California-oriented AI folks to give him money.”

But as the saying goes: A governor makes plans, and the president laughs.

President Donald Trump released an AI Action Plan that gives industry leaders carte blanche to deploy new technology with few regulations. | Source: Andrew Harnik/AP Photo

The White House on Wednesday announced its AI Action Plan, largely seen as a giveaway to the industry that cuts safety guardrails. Tech entrepreneur David Sacks, President Donald Trump’s AI and crypto czar, helped craft the plan, which has received the support of OpenAI’s Sam Altman and other industry leaders. 

The plan promises to accelerate innovation and encourage construction of AI infrastructure in the U.S., thereby maintaining global dominance. Notably, the plan includes a shot across the bow to states that pass “burdensome AI regulations,” warning that federal funds could be withheld if laws are deemed “unduly restrictive.”

In a social media post, Newsom’s press office called Trump’s efforts to restrict states from regulating AI “disgusting.” 

“This moratorium threatens to defund states like California with strong laws against #AI-generated child porn,” the post said. “But no surprise given Trump’s years palling around with Jeffrey Epstein.” 

Trump’s announcement has torched any middle-ground approach to regulating AI, putting Newsom — and other Democrats facing pressure to rein in the industry — in a political pickle. Will tech play ball with Newsom and California Democrats on commonsense restrictions when the counteroffer from Trump is no restrictions at all?

Sacha Haworth, executive director of the Tech Oversight Project, said Trump’s action plan is like a Rorschach test, with the expectation that major players in AI will embrace it.

“Overwhelmingly, people who like it are David Sacks and other industry folks who have embedded themselves in the White House and use their massive political influence to get what they want,” Haworth said.

While the feds commit to a laissez-faire approach to regulating AI, California has signaled it wants a hand in the game. A bill working its way through the state Legislature seeks to add guardrails on the technology and could serve as Newsom’s first test on AI regulation since Trump’s announcement. 

Senate Bill 53 would require AI companies to publish their safety and security protocols and report critical safety incidents to the state attorney general’s office, while adding whistleblower protections for workers. The measure, authored by state Sen. Scott Wiener (D-San Francisco), passed the Senate in May on a bipartisan 37-0 vote and has received broad support in the Assembly.

The measure is a watered-down version of a bill Newsom vetoed in September that would have installed first-of-its-kind AI safety restrictions. That legislation, also written by Wiener, faced opposition from Meta and other tech giants, as well as the California Chamber of Commerce and venture capitalist Ron Conway. Y Combinator CEO Garry Tan and nearly 150 other tech execs signed an open letter saying the bill would “gravely harm” California’s position as the global leader in AI technology. 

“Given the stakes — protecting against actual threats without unnecessarily thwarting the promise of this technology to advance the public good — we must get this right,” Newsom wrote in his veto message.

Along with his veto, Newsom announced the creation of a task force to make recommendations on how to manage the risks of AI while avoiding stifling innovation. 

The task force released its report in June, and Wiener adopted many of the recommendations into SB 53, which encourages transparency instead of imposing $500 million fines on companies in the event of doomsday scenarios.

“We tried really hard to stay close to the working group recommendations in crafting this bill,” Wiener said. “It is a different approach than [last year’s bill]; it’s not a liability bill, it’s a transparency bill.”

State Sen. Scott Wiener came back with a much different AI bill after Newsom vetoed his legislation in 2024. | Source: Fred Greaves for CalMatters

Until Trump’s announcement, the more measured approach to AI regulation was seen as a potential win for Newsom, as well as for Wiener, who enraged many major AI players with last year’s bill. Wiener has ambitions to succeed Rep. Nancy Pelosi as San Francisco’s representative in Congress, and he’ll need Big Tech’s support — or at least will need to avoid its wrath. Meanwhile, Newsom’s interest in signing AI-related legislation could cement his position as the industry-aligned presidential candidate who best understands the economic engine of the future.

Haworth noted that the Trump administration’s “vague” language around withholding federal funds could force states to stand down on AI regulations. State lawmakers have until mid-September to pass legislation this year, giving Newsom time to consult with industry leaders to gauge the temperature on Wiener’s AI bill.

“California did not become the innovation hub of the nation by turning its back on new technology — and we can help ensure that future growth happens responsibly and safely,” Tara Gallegos, a Newsom spokesperson, said in a statement. Newsom’s office declined to comment on SB 53 specifically.

Jason Elliott, a former deputy chief of staff to Newsom who served as an adviser to the committee that published the governor’s AI report, said Wiener’s new bill is a compromise that might work for all stakeholders.

“[Last year’s bill] was an expansive idea that clearly the governor didn’t think fit the scope of the problem,” said Elliott, a policy fellow at Stanford’s Institute for Human-Centered AI. “The [work group’s] report has been widely embraced in Sacramento.”

Keally McBride, professor of politics at the University of San Francisco, said Newsom has an opportunity to lay out a compelling argument for AI regulation. But the conversation around AI and its risks — the elimination of jobs, the creation of deepfake videos, cyberattacks on the power grid and the stock market — could evolve just as rapidly as the technology.

“He’d be able to crow: ‘Here in California, we figured out how to fund and support innovation but also public safety,’” McBride said of a potential presidential campaign. “If I were him, I’d run on that. But who knows what AI will be doing in another 12 months.”