California’s Gavin Newsom Signs New Executive Order on AI Risks

Gavin Newsom touts Silicon Valley’s AI dominance but warns of a “Pandora’s box.” 

(Bloomberg) — The state of California has entered the frenzied and at times confusing race among governments around the world to both regulate and harness the technology known as generative artificial intelligence.

On Wednesday morning, Governor Gavin Newsom signed Executive Order N-12-23, a 2,500-word directive that instructs state agencies to examine how AI might threaten the security and privacy of California residents, while also authorizing state employees to experiment with AI tools and try integrating them into the state’s operations.

Generative AI “is a new technology and requires a new class of responsibility,” Newsom said in an interview with Bloomberg News. “There’s a Pandora’s box being opened here, and we just want it done in a safe way.” He added: “We can’t make the same mistakes we did with social.” 

The executive order comes as Washington and other governments struggle with how to regulate artificial intelligence. The European Union has taken preliminary steps toward passing more stringent rules, proposing for example that companies like OpenAI and Alphabet Inc. be required to perform frequent risk assessments and disclose the copyrighted material used to train their language models. The Biden administration has developed a blueprint for an AI Bill of Rights, which among other things asks companies to put watermarks on AI-generated content to make it easier to identify.

In addition, an executive order from the White House intended “to keep Americans safe” from AI is expected later this fall.

Newsom’s executive order appears to take a light regulatory approach to new development in the field, which even some proponents have warned could cause massive job losses or lead to apocalyptic outcomes for humanity. Instead, the order’s text extols the technology and heralds California’s role in fostering it, noting that the state is home to 35 of the top 50 AI companies.

The initiative also directs the Governor’s Office of Business and Economic Development to establish formal partnerships with computer science labs at the region’s two academic powerhouses, the University of California at Berkeley and Stanford University, to study the impacts of generative AI and how to advance the state’s leadership position in the face of competing ambitions in Europe, China and elsewhere.

Newsom’s optimism about AI is guarded. The executive order deputizes state agencies responsible for law enforcement, emergency response and technology procurement to conduct a joint risk analysis of potential threats to California’s critical infrastructure and to deliver a classified report to the governor by March of next year. The agencies are tasked with ensuring that AI systems “remain under effective human control,” according to the order.

The governor wants growth and innovation, said Russell Wald, director of policy for Stanford’s Institute for Human-Centered Artificial Intelligence, but he’s also “laying out the beginning of possible guardrails, and making sure this is something that benefits all Californians.”

Tristan Harris, executive director of the Center for Humane Technology, applauded the governor’s executive order and said, “We are in a situation where very little substantively has ever been done” to regulate AI. Harris, whose video on the catastrophic dangers of the technology, “The A.I. Dilemma,” has been viewed on YouTube nearly three million times, believes that governments are still woefully unprepared for the challenges of the technology. “The scale and speed of what is coming is so much bigger than what we are ready for,” he said.

Over the past six months, Newsom and his staff have met with labor leaders, consumer protection advocates, and industry figures like Sam Altman, the chief executive officer of OpenAI. Newsom said his administration drafted AI legislation for the state legislature but pulled it back from the current session, which ends on Sept. 14, in part because the industry remains young and there are too many uncertainties around potential unintended consequences. The executive order, he said, “is our first step.”

The federal government is also moving methodically. For binding action around AI, the White House requires action in Congress, which has yet to draft legislation, and whose members are currently holding a wide array of informational meetings on the topic. Senator Chuck Schumer, a New York Democrat, is convening one such gathering on Sept. 13 with tech CEOs like Tesla Inc.’s Elon Musk, Meta Platforms Inc.’s Mark Zuckerberg, Alphabet’s Sundar Pichai and Nvidia Corp.’s Jensen Huang — part of an effort to begin crafting bipartisan legislation.

Newsom, while caveating that he’s a “wholly owned subsidiary of all things Biden,” says, “I do worry that the feds will be late to the game.”

A charismatic politician reelected in a landslide last year, Newsom has emerged as a leading Biden surrogate for the 2024 presidential campaign. He has put a $10 million political action committee behind the cause and frequently tours red states on Biden’s behalf. Meanwhile, the governor has also sought to cultivate his profile for a presidential campaign of his own, most likely in 2028, though he routinely deflects questions about his presidential ambitions.

Newsom says his attitude is different from that of states like Montana, which recently banned the popular Chinese social media app TikTok. Montana’s approach “is ready, fire, aim. Ours is ready, aim, fire,” Newsom said. The executive order “contrasts clearly to what Montana just did.”

He added that his oldest daughter, whose name is Montana, was recently in the state over the July Fourth holiday and expressed concern that she could be arrested for using TikTok. He reassured her there was no danger, though regular Montanans will no longer be able to download the app next year. “I’m sensitive to overregulating, and as a fiduciary of our state I don’t want to just compete,” Newsom said. “I want to dominate in innovation.”

–With assistance from Karen Breslau.

©2023 Bloomberg L.P.