SAG-AFTRA and women’s groups urge Gavin Newsom to sign AI safety bill

As California Governor Gavin Newsom considers whether to sign or veto the hotly contested artificial intelligence safety bill SB 1047, SAG-AFTRA and two women's groups are urging him to approve it, adding even more voices to an already heated debate. The performers' union, the National Organization for Women (NOW), and Fund Her have each sent letters to Newsom, all of which were obtained by The Verge and are published here for the first time.

The letters from SAG-AFTRA, NOW, and Fund Her underscore concerns about AI's potential to cause catastrophic harm if the technology is left unregulated. SAG-AFTRA's letter explains SB 1047's mandate that developers test for and protect against AI-enabled disasters, such as cyberattacks on critical infrastructure or the development of bioweapons. NOW and Fund Her cite dire warnings from people at the forefront of AI and discuss the technology's potentially disproportionate impact on vulnerable groups.

SAG-AFTRA posted a call for support on X yesterday to its 160,000 members, which include stars like Scarlett Johansson and Tom Hanks. NOW, the largest feminist organization in the U.S. with about 500,000 members, said it was motivated by expert testimony about “how dangerous this incredible technology can be if not developed and used responsibly.” Fund Her, a PAC that helped elect 12 progressive women to prominent positions in California in 2022, writes of the “race to develop the first independently thinking AI,” at which point “it will be too late to put in place safety guardrails.”

SAG-AFTRA and NOW are the latest influential players to weigh in on the California bill, which has become the subject of extraordinary national interest and has crossed traditional partisan lines.

SB 1047, authored by Senator Scott Wiener, would be the most significant AI safety law in the U.S. It would impose civil liability on developers of next-generation AI models like those behind ChatGPT if their models cause disasters and the developers failed to take proper safety precautions. The bill also provides whistleblower protections for employees of AI companies and has the support of OpenAI whistleblowers Daniel Kokotajlo and William Saunders.

“The AI safety standards set by California will change the world”

NOW writes in its letter that “the AI safety standards set by California will change the world,” a view shared by Dan Hendrycks, director of the Center for AI Safety, the bill's co-sponsor. Hendrycks tells The Verge that SB 1047 could be Newsom's “Pat Brown moment,” referring to the then-California governor's signing of a landmark emissions law in 1966. He cites the so-called California effect: “When California leads the way on major regulations, the rest of the country follows.”

The bill passed both houses of the state legislature by large majorities in late August and now awaits Governor Newsom's decision, which is expected by September 30. The governor's office said it “does not normally comment on pending bills. This measure will be evaluated on its merits.”

That comment notwithstanding, the fate of SB 1047 may ultimately depend on political calculations—a fact that both sides appear to acknowledge as they court support in the bill's final hours.

The strange political coalitions that have formed in the fight over SB 1047 suggest a turbulent future for AI policy. Billionaire Elon Musk supports the bill alongside social groups and unions, while former House Speaker Nancy Pelosi, progressive Congressman Ro Khanna, Trump-supporting venture capitalist Marc Andreessen and AI “godmother” Fei-Fei Li all oppose it.

AI is a rare issue on which the political camps are not yet clearly drawn. As the technology grows in importance, the debate over its regulation is likely to intensify and further scramble the usual alliances.

These latest letters join a wave of support for the bill from organizations such as the nearly 2 million-member SEIU and the Latino Community Foundation.

SAG-AFTRA has been home to some of the most organic anti-AI sentiment, as many film actors see generative AI as an existential threat to their livelihoods. The technology's use was a major sticking point in the 2023 actors' strike, which ended with studios being required to obtain performers' informed consent before creating digital replicas of them (and to compensate the actors for their use).

“SAG-AFTRA knows all too well the potential dangers that AI brings”

The union's letter states, “SAG-AFTRA knows all too well the potential dangers that AI brings.” It points to harms members have already experienced, including nonconsensual deepfake pornography and the theft of performers' likenesses, concluding that “policymakers have a responsibility to step in and protect our members and the public. SB 1047 is a measured first step to get us there.”

In a phone interview, the organization's president, Christian Nunes, said NOW got involved because the group is concerned about how unregulated AI could affect women. She and NOW have previously backed efforts to ban nonconsensual deepfakes.

In the NOW letter, Nunes writes that the dangers AI experts warn of would “disproportionately affect vulnerable groups, including women.” She highlights Newsom's “courageous support for us in the face of intense lobbying pressure” on reproductive rights, equal pay, and paid family leave, calling that support “one of the reasons women are voting for [him] over and over again.”

Although SB 1047 does not explicitly address these groups' more central concerns, the organizations seem to see strategic value in joining the coalition behind it. Nunes told The Verge she sees the bill as part of a broader project of holding big tech companies accountable.

This support for SB 1047 complements other pending AI legislation that more directly addresses the specific issues facing these groups. For example, the federal NO FAKES Act aims to combat deepfakes, while another AI bill on Newsom's desk, backed by SAG-AFTRA, would regulate the use of digital replicas. By supporting SB 1047 alongside these more targeted initiatives, these organizations appear to be taking a comprehensive approach to AI governance.

Both the NOW and Fund Her letters draw parallels between unregulated AI and the history of social media. Fund Her founder and president Valerie McGinty writes to The Verge: “We've seen the incredible damage social media has done to our children and how difficult it is to undo it. If Governor Newsom signs SB 1047 into law, we won't be stuck in that mess again.”

It's unclear whether the letters will be enough to counter the powerful forces opposing the bill. While Wiener and other supporters describe it as a light-touch, “common sense” measure, the industry at large is furious.

The U.S. currently governs AI almost exclusively through self-regulation and non-binding voluntary commitments, and the industry wants it to stay that way. As the first U.S. AI safety regulation with real teeth, SB 1047 would set a strong precedent, which likely motivates both these letters and the fierce industry opposition.

Google, Meta and OpenAI have taken the unusual step of writing their own letters opposing the bill. Resistance from AI investors has been even fiercer. Renowned startup incubator Y Combinator (YC) and venture fund Andreessen Horowitz (a16z) have tried every means possible to defeat SB 1047. These and other prominent opponents warn that the bill could trigger an exodus from California, cede U.S. leadership in AI to China and destroy the open source community.

Of course, supporters dispute each of these arguments. In a July letter responding to YC and a16z's claims, Wiener points out that SB 1047 would apply to any covered AI company doing business in California, the world's AI hub and fifth-largest economy. Dario Amodei, CEO of the leading AI company Anthropic, which later became a de facto backer of SB 1047, dismissed the threatened exodus as “pure theater” (though OpenAI, Meta, and Google have also invoked it).

Nancy Pelosi called the bill “well-intentioned but ill-informed”

In her statement opposing the bill, Pelosi called it “well-intentioned but ill-informed.” In a phone interview, Wiener said, “I have tremendous respect for the Speaker Emerita. She is the GOAT,” but called Pelosi's statement “unfortunate” and noted that “some of the world's leading pioneers in machine learning support the bill,” citing the support of deep learning “godfathers” Geoffrey Hinton and Yoshua Bengio. Wiener also points to a supportive open letter published Monday by over 100 employees and alumni of leading AI companies.

To evaluate SB 1047 on its merits, the most compelling letter is probably Anthropic's, which broke from its competitors in writing that the “benefits of the revised legislation likely outweigh the costs.” That letter followed a series of amendments made in direct response to the company's earlier complaints. Anthropic's Claude family of chatbots is world-leading by some metrics, and the company will likely be one of the few AI developers directly affected by the law in the near future.

With congressional leaders vowing to block key federal AI regulations while also opposing SB 1047, California could go it alone, as it has on net neutrality and privacy. As NOW's Nunes writes, the “AI safety standards California sets will change the world,” giving Governor Newsom a chance to make history and offer a model of “balanced AI leadership.”

Fund Her's McGinty summed up the supporters' stance in an email to The Verge: “We should listen to these experts who care about our well-being rather than to the executives of big tech companies who are cutting corners on AI safety.”

As the September 30 deadline approaches, all eyes are on Governor Newsom to see how he will shape the future of AI governance in California and beyond. “My experience with Gavin Newsom, whether you agree or not, is that he makes thoughtful decisions based on what he believes is best for the state,” Wiener says. “I've always appreciated that about him.”

Correction: An earlier version of this article cited deep learning “godfather” Yann LeCun as a supporter of SB 1047. LeCun opposes the bill. We regret the error.

