In a move that surely has AI developers sobbing into their keyboards, California Governor Gavin Newsom signed new legislation to protect Hollywood actors from becoming AI clones. The bills, strongly supported by SAG-AFTRA, require consent from performers or their estates before AI can replicate their faces, voices, or questionable acting choices.
Newsom made the announcement with his signature flair, promising to stop the rise of AI actors. “No one will turn Arnold Schwarzenegger into the next Terminator sequel without asking first,” he declared. Of course, this protection doesn’t apply to Marvel movies, where digital actors remain the standard.
Protecting Hollywood from AI Clones
The laws target digital replicas of actors, making it illegal to use AI-generated likenesses without consent. Actors from across Hollywood, many still figuring out what AI means, rallied behind the bill. For them, it’s a win in the ongoing battle to remain human in a world slowly overtaken by flawless, AI-powered performers.
According to SAG-AFTRA, the legislation ensures that even the estates of deceased actors control their digital resurrection. “Marilyn Monroe’s hologram won’t be starring in another Fast & Furious film anytime soon,” said a Hollywood insider. However, the rules don’t stop studios from thinking about it.
Newsom Targets Deepfakes in Elections
Newsom’s legislation doesn’t stop at Hollywood. He also targeted deepfakes in election campaigns. Under the new laws, platforms must remove or label deceptive, AI-altered content during elections, and campaign ads must disclose when AI-generated content is used, because apparently, we’re not ready to see politicians act like humans on their own.
“Why should voters be fooled by a computer program making politicians sound smart?” Newsom joked. “Let’s keep the chaos real, people.”
The bill aims to prevent deepfakes from misleading the public during elections. The legislation seems unlikely, however, to stop AI from making sure everyone looks good in selfies.
Hollywood’s Reaction: AI-Free Bad Acting Still Safe
While SAG-AFTRA celebrates the victory, studio executives are already grumbling about the limits on AI use. “Sure, we’ll protect their likenesses. But if you think we’re giving up on flawless CGI, think again,” said one anonymous executive. “The next blockbuster can still feature real actors, with a little AI help to make them look better.”
Actors, meanwhile, are breathing a sigh of relief, knowing their jobs are safe—unless they forget their lines.
The Deepfake Debate: Will This Law Change Anything?
Social media responded quickly, with users alternately mocking the law and celebrating its potential to bring “honesty” back to election campaigns. “Finally, AI won’t trick us into thinking a politician is competent!” one Twitter user joked.
But skeptics questioned the effectiveness of the new rules. “Sure, they’ll label deepfakes. But what about the real politicians who already sound like robots?” tweeted another.
The new legislation might change how election campaigns use technology, but for now, it’s clear deepfakes won’t be disappearing entirely. They just need to disclose themselves more—because that’s how deception works.
What’s Next?
California is leading the charge against AI misuse, but the entertainment industry and political campaigns will likely continue pushing boundaries. Newsom remains confident the laws will protect actors and voters from technology gone rogue. “We’re paving the way for a future where real people—not robots—run Hollywood and our elections,” he promised.
For now, AI-generated actors are out of luck. But in the battle between human actors and AI clones, the sequel has just begun.
Key Takeaways
- Governor Gavin Newsom signed a law requiring consent for AI-generated replicas of actors and performers, protecting their digital likenesses.
- Election campaigns will now have to label or remove deepfake content, making it harder to mislead voters during election cycles.
- The entertainment industry and election campaigns are preparing for new rules, but AI continues to push boundaries.