A new AI deepfake bill introduced by a bipartisan group of senators is bringing together actors, studios and tech companies.
The No Fakes Act, led by Democratic Sen. Chris Coons of Delaware, is a revised version of an earlier discussion draft introduced last fall that targeted digital fakes and sought to protect the likenesses of actors (and average citizens).
“Game over, AI scammers! Enshrining protections against unauthorized digital replications as a federal intellectual property right will keep us all protected in this brave new world,” SAG-AFTRA President Fran Drescher said in a statement posted on the union's website. “Especially for performers whose livelihoods depend on their image and brand, this step forward is a huge win.”
She thanked Senator Coons, as well as other supporters of the bill, including Senators Marsha Blackburn (R-TN), Amy Klobuchar (D-MN), and Thom Tillis (R-NC).
Duncan Crabtree-Ireland, SAG-AFTRA's national executive director and chief negotiator, told Fox News Digital: “I think it was always Senator Coons' view, for example, and it was certainly our view that all of the key stakeholders in the process should be consulted before the bill was formally introduced, because it's so difficult to get legislation passed in Washington, especially right now. And we thought that if all of the concerns and issues could be truly heard, then we would have the best chance of getting something enacted. And from our perspective, this is absolutely crucial. The time is now and it's desperately needed.”
The Motion Picture Association, which represents several major studios, including Netflix, Sony, Paramount, Universal, Disney and Warner Bros., also praised the bill.
“We support protecting artists from abuse of generative AI, and this bill carefully establishes federal protections against harmful uses of digital replicas while respecting First Amendment rights and creative freedoms,” MPA President and CEO Charles Rivkin said in a statement on the organization's website.
The MPA had initially been hesitant about the wording of the original bill. In a statement last year, when the discussion draft was introduced, the organization said it hoped to work with senators on the legislation “without infringing on the First Amendment rights and creative freedoms our industry relies on.”
Crabtree-Ireland said: “I think the MPA and the RIAA had been in agreement from the beginning… I think it was a combination of all those factors that led to what I consider to be an unprecedented level of support for any legislation affecting the entertainment industry.”
Artificial intelligence expert Marva Bailer said tech companies like OpenAI and IBM also have an interest in supporting the bill.
WATCH: AI EXPERT EXPLAINS WHY TECH COMPANIES ARE ON BOARD WITH HOLLYWOOD-BACKED AI LEGISLATION
“What might surprise some people is that tech companies, along with film organizations, professional associations, and creators, are actually in favor of this bill,” she told Fox News Digital. “So why would OpenAI or Disney or IBM's watsonx be interested? Well, it's because it's going to put some barriers around the established market. And what's happening with these deepfakes is that people are creating a surrogate market. And this surrogate market has no rules and no monetization.”
Coons' website summarizes the bill, explaining that it would “hold individuals or companies liable for harm caused by producing, hosting, or sharing a digital replica of a person acting in an audiovisual work, image, or sound recording in which such person never appeared or otherwise approved, including digital replicas created by generative artificial intelligence (AI).”
Crabtree-Ireland added: “I mean, having SAG-AFTRA, the MPA, the RIAA, OpenAI, all supporting a piece of legislation that is relevant to the entertainment industry, but well beyond the entertainment industry, of course, is something to be noted.”
The proposed penalties include a $5,000 fine, plus damages, and removal of the digital replica. Civil action may also be brought against the offender, who may be liable for $5,000 for each work containing the unauthorized replica if the offender is an online service, and $25,000 per work if the offender is not an online service, such as a studio.
“I think the most important thing is that something gets passed, because this is an issue that is affecting people right now and it’s very real,” Crabtree-Ireland said. “And I’ve spoken to dozens of our members who have been personally affected by this. I myself have been personally affected by this. Last year, during the ratification process for our contract, I was deepfaked. At the end of the film and TV strike last year, someone made a video of me saying false things about the contract, urging people to vote against the contract that I had negotiated. It was posted on social media platforms like Instagram, tens of thousands of people saw it, and there was no way to unring the bell. So I think there’s a need that’s certainly being faced by our members and also people far beyond.”
WATCH: LEGAL EXPERT EXPLAINS WHAT CHANGED IN THE DEEPFAKE BILL TO BRING STUDIOS ON BOARD
Rosenberg cited the recent case of a Maryland school principal who was allegedly framed by an AI deepfake that made it appear he had made racist comments, and stressed that the bill is “not limited to just celebrities.”
“Therefore, you don't necessarily have to prove that someone has a commercial value in their voice or a commercial value in their image to be covered by this law.”
He added: “The stories we hear about deepfakes are certainly bad and they affect people. But I agree that there are a lot of good things. And by having these safe harbor-type provisions, it allows the technology to continue to develop and grow.”
Bailer felt similarly, saying the bill's importance lies in establishing “transparency.”
“So nobody's saying, 'Oh, AI's not going to happen. We need to stop it.' They want to understand the transparency of where AI is being used and they want the permissions. And what we really need to look at is the surrogate market. And what that means is we're seeing these real brands licensing through contracts their image and likeness to have the Elvis experience, the Kiss experience, and the ABBA experience. And it's very exciting.”
While protections for writers and actors have been put in place following last year’s strikes, SAG-AFTRA is still grappling with the impact of AI on other forms of entertainment.
After more than a year and a half of negotiations, the union is currently on strike on behalf of its members who perform in video games.
“While agreements have been reached on many issues important to SAG-AFTRA members, the employers refuse to plainly affirm, in clear and enforceable language, that they will protect all performers covered by this contract in their AI language,” SAG-AFTRA's website states.
WATCH: SAG-AFTRA REPRESENTATIVE ON WHY LAST YEAR'S “DEVASTATING” STRIKES IN HOLLYWOOD WERE “NECESSARY”
Regarding the current strike, Crabtree-Ireland said he hopes the No Fakes Act, if passed, will add to what he calls a “patchwork of protection.”
Reflecting on the previous strike, which shut down Hollywood for nearly six months last year, Crabtree-Ireland said: “Our members suffered. Other industry workers suffered. The industry suffered. It was necessary at the time. I wish it hadn’t been that way. I mean, to me, when I look at the final agreement, I feel like the companies could have reached this agreement with us on July 12, and this whole thing could have been avoided, and yet they refused. And that’s very frustrating. On the other hand, it was essential that we got ahead of the implementation of AI. If we tried to negotiate this after the industry had already started using it on a large scale, it would be impossible to put the genie back in the bottle. And so I feel very good that we successfully anticipated this challenge.”
He added: “It's an existential battle and that's why we're fighting it right now with the game companies. Because if we wait three years, it'll be too late. This will have gone too far and we won't be able to turn back. So this is a fight for the future of our members' careers and something even more fundamental.”