Scarlett Johansson called for legislation last month to outlaw AI deepfakes, after OpenAI released a voice chat demo that sounded eerily like her.
Now, one of the AI industry’s leading trade groups is joining that call — urging Congress to pass legislation that would protect artists from the unauthorized use of their likenesses.
Microsoft, which has invested $13 billion in OpenAI, is a key member of the BSA Software Alliance. In a policy statement issued on Monday, the trade group called for the creation of a new federal right to prevent misuse of digital replicas.
“Artists should have the right to prevent the unauthorized commercial dissemination of any digital replica that is so realistic that a reasonable observer would believe it is the actual artist’s name, image, likeness or voice,” the group said in a statement.
Congress is considering a range of AI regulations, though observers have been skeptical that anything will pass any time soon.
SAG-AFTRA, the actors’ union, has rallied support for the No Fakes Act, which would make it illegal to produce or distribute unauthorized digital replicas. Sen. Chris Coons, D-Del., one of the authors, is expected to release an updated version in the coming days.
BSA, which also represents companies like Adobe and Oracle, is pushing for a narrower approach, which would encourage takedowns of digital replicas while exempting platforms from legal liability for users’ conduct.
The trade group also wants Congress to outlaw software tools whose “primary purpose” is creating unauthorized fakes, without impeding AI technologies that have legitimate and beneficial uses.
“There are going to be some bad actors who pop up and do things they shouldn’t do,” said Aaron Cooper, vice president of global policy at BSA. “That’s bad for the reputation of the entire industry… The more trust we can add to the system, the better for everybody.”
In the absence of congressional action, most of the actual lawmaking is happening at the state level. Tennessee adopted a law in March — the ELVIS Act — that makes it illegal to use AI to clone a person’s voice without their approval. The law includes criminal penalties.
The New York legislature passed a bill last week that requires informed consent in contracts to create AI replicas. California is also considering similar bills.
The Motion Picture Association, which represents the major entertainment studios, has pushed back on some of that legislation, arguing that it inadvertently infringes on the First Amendment by outlawing legitimate parody or recreations of historical figures. Likewise, BSA is calling for exceptions for things that have long been protected in the copyright context, such as comment, criticism and scholarship.
From the industry's standpoint, federal legislation would help avoid a "patchwork" of state laws on digital replicas that might conflict with one another.
“It’s just better and more efficient across the board, whether you’re an artist or someone who’s looking to create things, to have one harmonized system,” Cooper said.
Congress is also considering narrower legislation that would outlaw pornographic deepfakes and prevent the use of AI deepfakes to influence elections.
When explicit fakes of Taylor Swift circulated online in January, leaders on all sides of the issue called for a quick response.
“We have to act,” said Microsoft CEO Satya Nadella at the time. “I think we all benefit when the online world is a safe world.”