Lainey Wilson Calls For AI Legislation At House Judiciary’s Los Angeles Field Hearing, But Law Professor Warns Of Unintended Consequences
Singer-songwriter Lainey Wilson called for passage of proposed legislation that would curb the use of unauthorized AI deepfakes and voice clones, and she also told lawmakers that “we need to keep humanity in art. We cannot lose that.”
Wilson was among the witnesses testifying at a Los Angeles field hearing of a House Judiciary subcommittee today, which featured strong endorsements of the proposed bill as well as a stark warning against one of its provisions.
“My art is uniquely and literally me, my name, my likeness, my voice,” Wilson said. “I do not have to tell you how much of a gut punch it is to have your likeness or your voice ripped from you and used in ways that you could never imagine or would never allow. It is wrong, plain and simple.”
She added, “There aren’t many things that we can control in life, but making decisions about the use of our own selves, our own unique qualities, that should be one.”
As an example, she said that she has spotted herself online in unauthorized ads in which she is selling weight loss gummies. “I would never in a million years ever do anything like that. But at the end of the day, people say like, ‘I got to see it to believe it.’ Well they’re seeing it and they’re believing it, and folks that I am super close to believe it at times too. And it’s really, really scary when it gets to that point.”
Wilson said that she was representing the Human Artistry Campaign, which placed an ad in USA Today signed by more than 275 performers and other artists calling for passage of the bill.
The legislation would give individuals more control over the use of their identifying characteristics in digital replicas. It affirms that every person has a “property right in their own likeness and voice,” a right that does not expire upon a person’s death and can be transferred to heirs or designees for a period of 10 years after the individual’s passing. It sets damages at $50,000 per violation for an unauthorized personalized cloning service, or the actual damages suffered plus profits from the use. Damages are set at $5,000 per violation for unauthorized publication, performance, distribution or transmission of a digital voice replica or digital depiction, or the actual damages.
Christopher Mohr, president of the Software and Information Industry Association, told the committee that “many identity-based harms are already covered by different doctrines created by federal and state statutes, and these doctrines already apply to generative AI uses.” He cited the Lanham Act, which restricts false celebrity endorsement.
He also said that it was “important to consider the limits of the First Amendment. Statutory rights that regulate identity will be reviewed by courts as a form of speech regulation. In order to survive the heightened scrutiny that such a statute would receive, it is critical that it both be tailored to remedy a specific harm and contain sufficient breathing space for expressive works and other kinds of protected speech.”
The House bill includes a provision for a First Amendment defense, including whether the likeness is “necessary for and relevant to the primary expressive purpose of the work in which the use appears.” That includes whether the work is transformative, and whether it is “constitutionally protected commentary on a matter of public concern.” Those are factors that courts have considered in a “fair use” defense in the unauthorized use of copyrighted material.
The Motion Picture Association also has expressed First Amendment concerns. A spokesperson said last month, when the House bill was introduced, that “any legislation must protect the ability of the MPA’s members and other creators to use digital replicas in contexts that are fully protected by the First Amendment.” The trade association also noted that the recent SAG-AFTRA contract includes “rights to informed consent and compensation for use of their digital replicas.”
Harvey Mason Jr., president and CEO of the Recording Academy, told lawmakers that AI is “the latest example of a tool that can expand opportunities for different voices to follow their passion and create new music,” and that it also has uses outside the creative process. For Sunday’s Grammy Awards, the Recording Academy has partnered with IBM to create customized content before and during the ceremony.
But he stressed that the Recording Academy will be involved in everything that is produced for the show, with the editorial team using AI to “enhance and expand the work, not replace it.”
“We understand that AI is here and it’s not going anywhere,” he said. “But our award guidelines stay true to our mission, to honor the people behind the music. Only human creators are eligible to win a Grammy Award.”
Still, Mason noted that a “patchwork of state laws” addresses the right of publicity for individuals, and that some states have no such laws at all.
The No AI Fraud Act, he said, “establishes in federal law that an individual has a personal property right in the use of their image and voice. That’s just common sense and it’s long overdue.”
Senate lawmakers this week proposed legislation that would set criminal penalties for the spread of non-consensual, AI-generated sexual images of individuals. That followed the furor over explicit AI images of Taylor Swift that circulated on social media. Some of the lawmakers cited the Swift images at the hearing, which was chaired by Rep. Darrell Issa (R-CA).
Another witness at the hearing, University of Pennsylvania Law School Professor Jennifer Rothman, warned that the proposed No AI Fraud Act, as well as a companion bill in the Senate, may have unintended consequences because they allow a person’s right of publicity to be transferred to another person or entity. That could be especially egregious, she said, in situations where younger performers, lacking leverage in negotiations, sign away their rights.
“Imagine a world in which Taylor Swift’s first record label obtained rights in perpetuity to young Swift’s voice and likeness,” she said. “The label can then replicate Swift’s voice over and over in new songs that she never wrote, and have AI renditions of her perform and endorse the songs and videos, and even have holograms perform them on tour.” She said that under the proposed legislation, “the label would be able to sue Swift herself for violating her own right of publicity if she used her voice and likeness to write and record new songs and publicly perform them. This is the topsy-turvy world that the two draft bills would create.”
Your favorite artists, actors, & performers agree – we all deserve the right to protect our voice & likeness from AI generated deepfakes and clones. Join us and support the No AI Fraud Act https://t.co/8JLGILsFEy pic.twitter.com/lQdAaSVhjL
— Human Artistry Campaign (@human_artistry) February 2, 2024