5/21/2025
The Senate Judiciary Committee held a hearing on Wednesday on AI-generated deepfakes.
Transcript
00:00:00The Senate Judiciary Subcommittee on Privacy, Technology, and the Law will come to order.
00:00:05And I want to thank all of you for being here with us this afternoon.
00:00:10As you can see, everyone is curious about AI.
00:00:15And today, we are going to talk about AI and how this affects our content creators, our
00:00:22creative community, our children, and how it affects all Americans when it comes to
00:00:29the digital space and what happens with AI-generated deepfakes.
00:00:37This hearing is titled, The Good, The Bad, and The Ugly.
00:00:43And it is titled in that regard for a specific reason.
00:00:47Now, in Tennessee, we talk about how there is a lot of good that has come from AI
00:00:54when you're talking about logistics, advanced manufacturing, healthcare, and cutting-edge research.
00:01:02And we've even seen the amazing role that AI has played in giving voice to some, Randy
00:01:10Travis, to be specific, who joined us here on the Hill recently in introducing the No Fakes bill.
00:01:19It gave Randy Travis the ability to share his talent with the world once again.
00:01:26And despite some of these benefits, there are really some bad and unpleasant sides to
00:01:33AI, and specifically, when it comes to AI-generated deepfakes.
00:01:39These deepfakes cause tremendous harm.
00:01:43And today, we're going to examine those harms and the legislative solutions, including the
00:01:50No Fakes Act that Senators Coons, Klobuchar, Tillis, and I have introduced.
00:01:57We've introduced them specifically to address these harms.
00:02:01First, these deepfakes pose significant harm to our content creators.
00:02:08From Music Row to Beale Street, back over to the Smoky Mountains in Upper East Tennessee,
00:02:15Tennesseans have made their mark in the music world, and we've got one of those artists
00:02:22with us today.
00:02:24But the proliferation of these digital replicas created without the artist's consent poses
00:02:31a real threat to their livelihoods and the livelihoods of all American artists and creators.
00:02:40The No Fakes Act is a monumental step forward in protecting our creative community.
00:02:47It provides landmark protection of the voice and visual likenesses of all individuals and
00:02:54creators from the spread of these digital replicas that are created without their consent.
00:03:03And I'm looking forward to speaking with our witnesses about this critical bill and how
00:03:09impactful it will be for the creative community.
00:03:14And I've got to be clear, our efforts must protect all Americans from the harms of deepfakes,
00:03:22and that includes our precious children.
00:03:26In recent years, we've seen a deeply troubling spike in the use of generative AI to create
00:03:33sexually explicit deepfake content.
00:03:37Just as concerning, NCMEC saw a, get this number, 1,325% increase from 2023 to 2024
00:03:50in reports involving generative AI.
00:03:54We've got to do something about that, and both the No Fakes Act and the Take It Down Act,
00:04:01which President Trump just signed into law this week, go a long way to providing greater
00:04:07protections for our children from these deepfakes.
00:04:11These deepfakes have also served as a powerful tool for fraud.
00:04:17In one example, scammers used AI-generated images and voices of a multinational firm's
00:04:24CEO to steal millions of dollars.
00:04:29We've also seen celebrities' likenesses misappropriated in false product endorsements.
00:04:36It's clear that Congress has to act, and that's why the three of us sitting right here on
00:04:43this dais have joined forces, plus Senator Tillis, who's going to get here in a little
00:04:50bit, to work on the No Fakes Act and get it to President Trump's desk this year.
00:04:57We know that the creative community, all these content creators, our children, and all Americans
00:05:04deserve nothing less than our best efforts on this issue.
00:05:10And I turn to Senator Klobuchar for her opening statement.
00:05:13Well, thank you very much, Senator Blackburn.
00:05:16I'm very excited about this subcommittee and the work we've already done together for years
00:05:23on this issue and similar issues when it comes to tech.
00:05:28I share your hopes for AI and see that we're on this cusp of amazing advancements if this
00:05:34is harnessed in the right way.
00:05:36But I'm also concerned if things go the wrong way.
00:05:41I think it was Jim Brooks, a columnist, that said he has trouble writing about it because
00:05:45he doesn't know if it will take us to heaven or hell.
00:05:48So it's our job to head to heaven, and it's our job to put some rules in place, and this
00:05:54is certainly one of them.
00:05:56We want this to work for children, for consumers, for artists, and not against them.
00:06:04And you brought up the example, Chair, of Randy Travis, who was at the event that we
00:06:11recently had with you and Senator Coons and myself about the bill and how he used AI in
00:06:18such a positive way.
00:06:20But then we know there are these risks.
00:06:22And one of the things that I think is really exciting about this week is that, in fact,
00:06:28on Monday, the President signed my bill with Senator Cruz, the Take It Down Act, into law.
00:06:35This was a bill I discussed with him and the First Lady at the inaugural lunch.
00:06:39It's an example of use every moment you have to advance a cause.
00:06:45And then she supported the bill and helped to get it passed in the House.
00:06:49Senator Cruz and I had already passed it in the Senate, and we were having some trouble
00:06:54getting it done over in the House.
00:06:55So we're really pleased, because it actually does set some track moving forward, even though
00:07:03that bill is about non-consensual porn, both AI-created and non-AI-created.
00:07:10It's had huge harmful effects, about 20-some suicides a year of young kids who think they're
00:07:16sending a picture innocently to a girlfriend or a potential boyfriend, and then it gets
00:07:22sent out on their school internet.
00:07:25It gets sent out to people they know, and basically they believe their life is in ruins
00:07:30and don't have any other context and take their own lives.
00:07:34And that's just the most obvious and frightful part of this, but there's others as well.
00:07:39So I'm hoping this is going to be a first step to some of the work that we can do, including
00:07:45with the bill that we're going to be discussing today.
00:07:49So AI-enabled scams have become far too common.
00:07:53We know that.
00:07:54It takes only a few seconds of audio to clone a voice.
00:07:59Criminals can pull the audio sample and personal backstory from public sources.
00:08:05Just last week, the FBI was forced to put out an alert about scams using AI-cloned voices
00:08:10of FBI agents and officials asking people for sensitive payment information.
00:08:16Jamie Lee Curtis was forced to make a public appeal to Mark Zuckerberg to take down an
00:08:20unauthorized deepfake ad that included her digital replica endorsing a dental product.
00:06:27While Meta removed the ad after her direct outreach, most people don't have that kind
00:08:33of influence.
00:08:35We also need rules of the road to ensure that AI technologies empower artists and creators
00:08:40and not undermine them.
00:08:43Art doesn't just entertain us.
00:08:45It's something that uplifts us and brings us together.
00:08:49And I recently met with Corey Wong, a Grammy-nominated artist from Minnesota.
00:08:54He talked about how unauthorized digital replicas threaten artists' livelihoods and undermine
00:08:58their ability to create art.
00:09:01So this is not just a personal issue.
00:09:03It's also an economic issue.
00:09:05One of the reasons is that one of our best exports to the world is music and movies.
00:09:11When you look at the numbers and how we've been able to captivate people around the world,
00:09:16that's going to go away if people can just copy everything that we do.
00:09:20And one of the keys to our success as a nation in innovation has been the fact, and Senator
00:09:25Coons does a lot of work in this area, that we've been able to respect copyrights and patents
00:09:30and people's right to own their own products.
00:09:34So that's why this No Fakes Act is so important.
00:09:37It protects people from having their voice and likeness replicated using AI without their
00:09:43permission, all within the framework of the Constitution.
00:09:47And it protects everybody because everyone should have a right to privacy.
00:09:53I also am working in the space on AI to put some base rules in place in my role on the
00:09:58Commerce Committee.
00:09:59Senator Thune and I have a bill that we're reintroducing on this to set some rules for
00:10:04NIST to be able to put out there for companies that are using AI.
00:10:10And then I'm always concerned about its effect on democracy.
00:10:13But that is for a different day and in a different committee.
00:10:18But I do want to thank Senator Blackburn for her willingness to come out on doing something
00:10:23about tech, including the work she does with Senator Blumenthal, the work that we've done
00:10:28together on commerce.
00:10:30And if Monday is any sign, with the first bill getting through and that Rose
00:10:36Garden signing ceremony, there's more to come.
00:10:39And so thank you and look forward to hearing from the witnesses.
00:10:43Thank you, Senator Klobuchar.
00:10:45Senator Coons, you're recognized.
00:10:47Thank you so much, Chair Blackburn, Ranking Member Klobuchar.
00:10:50It is a delight to work with you.
00:10:52And thank you for inviting me to give some brief opening remarks about the No Fakes bill.
00:10:57Because of you and Senator Tillis working on this together since 2023, we have made
00:11:04real progress.
00:11:06There is momentum with this bill.
00:11:07We've been adding co-sponsors.
00:11:09My thanks to Senators Durbin and Hagerty, Schiff and Cassidy.
00:11:13We're adding organizations that are endorsing it, like YouTube and RAINN.
00:11:18And as we saw at the White House on Monday, if there's bipartisan agreement in Congress
00:11:23and support from the White House that action is needed, we can make progress in complex,
00:11:28challenging technical areas.
00:11:30This hearing is a chance to look critically at the current state of the No Fakes bill
00:11:34so we can both build on that momentum and answer the questions, what did we get right?
00:11:38What do we need to tweak?
00:11:39How can we get more co-sponsors and push to a full committee markup?
00:11:43So I'm excited to hear from our witnesses today.
00:11:46There are two other committee hearings going on right now, which is why you will see senators
00:11:49come in and out.
00:11:51Not a lack of interest.
00:11:52Aren't we late?
00:11:54Yes.
00:11:55When we were drafting this bill, its applicability to pillars of the creative community, like
00:12:02Ms. McBride, Martina McBride, or to a movie star like Tom Hanks, its applicability to
00:12:07people who make a living off of their voice or likeness was clear.
00:12:11But Senator Blackburn and I agreed at the outset the rules we were drafting should apply
00:12:15to everyone.
00:12:16Everyone should have the power to control their digital replica online, not just those
00:12:20who are superstars.
00:12:21So I appreciate, Chair Blackburn, that the witnesses you brought together today speak to the full
00:12:25scope of what this bill can do to keep the public safe from scams,
00:12:30just like the bill Senator Klobuchar just got signed into law, and help wipe nonconsensual
00:12:34deepfake pornography off the Internet.
00:12:37Second, the revised draft we introduced last month was the product of stakeholders negotiating
00:12:42in good faith.
00:12:43Ms. Carlos, you and YouTube came to the table with the intention of getting to yes, and
00:12:48we got there.
00:12:49And if Google can get behind this bill, can handle the obligations that No Fakes imposes,
00:12:54so can the other tech platforms.
00:12:57And I look forward to hearing from you and returning to questions.
00:13:01Thank you, Senator Coons.
00:13:02I would like to introduce our witnesses.
00:13:05Martina McBride is a Nashville-based singer-songwriter who has sold more than 23 million albums worldwide
00:13:15with six singles hitting number one on the country music chart.
00:13:20In addition to her 14 Grammy Award nominations, Ms. McBride is a four-time Country Music Association
00:13:27Female Vocalist of the Year, a three-time Academy of Country Music Top Female Vocalist,
00:13:34and a member of the Grand Ole Opry.
00:13:37She first signed to RCA Records in 1991, and has since been awarded 14 gold records, 12
00:13:45platinum honors, three double platinum records, and two triple platinum awards.
00:13:54Mitch Glazier is the CEO and Chairman of the Recording Industry Association of America.
00:14:00We use the acronym RIAA.
00:14:03He helps to represent the rights and interests of over 1,600 member labels.
00:14:09Prior to joining RIAA, Mr. Glazier served as Chief Counsel for Intellectual Property
00:14:16to the U.S. House of Representatives Judiciary Committee, as well as numerous other roles
00:14:22in and around government, including as a commercial litigation associate.
00:14:28He earned his bachelor's degree from Northwestern University and his JD from Vanderbilt School of Law.
00:14:35Our next witness is Kristen Price.
00:14:38Ms. Price serves as Senior Legal Counsel for the National Center on Sexual Exploitation,
00:14:45NCOSE, correct?
00:14:48And she works to combat all forms of sexual exploitation and advocate for justice for
00:14:56survivors of sex trafficking, child sexual abuse, pornography, and prostitution.
00:15:03Before her work at NCOSE, Ms. Price served as Legal Counsel at the Alliance Defending
00:15:08Freedom, where she specialized in First Amendment law and conscience protections.
00:15:15Ms. Price earned her bachelor's degree from Cedarville University and her JD from Georgetown
00:15:22University Law Center.
00:15:25And Mr. Justin Brookman.
00:15:28Mr. Brookman is the Director of Technology Policy for Consumer Reports, where he specializes
00:15:35in data privacy and security issues.
00:15:38Before joining Consumer Reports, he was Policy Director of the Federal Trade Commission Office
00:15:44of Technology Research and Investigation.
00:15:47Earlier in his career, he served as Chief of the Internet Bureau of the New York Attorney
00:15:52General's Office.
00:15:54He earned his bachelor's degree from University of Virginia and his JD from New York University
00:16:00School of Law.
00:16:03And Ms. Susana Carlos, who serves as Head of Music Policy at YouTube.
00:16:08Until 2022, she served as Senior Counsel for YouTube's Music Publishing and in senior
00:16:14positions at the American Society of Composers, Authors, and Publishers, which
00:16:20we like to call ASCAP, as well as at Universal Music Group and EMI Publishing.
00:16:26She is also on the board of Digital Media Association, which represents the leading
00:16:31global audio streaming companies and promotes legal access and engagement of music content
00:16:38between creators and users.
00:16:41Ms. Carlos earned her bachelor's at the University of California, Los Angeles, and her JD from
00:16:47Fordham University School of Law.
00:16:50Welcome to each of you.
00:16:51At this time, I want to ask you all to rise and raise your right hands.
00:17:00Do you affirm that the testimony that you're going to give to this committee is the truth,
00:17:05the whole truth, and nothing but the truth, so help you God?
00:17:10And let the record reflect that everyone is in the affirmative.
00:17:14We'll begin with our testimony.
00:17:17Ms. McBride, you are recognized for five minutes and welcome.
00:17:23Chairman Blackburn, Ranking Senator Klobuchar, Senator Coons, and members of the subcommittee,
00:17:28thank you for inviting me to speak about S. 1367, the No Fakes Act of 2025,
00:17:34a landmark effort to protect human voices and likenesses from being cloned by artificial
00:17:39intelligence without consent.
00:17:41I'm so grateful for the care that went into this effort, and I want to thank you and your
00:17:45colleagues for making this issue a priority.
00:17:48I started singing when I was four years old, and my voice is at the center of my art form.
00:17:53Each of my recordings includes a piece of me that is individual and unique.
00:18:01My songs reflect the human experience, and I'm honored that they are a part of people's
00:18:06lives, from wedding vows to breakups to celebrating milestones and even the special relationship
00:18:11between a mother and daughter.
00:18:12But today, my voice and likeness, along with so many others, are at risk.
00:18:17AI technology is amazing and can be used for so many wonderful purposes.
00:18:21But like all great technologies, it can also be abused, in this case, by stealing people's
00:18:25voices and likenesses to scare and defraud families, manipulate the images of young
00:18:30girls in ways that are unconscionable, impersonate government officials, or make phony recordings
00:18:36posing as artists, like me.
00:18:39It's frightening, and it's wrong.
00:18:42Congress just took a very important step forward to deal with sexually explicit deepfake images
00:18:45by passing the Take It Down Act.
00:18:47I want to thank all the leaders, including Senators Cruz, Klobuchar, Blackburn, and many
00:18:51on this committee who worked hard with others to push that bill into law.
00:18:56The No Fakes Act is a perfect complement to that effort by preventing AI deepfakes that
00:19:00steal someone's voice or likeness and use them to harass, bully, and defraud others,
00:19:06or to damage their career, reputation, or values.
00:19:10The No Fakes Act would give each of us the ability to say when and how AI deepfakes of
00:19:15our voices and likenesses can be used.
00:19:18If someone doesn't ask before posting a harmful deepfake, we can have it removed without jumping
00:19:22through unnecessary hoops or going to court.
00:19:25It gives every person the power to say yes or no about how their most personal human
00:19:30attributes are used.
00:19:32It supports AI technology by providing a roadmap for how these powerful tools can be developed
00:19:37in the right way.
00:19:39And it doesn't stand in the way of protected uses like news, parodies, or criticism.
00:19:45I want to thank the technology companies like OpenAI and Google who support this bill, as
00:19:49well as the legions of creators who have worked so hard to advocate for it, and the child
00:19:52protection and anti-sex trafficking and exploitation groups who support it and continue to fight
00:19:57for those who are most vulnerable.
00:19:59In my career, it's been a special honor to record songs that shine a light on the battles
00:20:04that many women fight, especially the terrible battle of domestic violence.
00:20:10Many fans have told me that the song Independence Day has given them strength, and in some cases,
00:20:16the song has been the catalyst that has made them realize that they need to leave an abusive
00:20:21situation.
00:20:23Imagine the harm that an AI deepfake could do, breaching that trust, using my voice in
00:20:30songs that belittle or justify abuse.
00:20:35One of the things I'm most proud of in my career is I've tried to conduct myself with
00:20:40integrity and authenticity.
00:20:43And the thought that my voice could be deepfaked or my likeness could be deepfaked to go against
00:20:48everything that I've built, to go against my character, is just terrifying.
00:20:55And I'm pleading with you to give me the tools to stop that kind of betrayal.
00:21:01Putting America on the right course to develop the world's best AI while preserving the sacred
00:21:05qualities that make our country so special, authenticity, integrity, humanity, and our
00:21:11endlessly inspiring spirit.
00:21:14That's what the No Fakes Act will help to accomplish.
00:21:16I urge you to pass the bill now.
00:21:21We thank you.
00:21:22Mr. Glazier, you're recognized for five minutes.
00:21:25Thank you so much.
00:21:26Thank you for having me.
00:21:27I'm honored to testify today alongside the groundbreaking artist, Martina McBride, who
00:21:32just spoke so eloquently about the value of someone's voice, the value of their image,
00:21:40and the threats posed by abuses of deepfake technology.
00:21:44I'd also like to recognize the almost 400 artists and performers and actors who have
00:21:52just signed a statement in support of the No Fakes Act with some very simple words.
00:21:58It's your voice, your face, your image, your identity.
00:22:05Protect your individuality.
00:22:07That's why we're here.
00:22:08That's what this is all about.
00:22:10Artists' voices and likenesses are fundamental to their work, credibility, expression, and careers.
00:22:18In many ways, these deeply personal, highly valuable attributes are the foundations of
00:22:23the entire music ecosystem.
00:22:26And unauthorized exploitation of them using deepfakes does cause devastating harm.
00:22:31We have to prevent that harm.
00:22:34So my deepest thanks and the thanks of a very grateful music community go out to all of
00:22:40you, to Chairman Blackburn, to Ranking Member Klobuchar, to Senator Coons, and to all of
00:22:46the Senators, Senator Tillis, Hagerty, Durbin, Cassidy, Schiff, and I hope many more on this
00:22:51committee and throughout the Senate for introducing and supporting the No Fakes Act.
00:22:56You did it.
00:22:58After months, actually years, of work with each other, stakeholders, your counterparts
00:23:03in the House, you've been able to build bipartisan, bicameral, broad-based consensus around legislation
00:23:10that will protect not just artists, but all victims of deepfake abuses, including child
00:23:16exploitation and voice clone scams, which we'll hear about from the other witnesses
00:23:20today.
00:23:21You've shaped a common-sense bill that has won the support of AI companies like Google,
00:23:26who's here today, OpenAI, IBM, as well as broadcasters, motion picture studios, child
00:23:33protection groups, free market groups, labor unions, and virtually the entire creative
00:23:37community.
00:23:39That's hard to do.
00:23:41The No Fakes Act provides balanced, yet effective protections for all Americans, while supporting
00:23:46free speech, reducing litigation, and promoting the development of AI technology.
00:23:53It empowers individuals to have unlawful deepfakes removed from UGC platforms as soon as it can
00:23:59be done, without requiring anyone to hire lawyers or go to court in those situations.
00:24:06It contains clear exemptions for uses typically protected by the First Amendment, such as
00:24:11parody, news reporting, and critical commentary.
00:24:14And it encourages AI development and innovation, targeting only malicious applications and
00:24:21setting the stage for the legitimate licensing of rights with real and meaningful consent.
00:24:28No Fakes is the perfect next step to build on after the Take It Down Act.
00:24:34It provides a civil remedy to victims of invasive harms that go beyond the criminal
00:24:38posting of intimate images addressed by that legislation, and protects artists like Martina
00:24:44from nonconsensual deepfakes and voice clones that breach the trust she's built with millions
00:24:49of fans.
00:24:51American music is the most valuable music in the world.
00:24:56We lead in investment, exports, and market power.
00:25:00Music drives the success of other important American industries, including the technology
00:25:05industry, through thriving partnerships.
00:25:08If we signal to the rest of the world that it's acceptable to steal Americans' voices
00:25:13and likenesses, we have the most to lose.
00:25:17Our voices and our music are the most popular and will be taken the most, destabilizing
00:25:23the music economy, our intellectual property system, our national identity, and the very
00:25:28humanity of the individuals who bless us with their genius.
00:25:32The No Fakes Act is a critical step in setting America up as an example, and in continuing
00:25:38and extending its global leadership in innovation and creativity.
00:25:42It shows that we can boost AI development while preserving every individual's autonomy
00:25:48and individual liberties, and protecting our constitutional property rights at the same
00:25:53time.
00:25:55We are really proud to support this legislation, and we vow to help you pass it into law this
00:25:59year.
00:26:00Thank you again.
00:26:01We thank you.
00:26:03Ms. Price, you're recognized for five minutes.
00:26:08Chair Blackburn, Ranking Member Klobuchar, thank you for holding this hearing and addressing
00:26:12this truly urgent matter.
00:26:15My name is Kristen Price, Senior Legal Counsel at the National Center on Sexual Exploitation,
00:26:21a nonpartisan nonprofit dedicated to eradicating all forms of sexual exploitation by exposing
00:26:27the links between them.
00:26:29Our law center represents survivors in lawsuits against those who perpetrate, enable, and
00:26:35profit from sex trafficking, including pornography companies.
00:26:39Contemporary pornography depicts and normalizes violence, including asphyxiation, electrocution,
00:26:46and rape.
00:26:47This is pervasive.
00:26:48The top four sites, Pornhub, XVideos, xHamster, and XNXX, had nearly 60 billion total
00:26:56visits in 2024.
00:26:59One woman's husband sexually assaulted her while she was sleeping and put the video on
00:27:03Xvideos, which was tagged sleeping pills.
00:27:07Pornhub hosts child sexual abuse material and sex trafficking content, with their employees
00:27:12admitting that traffickers use their sites with impunity.
00:27:18Forged or deepfake pornography uses AI that is trained on this kind of abusive content,
00:27:23merging it with the faces of other women and girls.
00:27:26A 2023 report found that deepfake pornography increased by 464% between 2022 and 2023.
00:27:36The top 10 deepfake pornography sites had 300 million video views in 2023.
00:27:4398% of all deepfake videos are pornography related, and 99% of those who are targeted
00:27:50are women.
00:27:51The perpetrators are disproportionately male.
00:27:54One survey found that 74% of deepfake pornography users don't feel guilty about it.
00:28:02A high schooler discovered a boy she'd never met took a photo off of her Instagram, created
00:28:07an AI deepfake, and circulated it through Snapchat.
00:28:11Two years later, she still hasn't been able to remove all the images.
00:28:16A woman whose close family friend made deepfake pornography of her said, my only crime was
00:28:21existing online and sharing photos on platforms like Instagram.
00:28:25The person who did this was not a stranger.
00:28:28I was not hacked, and my social media has never been public.
00:28:33These are serious human rights abuses, violating the person whose face is depicted and the
00:28:38person whose body is shown.
00:28:41Victims report fear, isolation, shame, powerlessness, suicidal thoughts, doxing,
00:28:48harassment from sex buyers, and difficulty attending school, maintaining jobs, and participating
00:28:55in public life.
00:28:57This is a form of sexual exploitation from which it is impossible to fully exit.
00:29:03There's a very old idea that to protect more privileged women from male violence, society
00:29:09needs an underclass of women that men can violate with impunity.
00:29:14This was always a morally inexcusable premise, and the rise of forged pornography shows that
00:29:19it is also a lie.
00:29:22Deepfake technology allows any man to turn any woman into his pornography.
00:29:28These are impossible conditions for equality.
00:29:30As Andrea Dworkin stated in 1986, the civil impact of pornography on women is staggering.
00:29:37It keeps us socially silent, socially compliant.
00:29:40It keeps us afraid in neighborhoods, and it creates a vast hopelessness for women, a vast
00:29:44despair.
00:29:46One lives inside a nightmare of sexual abuse that is both actual and potential, and you
00:29:50have the great joy of knowing that your nightmare is someone else's freedom and someone else's
00:29:55fun.
00:29:57The harms are severe and irreversible, so deterrence is essential.
00:30:02This is why NCOSE supported the bipartisan effort to pass the Take It Down Act, which
00:30:06the president signed into law on Monday, and requires online platforms to remove nonconsensual
00:30:12content within 48 hours of being notified.
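For concreteness, the statute's removal window is a simple deadline computation from the time of notice. A minimal sketch in Python; the notification timestamp here is hypothetical:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical notice time; the Take It Down Act requires covered platforms
# to remove reported nonconsensual content within 48 hours of notification.
notified_at = datetime(2025, 5, 19, 9, 0, tzinfo=timezone.utc)
removal_deadline = notified_at + timedelta(hours=48)

print(removal_deadline.isoformat())  # 2025-05-21T09:00:00+00:00
```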
00:30:15NCOSE strongly supports three additional bills that complement Take It Down, the No Fakes
00:30:20Act, the Kids Online Safety Act, and the Defiance Act.
00:30:24These bills help protect individuals from the harmful effects of image-based sexual
00:30:28abuse and increase pressure on tech companies to manage websites more responsibly.
00:30:33Finally, NCOSE is concerned about the recent AI state moratorium language included in the
00:30:39House Budget Reconciliation Bill, as it creates a disincentive for AI companies to put safety
00:30:45first.
00:30:47Technological progress should not come at the expense of human dignity.
00:30:50It is our collective responsibility to protect the voice, face, and likeness of every person
00:30:55from unauthorized use.
00:31:00We thank you, and I will note for the record that we're submitting your full testimony
00:31:06into the record with all of your footnotes.
00:31:08I really appreciate that.
00:31:10Thank you so much.
00:31:11Mr. Brookman, you're recognized for five minutes.
00:31:15Thank you, Chairwoman Blackburn, Ranking Member Klobuchar.
00:31:17Thank you very much for the opportunity to get to testify here today.
00:31:21I'm here on behalf of Consumer Reports, where I head up our work on tech policy advocacy.
00:31:25We're the world's largest independent testing organization.
00:31:28We use our ratings, our journalism, our surveys, our advocacy to try to create a more fair,
00:31:34healthier, and safer world.
00:31:35I'm gratified the committee is focusing on the problems created by audio and video deepfakes,
00:31:40which for better or worse, are getting more realistic and convincing every day.
00:31:44They're used in romance scams and grandparent scams, where a relative gets a frantic call
00:31:48from a distressed family member who's in immediate need of cash.
00:31:52As the Chairwoman noted, they're used in fake testimonial videos from celebrities hawking
00:31:56everything from meme coins to cookware.
00:31:59I believe Elon Musk is one of the most frequently impersonated celebrities online.
00:32:04As Ms. Price testified eloquently, obviously one of the most prevalent uses is for the
00:32:07creation of non-consensual intimate images and videos, and they're increasingly used
00:32:11to propagate misinformation, certainly in the political realm, but also in the more
00:32:15petty personal realm.
00:32:17There's a story in Maryland recently about an aggrieved teacher who created deepfake
00:32:22audio of his boss saying racist and anti-Semitic slurs.
00:32:26As this last example shows, realistic cloning tools are easily available to the public and
00:32:31very cheap and easy to use.
00:32:33Earlier this year, Consumer Reports conducted a study of six voice cloning tools that are
00:32:37easy to find online to see how easy it would be to create fake audio based on a public
00:32:42recording like a YouTube clip.
00:32:45Our study found that four of the six companies we looked at didn't employ any meaningful
00:32:49technical mechanisms to reasonably ensure they had the consent of the person whose voice
00:32:54was being cloned.
00:32:55Instead, the customer just had to click a box saying, yes, I have the person's consent, rather than requiring
00:33:00the person to read a script to help indicate the person was on board with having their
00:33:05voice cloned.
00:33:07Four of the companies also did not collect much identifying information from customers,
00:33:11just a name or an email address to start creating deepfake voice clones.
00:33:15Given how likely abuse of these services is, I don't think they were doing enough, and
00:33:19a lot of our members agree.
00:33:21We recently got 55,000 signatures on a petition asking the Federal Trade Commission
00:33:26and state attorneys general to investigate whether these services were in violation of
00:33:30existing consumer protection laws.
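As an illustration of the stronger consent mechanism the study looked for, a cloning service could require the speaker to read back a randomized challenge phrase and verify the transcript of that recording before cloning. A minimal sketch under those assumptions; the phrases, threshold, and the perfect speech-to-text stand-in are all hypothetical:

```python
import difflib
import secrets

CONSENT_PHRASES = [
    "I consent to having my voice cloned by this service.",
    "I authorize the creation of a synthetic copy of my voice.",
]

def issue_challenge() -> str:
    """Pick a consent phrase plus a one-time code the speaker must read aloud.
    The unpredictable code keeps an old recording from being replayed."""
    return f"{secrets.choice(CONSENT_PHRASES)} Confirmation code {secrets.token_hex(3)}."

def transcript_matches(challenge: str, transcript: str, threshold: float = 0.85) -> bool:
    """Fuzzy-compare the speech-to-text transcript of the submitted audio against
    the challenge; a real service would also check that the speaker's voice
    matches the sample being cloned."""
    ratio = difflib.SequenceMatcher(None, challenge.lower(), transcript.lower()).ratio()
    return ratio >= threshold

challenge = issue_challenge()
heard = challenge  # stand-in for a perfect speech-to-text result
print(transcript_matches(challenge, heard))  # True only if the script was actually read
```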
00:33:33And that brings me to solutions.
00:33:34So one thing, we need strong consumer protection agencies who have the resources to crack down
00:33:39on abuse of emerging technologies.
00:33:42Last year, the FTC brought a handful of AI cases as part of Operation AI Comply, but
00:33:47they don't have the capacity right now to confront the massive wave of scams and abuses
00:33:51online.
00:33:53Tools and responsibilities.
00:33:54I think some of these AI-powered tools are designed such that they're almost always going
00:33:58to be used for illegitimate purposes, whether it's deepfake pornographic image generators
00:34:03or voice impersonation.
00:34:06Developers of these tools need to have heightened obligations to try to forestall harmful uses.
00:34:11If they can't do that, then maybe they should not be freely available to the public.
00:34:15Platforms, too, need to be doing more to proactively get harmful material off their
00:34:18platforms.
00:34:19It's a very difficult job.
00:34:21It takes resources, but it absolutely needs to be done.
00:34:24Transparency.
00:34:25People deserve to know whether the content they're seeing online is real or fake.
00:34:29I know there have been a number of bills introduced in this Congress to try to address that.
00:34:33Also, there's a law recently passed in California to start to put transparency obligations on
00:34:38entities that make deepfake content.
00:34:42Stronger privacy and security laws.
00:34:44As this committee knows very well, the United States generally has fairly weak legal protections.
00:34:48As the Ranking Member noted, the ready availability of information about us online makes it easier
00:34:54for scammers to target us with scams.
00:34:56We've seen a ton of progress at the state level on privacy and security laws, but they're
00:35:00not strong enough.
00:35:02Whistleblower protections and incentives.
00:35:04In many cases, we only find out about abuses inside these tech companies when someone comes
00:35:09forward with their story.
00:35:11I was glad to see bipartisan legislation introduced on this issue, protecting AI whistleblowers,
00:35:17in the last week.
00:35:18Education.
00:35:19I don't want to put all the burden on consumers, but the reality is this is the world we live
00:35:24in.
00:35:25We need to teach people to look out for these sorts of scams.
00:35:27We're part of a campaign called Pause Take Nine, which tries to train people that if
00:35:31they get an urgent call to action, they should pause, take nine seconds, think about if this
00:35:35is real or not.
00:35:36And finally, I want to echo the words of Ms. Price about a lot of discussion about a moratorium
00:35:42on state laws policing bad uses of AI.
00:35:45I want to stress this is the wrong idea.
00:35:47AI has tremendous, amazing potential, but as this hearing shows, it has some real potential
00:35:52harms as well.
00:35:53The states have been leaders in trying to address these harms, whether it's privacy,
00:35:57co-opting performers' identities, regulating self-driving cars, rooting out hidden biases,
00:36:02or other deepfakes.
00:36:03AI is an incredibly powerful technology, but that does not mean it should be completely
00:36:08unregulated.
00:36:09Thank you very much, and I look forward to answering your questions.
00:36:12And Ms. Carlos, you're recognized for five minutes.
00:36:17Chairwoman Blackburn, Ranking Member Klobuchar, and members of the subcommittee, thank you
00:36:21for the opportunity to speak with you today on the important topic of the No Fakes Act
00:36:26and AI-generated digital replicas.
00:36:28My name is Susana Carlos, and I serve as the Head of Music Policy for YouTube.
00:36:33This last month, YouTube marked the 20th anniversary of the first video ever uploaded to our platform.
00:36:38It's difficult to fathom how much the world and YouTube have changed in those two short decades.
00:36:43Today, we have over 2 billion active monthly members on our platform across more than 100
00:36:49countries, with 500 hours of content uploaded every minute.
00:36:53We are proud that YouTube has transformed culture through video and built a thriving creator
00:36:57economy here in the United States and around the world.
00:37:01Our unique and industry-leading revenue-sharing model empowers our creators to take 55 percent
00:37:06of the revenue earned against ads on their content.
00:37:10As a result, YouTube's creative economy has contributed more than $55 billion to the United States'
00:37:16gross domestic product and supported more than 490,000 full-time American jobs in the
00:37:22last year alone.
00:37:24In the three years prior to January 2024, YouTube paid more than $70 billion to creators,
00:37:29artists, and media companies.
00:37:32At YouTube Music, we built one of the world's deepest catalogs.
00:37:36Over 100 million official tracks, plus remixes, live performances, covers, and hard-to-find
00:37:41music you simply can't find anywhere else.
00:37:44We've now reached over 125 million paid YouTube Music and Premium subscribers, and YouTube continues
00:37:50to be at the forefront of handling rights management at scale, protecting the intellectual
00:37:55property of creators and our content partners, ensuring that they can monetize their content
00:37:59and keeping YouTube free for viewers around the world.
00:38:03In 2007, YouTube launched Content ID, a first-of-its-kind copyright management system that helps rights
00:38:09holders effectively manage their works.
00:38:12Rights holders or their agents provide YouTube with reference files for the works they
00:38:16own, along with metadata such as title and detailed ownership rights.
00:38:21Based on these references, YouTube creates digital fingerprints for those works in question and
00:38:25scans the platform to determine when content in an uploaded video matches the reference
00:38:29content.
00:38:31Rights holders can instruct the system to block, monetize, or track the reference content.
00:38:37Over 99% of the copyright issues on YouTube are handled through Content ID.
00:38:42It has also proven to be an effective revenue generation tool for rights holders, as over
00:38:4690% of Content ID claims are monetized.
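Content ID itself is proprietary, but the workflow described above (register a reference, fingerprint it, match uploads against it, then apply the owner's chosen policy) can be sketched roughly as follows. This is only an illustration: it uses an exact hash as a stand-in for a real perceptual fingerprint, and all names are hypothetical:

```python
import hashlib
from dataclasses import dataclass

@dataclass
class Reference:
    title: str
    owner: str
    policy: str  # "block", "monetize", or "track"

# Registry maps a fingerprint to the rights holder's reference metadata.
registry: dict[str, Reference] = {}

def fingerprint(media_bytes: bytes) -> str:
    """Stand-in for a perceptual fingerprint; real systems match transformed
    or partial copies, not just byte-identical files."""
    return hashlib.sha256(media_bytes).hexdigest()

def register_reference(media_bytes: bytes, ref: Reference) -> None:
    registry[fingerprint(media_bytes)] = ref

def scan_upload(media_bytes: bytes) -> str:
    """Apply the owner's chosen policy when an upload matches a reference."""
    ref = registry.get(fingerprint(media_bytes))
    if ref is None:
        return "no match: publish normally"
    return f"match on '{ref.title}' ({ref.owner}): action = {ref.policy}"

register_reference(b"<master recording>", Reference("Independence Day", "Label X", "monetize"))
print(scan_upload(b"<master recording>"))  # match on 'Independence Day' ... action = monetize
print(scan_upload(b"<original upload>"))   # no match: publish normally
```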
00:38:50As we navigate the evolving world of AI, we understand the importance of collaborating
00:38:54with partners to tackle emerging challenges proactively.
00:38:58We firmly believe that AI can and will supercharge human creativity, not replace it.
00:39:03Indeed, AI has the potential to amplify and augment human creativity, unlocking new opportunities
00:39:09for artists, creators, journalists, musicians, and consumers to engage creatively with new
00:39:14tools and play an active role in innovation.
00:39:18We are already seeing creators exploring new areas, including the creation of new types
00:39:22of music, books, photography, clothing, pottery, games, and other art inspired in collaboration
00:39:29with AI models.
00:39:30And as this technology evolves, we must collectively ensure that it is used responsibly, including
00:39:35when it comes to protecting our creators and viewers.
00:39:39Platforms have a responsibility to address the challenges posed by AI-generated content,
00:39:43and Google and YouTube stand ready to apply our expertise to help tackle them on our services
00:39:47and across the digital ecosystem.
00:39:50We know that a practical regulatory framework addressing digital replicas is critical, and
00:39:55that is why we are especially grateful to Chairwoman Blackburn, Senator Coons, Ranking
00:39:59Member Klobuchar, and all the bill sponsors for the smart and thoughtful approach adopted
00:40:03in developing the No Fakes Act of 2025.
00:40:07We deeply appreciate their willingness to bring a variety of stakeholders together to
00:40:11forge a consensus on this important topic.
00:40:14YouTube and Google are proud to support this legislation, which tackles the problems of
00:40:18harm associated with unauthorized digital replicas, and provides a clear legal framework
00:40:23to address these challenges and protect individuals' rights.
00:40:26The No Fakes Act appropriately balances innovation, creative expression, and individuals' rights,
00:40:32while offering a broadly workable, tech-neutral, and comprehensive legal solution.
00:40:38By obviating the need for a patchwork of inconsistent legal frameworks, the No Fakes
00:40:42Act would streamline global operations for platforms like ours, and empower artists and
00:40:47rights holders to better manage their likeness online.
00:40:50We look forward to seeing the legislation passed by Congress and enacted into law.
00:40:54We have similarly proudly supported the Take It Down Act, because it's critical to prevent
00:40:58bad actors from producing and disseminating non-consensual explicit images.
00:41:03We would like to thank Ranking Member Klobuchar, along with Senator Coons, for their leadership
00:41:07on the legislation.
00:41:08This is an area we continue to invest in at Google, building our longstanding policies
00:41:13and protections to ultimately keep people safe online.
00:41:17Thank you again for inviting me to participate in today's hearing.
00:41:20I look forward to your questions.
00:41:22And we thank you all for sticking to the five-minute clock.
00:41:25I didn't have to gavel down a person.
00:41:29These are great content creators, Amy.
00:41:32There we go.
00:41:33I'm going to recognize myself for five minutes for questions, and as Senator Coons said earlier,
00:41:40there are going to be members coming and going, because we do have a variety of hearings that
00:41:45are going on.
00:41:46Mrs. McBride, I want to come to you first.
00:41:50I think that your perspective is such an important perspective as we talk about this and talk
00:41:58on the direct impact to someone who is creating content.
00:42:05And I appreciated so much that in your testimony, you talked about how your voice and likeness,
00:42:15along with so many other creators, that that is at risk.
00:42:19And therefore, your livelihood is at risk.
00:42:24So talk a little bit about how harmful deepfakes are in the long term and why it is important
00:42:33to get legislation like this to the president's desk.
00:42:38And then talk about fellow artists that you have spoken with and their concerns on the
00:42:44issue.
00:42:45Well, as you said, it does affect the livelihoods of artists, musicians, backup singers, voiceover
00:42:55actors, authors, so many people in the arts.
00:43:00For me, being established and having done this for over 30 years, that's not necessarily
00:43:06my first concern.
00:43:10I have the luxury of that not being my first concern, but it is a concern for younger artists
00:43:14that are coming up.
00:43:16So as I said in my testimony, the thing that I'm most concerned with personally is how
00:43:22we work so hard to present ourselves with integrity and a certain character.
00:43:28And the fact that long term, that could be distorted or manipulated to be the opposite
00:43:36of what I stand for and what I believe.
00:43:39Or to be used to cause harm to someone through endorsing a harmful product.
00:43:46Or, you know, far into the future after I'm gone, somebody creating a piece of music or
00:43:53me saying something that I never did, and it just kind of like disintegrating what I've
00:43:59worked so hard to establish, which is trust with my fans, with people who, you know, when
00:44:04I say something, they believe it.
00:44:06I think for younger artists, to your point of livelihood, to be new and having to set
00:44:15up what you stand for and who you are as a person and as an artist, what you endorse,
00:44:19what you believe in, and establishing a trust with your fans.
00:44:24And then on top of it, having to navigate this, these waters of someone coming in and
00:44:30distorting all of that is devastating.
00:44:33Like, I can't stress enough how it can impact the careers of up-and-coming
00:44:38artists.
00:44:39And even just in their ability to speak their truth or just to live in fear of being a victim
00:44:47of these deepfakes.
00:44:48Yeah.
00:44:49Mr. Glazier, I want to come to you on something you mentioned about the critical balance of
00:44:57protecting artists' voices and likenesses, and then also reducing litigation.
00:45:05And that is why we need to have this framework.
00:45:10And I think helping artists stay out of court, I mean, they're at a point where they may
00:45:16have to spend much of what they've earned in order to protect themselves and to protect
00:45:21their brand, if you will, if you'll elaborate on that.
00:45:25Sure.
00:45:26I'm happy to.
00:45:27The bill has to be effective and practical at the same time, both for the victim and
00:45:33for the platform who is going to limit the damage to the victim.
00:45:37It has to work on both ends.
00:45:39And that's why I think the approach that was taken both in the Take It Down Act and in
00:45:43this act are so important, especially in areas where the platform has less knowledge and
00:45:49less control because end users are posting on the platform.
00:45:54And those can go viral very, very quickly.
00:45:57The ability for the platform to take it down as soon as technically and practically feasible
00:46:05so that they stop the damage, and to keep it down so that the artist or any other victim
00:46:12doesn't have to spend their lives monitoring a platform and continually sending more notices
00:46:17and more notices as end users keep putting up the same material over and over and over
00:46:22again.
00:46:23We now have tools that will allow removal from the platform.
00:46:30And once the removal is done, the damage can be limited.
00:46:34There is no liability for the platform.
00:46:37And the artist doesn't have to spend their time just litigating.
00:46:42Where there is more knowledge and control, right, where the platform has an employee
00:46:46upload it, for example, then there should be responsibility on the platform.
00:46:51And those are cases where you might need to go to court because the platform could
00:46:56have prevented it and they didn't prevent it.
00:46:58So I think the bill is incredibly balanced and really innovative in its approach to protecting
00:47:05free speech, reducing litigation, but also effectively protecting the right that's necessary.
00:47:10All right.
00:47:14Thank you very much.
00:47:15I guess I'll start with Mr. Brookman, the non-Grammy winner.
00:47:21And I want to talk to you just a little bit about this consumer angle here, which I think
00:47:27is interesting to people.
00:47:29I think at its core, all of us involved in this legislation have made it really clear
00:47:34that it's not just people who are well known that will be hurt by this eventually and that
00:47:41getting this bill passed as soon as possible is just as important for everyone.
00:47:46But I do so appreciate Ms. McBride's being willing to come forward because those stories
00:47:52and the stories that we've heard from, like I mentioned, Jamie Lee Curtis or the stories
00:47:57that we've heard from many celebrities are very important to getting this done.
00:48:01So you just did a report, AI-generated voice cloning scams, including that AI voice cloning
00:48:08applications, in the words of the report, presents a clear opportunity for scammers.
00:48:13And we need to make sure our consumer protection enforcers are prepared to respond to the growing
00:48:19threat of these scams.
00:48:20I had this happen to my state director's husband, whose kid is in the Marines, and they got
00:48:26a call.
00:48:27They figured out that it wasn't really him asking for stuff and money.
00:48:32They knew he couldn't call from where he was deployed to.
00:48:34But this is just going to be happening all over the place.
00:48:37And the next call will be to a grandma who thinks it's real and sends her life savings
00:48:42in.
00:48:43So I have called on the FTC and the FCC to step up their efforts to prevent these voice
00:48:47cloning scams.
00:48:50What are some of the tools that agencies need to crack down on these scams, even outside
00:48:55of this bill?
00:48:56Yeah, absolutely.
00:48:57So I think the first thing the Federal Trade Commission probably needs is more resources.
00:49:00They only have like 1,200 people right now for the entire economy.
00:49:05That's down by about 100 just in the past couple of months.
00:49:08Way down from even during like the Nixon era.
00:49:10Yeah, like 1,700 it used to be, and the economy has grown like three or four times.
00:49:16Chairman Ferguson has said more cuts are coming, which I think is the wrong direction.
00:49:20I worked at the Federal Trade Commission for a couple of years.
00:49:22We could not do a fraction of all the things that we wanted to do to protect consumers.
00:49:26So more people, more capacity, more technologists.
00:49:29There's just not enough technology capacity in government.
00:49:32I was in the Office of Technology Research and Investigation there.
00:49:36That was like five people.
00:49:37That's just not enough, obviously, with all these very sophisticated, I mean, just deep
00:49:41fakes alone, let alone the rest of the tech economy.
00:49:44The ability to get penalties and even injunctive relief, right?
00:49:47If someone gets caught stealing something, the FTC often doesn't have the ability to
00:49:51make them give the money back.
00:49:53I know this committee has tried to restore that authority, but that would be important.
00:49:58And also, again, maybe clear FTC rulemaking authority, but also, I would
00:50:03like to see Congress consider legislative authority to address tools.
00:50:08Like, again, if you are offering a tool that can be used only for harm, voice
00:50:12impersonation, deepfake pornographic images, maybe there should be
00:50:16responsibilities to make sure it's not being used for harm.
00:50:19OK, thank you.
00:50:22Ms. Carlos, can you talk about what YouTube is doing to ensure it's not
00:50:27facilitating these scams?
00:50:31Sure, and thank you for the question, Senator, and thanks for your support for the bill, of
00:50:34course. So as a primary consideration, we obviously see great and tremendous
00:50:39opportunity coming from A.I., but we also acknowledge that there are risks, and it is
00:50:44our utmost responsibility to ensure that it is deployed responsibly.
00:50:48So we've taken a number of efforts to protect against harmful content on our
00:50:53platform.
00:50:56We have updated our privacy policies last year to ensure that all individuals can now
00:51:01submit a notice to YouTube when their voice or likeness has been used without authorization on
00:51:05our platform. And once reviewed, if it is applicable and we've confirmed that the content
00:51:10should be removed, we will take it down.
00:51:12We've additionally implemented watermarks on our A.I.
00:51:16products. We originally began with both image and audio watermarks using our SynthID
00:51:20technology, and we've recently expanded it to also be applied to text generated from
00:51:25our Gemini app and web experience.
00:51:28And most recently, as part of our Veo video tool, we've also taken the additional
00:51:33step to become a member of C2PA, the Coalition for Content Provenance and
00:51:39Authenticity. And there we're serving as a steering member to work with the
00:51:43organization to create indicators and markings that will allow content provenance
00:51:47created off our platform to additionally be recognized.
00:51:51And we're deploying those technologies across our platform.
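C2PA publishes an open standard for cryptographically signed provenance manifests attached to media, while SynthID embeds imperceptible watermarks at generation time. Purely as a hedged sketch of how a platform might act on such signals, and not Google's or C2PA's actual API, a labeling decision could look like this:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ProvenanceManifest:
    generator: str         # tool that produced the asset, per the manifest
    ai_generated: bool     # provenance claim that the asset is synthetic
    signature_valid: bool  # in reality, verified against a C2PA trust list

def label_for_viewers(manifest: Optional[ProvenanceManifest]) -> str:
    """Decide what disclosure to show alongside an upload, based on provenance."""
    if manifest is None:
        return "no provenance data"  # unsigned upload: no claim either way
    if not manifest.signature_valid:
        return "provenance present but unverifiable"
    if manifest.ai_generated:
        return f"labeled: AI-generated content ({manifest.generator})"
    return "verified capture provenance"

print(label_for_viewers(ProvenanceManifest("video-generator", True, True)))
```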
00:51:54OK, thank you.
00:51:56We've mentioned the Take It Down Act, and thank you for the support for that.
00:52:00Mr. Glazer, you talked about how this is the first federal law related to generative
00:52:05A.I. and that it's a good first step.
00:52:09And could you talk about how if we don't move on from there and we just stop and
00:52:16don't do anything for years, which seems to be what's been going on, what's going to
00:52:21happen here and why it's so important to do this?
00:52:25I think there's a very small window and an unusual window for Congress to get ahead
00:52:31of what is happening before it becomes irreparable.
00:52:35The Take It Down Act was an incredible model.
00:52:38It was done for criminal activity, you know, cabined in, right, you
00:52:44wrote it. But it was a great model, but it only goes so far.
00:52:48But we need to use that model now and we need to expand it carefully in a balanced way
00:52:53to lots of other situations, which is exactly what the No Fakes Act does.
00:52:58Right. And I think, you know, we have a very limited amount of time in order to allow
00:53:03people and platforms to act before this gets to a point where it's so far out of the
00:53:09barn that, instead of encouraging responsible A.I.
00:53:11development, we allow investment and capital to go into A.I.
00:53:17development that hurts people.
00:53:19Yeah. So let's encourage investment the right way to boost great development and be
00:53:23first. Let's not be the folks that encourage investment in A.I.
00:53:27technologies that really harm us.
00:53:29And Ms. Price, you expressed concerns about this 10-year moratorium on state rules.
00:53:35I'm very concerned, having spent years trying to pass some of these things.
00:53:39And I think that one of the ways we pass things quickly, like Mr.
00:53:42Glazier was talking about, is if people actually see a reason: they don't want a
00:53:46patchwork, they want to get it done.
00:53:47But if you just put in a moratorium, and you look at, like, the Elvis law coming out of
00:53:52Tennessee, Ms. McBride, and some of the other things, that would stop all of that.
00:53:56My last question here before we go to another round: could you talk about
00:54:01why you're concerned about what is right in front of us now, which is this 10 year
00:54:05moratorium? Yes.
00:54:06Thank you for the question, Senator.
00:54:08We're concerned about the moratorium because it's basically signaling to the A.I.
00:54:12companies that they can kind of do whatever they want in the meantime.
00:54:16And it inhibits states' ability to adapt their laws to this form of technology that's
00:54:22changing very quickly and then has this potential to cause great harm.
00:54:27Thank you. And I know Senator Coons is on his way and Senator Hawley is coming back,
00:54:33but Ms. Price staying with you, you talked about the Take It Down Act and the
00:54:40importance there.
00:54:42But touch on the gap that No Fakes fills for a child who may have something posted,
00:54:50but that doesn't fit under Take It Down, and how this would open up an avenue of
00:54:56recourse for them.
00:54:58Yes. Thank you, Senator.
00:54:59So under the No Fakes Act, because there is a private right of action, there would be
00:55:05another way essentially for a victim to seek accountability from a perpetrator or
00:55:12platform, which is really important because the layers of accountability are what
00:55:16really deter bad actors from engaging in harm.
00:55:19So having the criminal, but then also having the ability to do the private right of
00:55:24action, the civil action is important.
00:55:27And speaking to the states and their actions, I do want to mention that Tennessee
00:55:34passed the Elvis Act, which is like our first generation of the No Fakes Act.
00:55:43And we certainly know that in Tennessee we need those protections.
00:55:49And until we pass something that is federally preemptive, we can't call for a
00:55:55moratorium on those things.
00:55:57So excellent statement, of course.
00:56:04Of course. Ms. Carlos, I want to talk with you for just a minute.
00:56:10And we're grateful for the support that you all have talked about.
00:56:16And there's a provision in the bill that I know is important to your platform and many
00:56:22others. And that's the notification piece, giving individuals who are harmed a way to notify you.
00:56:30You've talked about artists being able to contact you, to let you know about this,
00:56:38and then asking for that content to come
00:56:49down, and then you taking that action. We have worked on the Kids Online Safety Act.
00:56:54One of the complaints that had come to Senator Blumenthal and I from individuals that
00:57:00tried to get things off was they could not get a response.
00:57:06So this is something that that notification is an imperative.
00:57:13So talk a little bit about how you're approaching notification.
00:57:18Thank you. Thank you for the question.
00:57:20Yes. So in looking at the framework of No Fakes again, we began with a voluntary
00:57:24framework on YouTube, which allows individuals to notify us when digital replica
00:57:29content of them is online.
00:57:30And this is smartly mirrored in the No Fakes Act.
00:57:33It empowers a user to identify content and flag it to us when they believe it should be
00:57:38removed for an unauthorized use of their voice or likeness.
00:57:41And as you mentioned, that notification is critical because it signals to us the
00:57:46difference between content that is authorized and harmful fakes.
00:57:49And it's with that notice that we are able to review content and make an informed
00:57:54decision as to whether or not it should be removed.
00:57:57And then what is your length of time for getting it down upon receiving notification?
00:58:05What is your process going to be on implementation?
00:58:10Sure. So we envision a similar framework as under the DMCA, where a web form would
00:58:15be easily available for any user, quickly filled out, and then submitted to our trust
00:58:19and safety team. We make every effort to review every notice on a case-by-case basis
00:58:24and remove it as soon as possible.
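For readers unfamiliar with how a DMCA-style notice-and-takedown pipeline is typically structured, here is a minimal sketch in Python. The field names, review logic, and exemption handling are illustrative assumptions drawn from the testimony's general description, not YouTube's actual implementation.

```python
# Minimal sketch of a DMCA-style notice-and-takedown flow, as described in the
# testimony: a web form feeds a trust-and-safety queue, each notice is reviewed
# case by case, and confirmed unauthorized replicas are removed. All names and
# rules here are illustrative assumptions, not YouTube's actual system.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from enum import Enum


class Decision(Enum):
    REMOVE = "remove"              # unauthorized digital replica
    KEEP = "keep"                  # authorized, or exempt (parody, satire, news)
    NEED_MORE_INFO = "need_more_info"


@dataclass
class TakedownNotice:
    """One web-form submission flagging an alleged unauthorized replica."""
    video_id: str
    claimant: str
    claim: str                     # e.g. "unauthorized voice replica"
    submitted_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))


def review(notice: TakedownNotice, is_authorized: bool, is_exempt: bool) -> Decision:
    """Case-by-case review: authorized or First Amendment-exempt content stays."""
    if is_authorized or is_exempt:
        return Decision.KEEP
    return Decision.REMOVE


# Usage: a notice arrives from the web form and is reviewed by trust and safety.
notice = TakedownNotice("vid123", "artist@example.com", "unauthorized voice replica")
print(review(notice, is_authorized=False, is_exempt=False))  # Decision.REMOVE
```

The design point the testimony emphasizes is the exemption check: under the bill's framework, parody, satire, and newsworthy uses are not removable, so any takedown pipeline needs a review step rather than automatic removal on receipt of a notice.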
00:58:27So are you talking hours, days?
00:58:30What is your framework?
00:58:31I don't have the exact number off the top of my head, but I do know that we try to
00:58:35process every notification as quickly as possible.
00:58:38Thank you. If you will check on that and then get that information back to us.
00:58:44I think we would like to know that, because the fact that this has taken such lengths
00:58:51of time for people to have any kind of response has been very difficult for consumers,
00:58:58and they feel like they're talking to outer space and nobody is listening and
00:59:05nobody's responding.
00:59:07Thank you for flagging the concern.
00:59:08I'd be happy to follow up with you in the committee.
00:59:10I appreciate that. Senator Coons, you're recognized for five minutes.
00:59:13Thank you so much, Madam Chair.
00:59:14I'd like to first thank Ms. McBride for being here to testify in support of No
00:59:20Fakes. Could you speak to why this bill is so important, both to protect artists like
00:59:26you and to protect your fans?
00:59:33Thank you. I think that it's important because, as artists, we hopefully want
00:59:42to speak the truth.
00:59:44We want to build a relationship with our fans in which they trust us,
00:59:47so they believe what we say.
00:59:50So when you have something like a deepfake that either sells a product or makes a
00:59:57statement, it can be so harmful to that trust.
01:00:01You know, I just realized sitting here that I bought a product, a
01:00:05collagen supplement, off of Instagram the other day because it had LeAnn Rimes and a
01:00:10couple of other people. And I'm sitting here thinking, oh, my goodness, I don't even
01:00:13know if that was really them.
01:00:14Right. So it's damaging to the artist and to the fan.
01:00:19You know, we had a situation personally where one of my fans believed they
01:00:25were talking to me, ended up selling their house and funneling the money to someone
01:00:32who they thought was me. That is so devastating to me, to realize that somebody
01:00:38who trusts me could be duped like that.
01:00:41You know. And then also, I think that eventually somebody who is duped by a
01:00:49deepfake is going to be angry enough to, you know, seek retribution.
01:00:55We're on stages in front of thousands of people.
01:00:57We're in public places.
01:00:59So it's a danger to the artist as well.
01:01:03Mr. Glazer, to follow up on Ms. McBride's testimony, what do you think are the
01:01:07consequences for the music industry if we don't get No Fakes over the finish line?
01:01:11What will the consequences be for music fans and for the industry?
01:01:16The entire music ecosystem is dependent on the authentic voice and the authentic
01:01:23image of the artist.
01:01:24Right. That is what the music industry is.
01:01:29If you allow deep fakes to perpetuate, you're taking the soul out of the art.
01:01:34And when you do that, you're taking the humanity out of the art.
01:01:38And that's what art is.
01:01:40So I think it's fairly existential that the voice of music be the voice of music.
01:01:48I think that's what everything is built on.
01:01:51And it's almost bizarre that we have to sit here today talking about
01:01:57allowing someone to protect the use of themselves.
01:02:00If there's anything that we have a right in and should be able to control, it's the
01:02:05gifts that God gave us, the voices that we have, the image that we have.
01:02:09And for that to be taken from you is devastating, both for the individual and
01:02:15obviously for the industry itself, which is built on these very voices.
01:02:19Ms. Carlos, if I might, I just want to thank you for YouTube's partnership in getting
01:02:25to the place where you support No Fakes.
01:02:27Other tech companies haven't come forward.
01:02:30I'd be interested in what you might say, or encourage me to say, to the Metas or
01:02:34TikToks of the world about why they should support this bill, even though it imposes
01:02:38new obligations on them.
01:02:40And some have argued that No Fakes might chill legitimate speech by incentivizing
01:02:45platforms to over-remove content out of fear of being sued.
01:02:49How does YouTube think about balancing its obligations under this bill with its
01:02:54First Amendment obligations to users?
01:02:57Thank you for the question, Senator.
01:02:59As we mentioned, YouTube largely supports this bill because we see the incredible
01:03:03opportunity of A.I., but we also recognize those harms and we believe that A.I.
01:03:08needs to be deployed responsibly.
01:03:10I believe Mr. Glazer mentioned during his opening statement that the No Fakes Act does
01:03:14carry First Amendment exemptions: parody, satire, newsworthiness.
01:03:18And that is one of the reasons that we felt comfortable endorsing this bill.
01:03:21We are, at the end of the day, an open platform and we believe that a variety of
01:03:25viewpoints can succeed on YouTube.
01:03:27So those would be some of the things that I would share with you to share with those
01:03:30other companies. But I cannot speak directly on behalf of why they may or may not
01:03:34choose to endorse the bill.
01:03:35Understood. Thank you. And thank you all for your testimony today.
01:03:38Thank you. Senator Hawley, you're recognized for five minutes.
01:03:42Thank you very much, Madam Chair.
01:03:43Thanks to all of the witnesses for being here.
01:03:46Ms. Carlos, if I could just start with you.
01:03:49You're here on behalf of YouTube, is that right?
01:03:52That is correct.
01:03:53Can you tell me why is it that YouTube has monetized videos
01:03:58that teach people how to generate pornographic deep fakes of women?
01:04:02Why does that happen on your platform?
01:04:04Thank you for the question. Protecting our users is one of our top priorities.
01:04:08My general expertise is in music policy, so I'm not in the best position to answer
01:04:13that question. But I'm happy to follow up with you.
01:04:15Do you know how many such videos there are out there?
01:04:18These are monetized videos now on YouTube.
01:04:22I'm not aware of that number.
01:04:23I can say that our community policies do not allow that type of content on our
01:04:27platform. Well, Forbes magazine just reported that YouTube, in fact, promoted
01:04:31over 100 YouTube videos with millions of views
01:04:35that showcase AI deepfake porn and include tutorials on how
01:04:40to make deepfake porn, particularly porn that targets
01:04:45young women. Do you have any idea how much money YouTube has made
01:04:49off of this monetization?
01:04:51Thank you for bringing this to my attention.
01:04:53I do not have detail on this specific news article.
01:04:55I'm happy to follow up with you in the committee.
01:04:57So you don't have any idea of how many ad dollars YouTube has made off of this?
01:05:00I do not.
01:05:02Are you aware that one of these websites that was promoted by YouTube in these
01:05:05videos was later cited in a criminal prosecution for generating AI
01:05:10sexual abuse material?
01:05:12Let me be more specific: generating AI sexual abuse material involving children.
01:05:17Thank you again for the question, Senator.
01:05:18As we mentioned earlier, YouTube has endorsed the Take It Down Act, and we take
01:05:22these issues very seriously.
01:05:24Again, I'll note that I represent music policy and do not have the information
01:05:28to give you a fulsome response.
01:05:30Let me ask you this then. If a teenage girl's face ends up
01:05:35in an AI porn video on your platform, what does YouTube do about it?
01:05:39What's her recourse right now?
01:05:41What can she do to get some recompense, get some restitution?
01:05:47A little over a year ago, we updated our privacy policy so that anybody who believes
01:05:51that their voice or likeness is being used without their authorization on our
01:05:54platform can submit a request for removal.
01:05:57A request for removal. Is there some policy on getting
01:06:02reimbursement for any profits the company may have made?
01:06:05Again, if these videos are monetized, I mean, does the victim get a share of
01:06:08anything?
01:06:09I'm not aware of those policies. I would have to follow up with you, Senator.
01:06:13Why is it that the enforcement of YouTube's
01:06:17own policy here seems to only happen after videos go viral?
01:06:21Is there a reason for that?
01:06:22I do not have the answer to that question for you.
01:06:24Do you know how many AI generated deepfake videos or deepfake content is removed
01:06:29before a victim complains?
01:06:31Does the victim have to complain before YouTube does anything?
01:06:33Again, my specialty is in music policy.
01:06:36I do understand that we use technology such as AI to search for that content.
01:06:39And when it is in violation of our policies, we will remove it.
01:06:43Let me ask you about this.
01:06:45YouTube training data.
01:06:47Has YouTube provided data for use in Google's Gemini or other AI training
01:06:51programs?
01:06:52YouTube does provide data for Google training in accordance with our
01:06:55agreements.
01:06:55So if an artist uploads music to YouTube, does the company use that music to
01:07:00train AI models?
01:07:01As I mentioned, we do share data in accordance with our agreements.
01:07:04I can't speak to the specifics of any individual agreement.
01:07:07Well, so how are people like Ms. McBride protected?
01:07:10I mean, so if you're an artist and you put any content on YouTube, does that
01:07:14mean that it's just free range?
01:07:17I mean, you can do whatever you want with it.
01:07:18Again, it comes down to the terms of our agreement.
01:07:21I will say that we have forged deep partnerships with the music industry.
01:07:24We came out of the gate with forming AI music principles with the music
01:07:28industry and are continuing to experiment with them to see how AI can best
01:07:32benefit their creative process.
01:07:33So are there privacy protections? You're telling me YouTube has in place
01:07:36privacy protections for artists?
01:07:38They apply to all individuals on our platform.
01:07:41Oh, so this is the click wrap scenario.
01:07:44This is, in order to watch cute dog videos or whatever,
01:07:46you've got to click the "I consent," and that wraps in.
01:07:50You basically give consent for your stuff to be used.
01:07:52There are all different types of agreements, but our terms of service are
01:07:55included in that batch.
01:07:56I guess my question is, where are users told about their privacy
01:08:01protections, if they have any, and where do they explicitly consent?
01:08:04They agree to our terms of service, and we also have our
01:08:07privacy policy available on the web.
01:08:08Okay.
01:08:09So that's the click wrap.
01:08:10So in other words, if you come onto YouTube and you want to use it,
01:08:13you've got to click through.
01:08:14So you click it, and there, you've basically agreed to give your
01:08:20content to AI and allow them to train on it without any further consent.
01:08:23Is that basically it?
01:08:24Again, our policies and the terms of our agreements are what govern
01:08:28what goes into our training.
01:08:29Well, and I'm asking you the content of that agreement.
01:08:31So in other words, if I'm an artist and I upload something to YouTube, and yeah,
01:08:36sure,
01:08:36I've clicked the button that says, yeah, I want to be able to use
01:08:39YouTube.
01:08:40Are you telling me that I don't have any further recourse if YouTube then goes
01:08:43and gives the information to AI models and systems? There's nothing further I
01:08:47can do, or am I missing something?
01:08:50If it is in accordance with our agreements, we will share that data.
01:08:54Yeah.
01:08:55That seems like a big problem to me.
01:08:56That seems like a huge, huge problem to me.
01:08:58And the fact that YouTube is monetizing these kinds of videos seems like a huge,
01:09:02huge problem to me.
01:09:03I'm glad you're here today.
01:09:05I wish there were more tech companies here today, but we've got to do more.
01:09:09I mean, YouTube, I'm sure, is making billions of dollars off of this.
01:09:12The people who are losing are the artists and the creators and the teenagers whose
01:09:17lives are upended.
01:09:18We've got to give individuals powerful, enforceable rights in their
01:09:23images, in their property, in their lives back again, or this is just never
01:09:27going to stop.
01:09:28Thank you, madam chair.
01:09:30And that is the reason we have the No Fakes bill and we are trying to push it
01:09:35across the finish line.
01:09:37I would like to offer a second round.
01:09:39Senator Klobuchar, do you have additional questions?
01:09:42Very, very short.
01:09:44I know that Senator Coons has asked some of my questions about just
01:09:48people's personal experience with this.
01:09:52I guess I'd ask you, Mr.
01:09:53Glazer, I'm not sure you were asked about this.
01:09:55Do you agree that using copyrighted materials to create copycat content
01:10:01undermines the value of the music created by artists and could chill creation of
01:10:06new art?
01:10:07Absolutely.
01:10:08If you are able to copy copyrighted material
01:10:14for any purpose without consent, you're basically allowing the person
01:10:20who's copying to make the money and to do with it what they want, but not the
01:10:25creator, who's supposed to actually control it and who made it, to be
01:10:29compensated for it and to control its exploitation.
01:10:32It's the very opposite of what the Constitution calls for in creating
01:10:36intellectual property.
01:10:37Very good.
01:10:38One last question, the last one, on the consumer education issue that was
01:10:44raised.
01:10:44Thank you.
01:10:45I'm sure you all care about it, but, Mr.
01:10:48Brookman, while we should not place the burden solely on consumers to
01:10:55protect themselves from AI scams, I don't think that's going to work very well.
01:10:59What steps should Congress take to help educate consumers when it comes to AI
01:11:05literacy and the like?
01:11:06I think it's something we could have some agreement on.
01:11:09So, yeah, I mean, I think spending the money for a
01:11:12public awareness campaign is a really good idea.
01:11:14I think people, you know, hear stories of friends of friends who this has
01:11:17happened to, but a lot of people just have no idea that the things they see
01:11:20online, the things they see on Facebook, are just not real.
01:11:24So in addition to...
01:11:26You know, there's one that says I'm the fourth richest woman in the world now.
01:11:29Oh, congratulations.
01:11:30Yeah.
01:11:30That just happened this week. I'm sorry.
01:11:32I don't want to exaggerate, America.
01:11:35And then people try to defend me by sending out the list of the top 10
01:11:39richest, with like Oprah.
01:11:40And I always think it's kind of sad that I'm nowhere near it, but yeah, that's
01:11:44the latest thing that's out there.
01:11:47Going on, training people to be aware of it, to think about it,
01:11:52just to, you know, watch out for social engineering attacks, false,
01:11:55you know, calls for urgency.
01:11:57You know, deepfake voice right now is usually good for a little
01:12:00while, but it's getting better.
01:12:02Right.
01:12:02And it's going to continue to get better.
01:12:04So one idea is, you know, having a family safe word, right?
01:12:08Like a word that only you and your family know that the scammer can't
01:12:11get. But they have access to a lot of personal data about us.
01:12:14So we are all vulnerable.
01:12:16The numbers are going up dramatically.
01:12:18So just teaching people, like I said, it's a shame
01:12:21we have to teach people to do this, but it is the world we live in.
01:12:24Okay.
01:12:25Thank you, Senator Blackburn.
01:12:28Ms.
01:12:29Price, I was glad to see President Trump sign the Take It Down Act earlier this
01:12:34week.
01:12:34Why is No Fakes still necessary
01:12:37if Take It Down is on the books?
01:12:39Thank you, Senator.
01:12:40No Fakes is still necessary because it provides a way for victims to bring
01:12:47a civil lawsuit on their own behalf.
01:12:49And so there's an importance to having, yes, on the one hand, the criminal piece,
01:12:54the criminal law accountability, and the required takedown under the FTC.
01:12:58But then, of course, the victims being able to bring their own lawsuit, if they
01:13:02wish to do that. It's more effective for deterrence to have multiple things.
01:13:08Ms.
01:13:08Carlos, why did YouTube come to the table?
01:13:10You could have just made it a whole lot harder for the bill to move forward
01:13:14if you didn't make concessions and agree to be a part of advocating for the bill.
01:13:23Thank you for the question.
01:13:24And thank you for including us in that round of stakeholders.
01:13:27So YouTube sits in a very unique kind of universe.
01:13:30You know, we not only have our users and music partners and media partners, but
01:13:36we also have creators, and that is one area where this idea of digital
01:13:41replicas can cause real-world harm.
01:13:44So in addition to supporting No Fakes, which gives them the individual right to
01:13:48remove content, not just from YouTube but from other platforms, we're continuing
01:13:52to invest in new technology, which we refer to as likeness ID, which will allow
01:13:57the participating members in our pilot to have their face
01:14:03scanned.
01:14:04We'll be able to match across our platform.
01:14:06So we're continuing to invest in this technology, as we see it's a top issue.
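As a rough illustration of what platform-wide likeness matching can look like in general, here is a minimal sketch using face embeddings and cosine similarity. The function names, threshold, and toy embedding model are assumptions for illustration only; nothing here describes YouTube's actual likeness ID system.

```python
# Minimal sketch of likeness matching across a platform: enroll a reference
# embedding per participant, then compare embeddings of faces found in uploads
# by cosine similarity. The embedding model, threshold, and names here are
# illustrative assumptions, not YouTube's actual likeness ID system.
import numpy as np

THRESHOLD = 0.85  # assumed similarity cutoff for a likeness match

enrolled: dict[str, np.ndarray] = {}  # participant id -> reference embedding


def enroll(participant_id: str, face_embedding: np.ndarray) -> None:
    """Store a unit-normalized reference embedding for a pilot participant."""
    enrolled[participant_id] = face_embedding / np.linalg.norm(face_embedding)


def match(upload_embedding: np.ndarray) -> list[str]:
    """Return enrolled participants whose likeness the upload may contain."""
    query = upload_embedding / np.linalg.norm(upload_embedding)
    return [pid for pid, ref in enrolled.items()
            if float(np.dot(ref, query)) >= THRESHOLD]


# Usage with toy vectors standing in for real face embeddings:
rng = np.random.default_rng(0)
artist_vec = rng.normal(size=128)
enroll("artist_123", artist_vec)
upload_vec = artist_vec + rng.normal(scale=0.1, size=128)  # near-duplicate face
print(match(upload_vec))  # ['artist_123']
```

In a real deployment the embeddings would come from a trained face or voice model rather than random vectors, and the threshold would be tuned against false-match rates; the point of the sketch is only the enroll-then-match structure the testimony alludes to.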
01:14:10Thank you very much.
01:14:11Senator Blackburn, you're recognized.
01:14:14Thank you, Senator Coons.
01:14:16Mr.
01:14:16Glazer, I want you to touch on contracts.
01:14:20We've had quite a discussion this week on copyright, and as artists negotiate
01:14:28these contracts for their name, their image, their likeness, recently SAG-
01:14:34AFTRA made a move in some of their negotiations on this, but talk a
01:14:41little bit about the importance of having a federal standard as it relates
01:14:47to standard contract law.
01:14:49Yeah, this goes to the very essence of consent for the artist.
01:14:54And so not only does the No Fakes Act give control and consent to the
01:15:00individual about the use of their voice and the use of their likeness, it also
01:15:04imposes some guardrails around the length of those
01:15:08contracts, what those contracts mean when the person is alive, and what they
01:15:12mean after the person passes.
01:15:16And it also has special provisions that protect minors who might enter
01:15:21into contracts; that includes parents and guardians and also court authority.
01:15:26So it does a very good job of preventing abuse while giving the power to the
01:15:32individual whose voice is at stake and whose image is at stake in being able to
01:15:36license it.
01:15:39Thank you so much, Ms.
01:15:40Price.
01:15:41I want you to submit to us, and you can do this in writing:
01:15:47when we look at the physical world and the statutes that exist for protecting
01:15:55individuals from some of the harms that you've listed today, and
01:16:03the importance of Take It Down and the importance of No Fakes, we don't
01:16:09have all of those criminal statutes that transfer to the virtual space.
01:16:15And I would like for you to give me a summary of your thoughts on that.
01:16:21Your testimony is expansive and helpful.
01:16:25And as I said, we've submitted that whole testimony. Mr.
01:16:30Brookman, we've submitted your entire testimony also, and we thank you for
01:16:34that, but I would like to have just a little bit more from you on that issue
01:16:39of those protections, as we've talked about No Fakes and the COPIED Act
01:16:46and COSA.
01:16:47We talk often about this difference, and you touched on it, and I'd like to
01:16:52have something more expansive with that.
01:16:56We have no further members present and no further questions.
01:17:01I will remind you all that members have five days to submit questions for
01:17:07the record, and then you're going to have 10 days to return those answers to us.
01:17:13I thank you all.
01:17:14Our witnesses have been wonderful today.
01:17:17We appreciate your testimony for the record.
01:17:20And with that, the hearing is adjourned.
