During a Senate Judiciary Committee hearing on Wednesday, Sen. Josh Hawley (R-MO) claimed the Meta CEO approved using stolen material to train AI models.

Transcript
00:00 The hearing today, which is entitled,
00:07 "Too Big to Prosecute: Examining the AI Industry's Mass
00:11 Ingestion of Copyrighted Works for AI Training."
00:15 This is the third hearing of the Senate Judiciary Committee's
00:17 Subcommittee on Crime and Counterterrorism, which I'm delighted to work on with my
00:21 colleague, Ranking Member Durbin. I want to say a special thank you
00:25 to the witnesses for being here. Many of you, all of you,
00:27 traveled in order to be here today. Thanks to everybody for accommodating our
00:31 change in time. The Senate floor is going to be tied up here later today and thus
00:35 no committee business is happening. So thanks all of you for being here and for
00:39 accommodating us. I'm going to make just a few opening remarks. Senator Durbin will
00:44 do the same, then we'll swear in the witnesses and be off to the races. Let me
00:47 just start by saying that today's hearing is about the largest intellectual property
00:51 theft in American history. For all of the talk about artificial intelligence and
00:56 innovation and the future that comes out of Silicon Valley, here's the truth that
01:00 nobody wants to admit. AI companies are training their models on stolen material,
01:06 period. That is just the fact of the matter. And we're not talking about these
01:10 companies simply scouring the internet for what's publicly available. We're talking
01:15 about piracy. We're talking about theft. For years, AI companies have stolen massive
01:22 amounts of copyrighted material from illegal online repositories. Now, the FBI and the
01:29 Department of Homeland Security regularly prosecute individuals who engage in exactly the same
01:36 kind of behavior using platforms like LimeWire or Napster in the old days, using a process
01:42 called torrenting. But have these big tech companies been prosecuted? No, of course not.
01:49 They're getting off scot-free. And this hearing will show us that Meta and Anthropic and other AI
01:56 companies are willfully using these illegal networks, these torrenting networks, as they're
02:01 called, to steal vast swaths of copyrighted materials. The amount of material that we're
02:07 talking about is absolutely mind-boggling. We're talking about every book and every academic article
02:14 ever written. Let me say that again. Every book and every article ever written. Billions of pages of
02:24 copyrighted works. Enough to fill 22 libraries the size of the Library of Congress. Think about that.
02:32 22 Libraries of Congress full of works. That is how much has been stolen. And this theft was not some
02:39 innocent mistake. They knew exactly what they were doing. They pirated these materials
02:43 willfully. As the idea of pirating copyrighted works percolated through Meta, to take one example,
02:50 employee after employee warned management that what they were doing was illegal. One Meta employee
02:57 told management that, and I quote now, "this is not trivial." And he shared an article asking,
03:03 "what is the probability of getting arrested for using torrents, illegal downloads, in the United States?"
03:09 Another Meta employee shared a different article saying that downloading from illegal repositories
03:14 would open Meta up to legal ramifications. That's a nice way of saying that what they were doing
03:20 was exactly, totally, 100% barred by copyright law. Did Meta management listen? No.
03:28 They bulldozed straight ahead. We'll see evidence today that Mark Zuckerberg himself approved the decision
03:35 to use these pirated materials. And then, the best part, Meta management tried to hide it.
03:40 They tried to hide the fact that they were engaged in the illegal download of pirated works. And not just
03:46 the illegal download, but the illegal distribution of these same works. They tried to hide it by using
03:51 non-company servers. They went so far, get this, as to train their AI model
03:59 to lie to users about what data it had been trained on. I mean, you talk about an Inception-level
04:05 deception. Training the AI model to lie about what its own sources were. This isn't just aggressive business
04:10 tactics. This is criminal conduct. And I just want to point out, Meta's conduct is not an exception. This is the rule
04:18 when it comes to what is happening right now in the AI space among these mega companies.
04:23 Big tech operates on the model of, do whatever you want, and count on the lobbyists and the lawyers
04:28 to fix it later. They don't care about the rule of law. They don't care about America. They don't care
04:32 about freedom. They certainly don't care about working people. They care about power, and they care about
04:36 money. And every time they say things like, we can't let China beat us, let me just translate that for
04:42 you. Every time they say that, oh, we can't let China beat us, what they're really saying is, give us
04:46 truckloads of cash and let us steal everything from you and make billions of dollars on it.
04:50 That's the translation. We're going to see that in the testimony and the evidence today. Here's
04:56 the bottom line. We have got to do something to protect the people of this country. I'm all for
05:04 innovation, but not at the price of illegality. I'm all for innovation, but not at the price of
05:08 destroying the intellectual property of the average man and woman in this country. We have laws for a
05:14 reason. Those laws ought to be enforced, and big tech should not be above the law. Enough is enough.
05:20 It's time to enforce the law, and that's what this hearing today is about. Now I'll turn it over to
05:24 Ranking Member Durbin.
