Sen. Maria Cantwell (D-WA) leads a Senate Commerce Committee hearing on artificial intelligence and the need to protect Americans' privacy.
Fuel your success with Forbes. Gain unlimited access to premium journalism, including breaking news, groundbreaking in-depth reported stories, daily digests and more. Plus, members get a front-row seat at members-only events with leading thinkers and doers, access to premium video that can help you get ahead, an ad-light experience, early access to select products including NFT drops and more:
https://account.forbes.com/membership/?utm_source=youtube&utm_medium=display&utm_campaign=growth_non-sub_paid_subscribe_ytdescript
Stay Connected
Forbes on Facebook: http://fb.com/forbes
Forbes Video on Twitter: http://www.twitter.com/forbes
Forbes Video on Instagram: http://instagram.com/forbes
More From Forbes: http://forbes.com
Transcript
00:00:00The Committee on Commerce, Science, and Transportation will come to order.
00:00:03I want to thank the witnesses for being here today for testimony on the need to protect
00:00:08Americans' privacy and AI as an accelerant to the urgency of passing such legislation.
00:00:15I want to welcome Dr. Ryan Calo, University of Washington School of Law and co-director
00:00:20of the University of Washington Tech Policy Lab, Ms. Amba Kak, co-executive director
00:00:26of the AI Now Institute in New York, Mr. Udbhav Tiwari, director of global product policy at Mozilla,
00:00:36San Francisco, and Mr. Morgan Reed, president of ACT, the App Association of Washington,
00:00:43D.C.
00:00:44Thank you all for being here for this very, very important hearing.
00:00:49We are here today to talk about the need to protect Americans' privacy and why AI is
00:00:54an accelerant that increases the urgency of passing legislation soon.
00:01:00Americans' privacy is under attack.
00:01:02We are being surveilled.
00:01:13We are being surveilled and tracked, online and in the real world, through connected devices.
00:01:23And now, when you add AI, it's like putting fuel on a campfire in the middle of a windstorm.
00:01:30For example, a Seattle man's car insurance increased by 21 percent because his Chevy
00:01:35Bolt was collecting detailed information about his driving habits and sharing it with data
00:01:39brokers who then shared it with his insurance company.
00:01:42The man never knew the car was collecting the data.
00:01:46Data about our military members, including contact information and health conditions,
00:01:50is already available for sale by data brokers for as little as 12 cents.
00:01:55Researchers at Duke University were able to buy such data sets covering thousands of our active
00:02:00military personnel.
00:02:03Every year, Americans make millions of calls, texts, chats to crisis lines seeking help
00:02:08when they are in mental distress.
00:02:10You would expect this information would be kept confidential, but a nonprofit suicide
00:02:15crisis line was sharing data from those conversations with its for-profit affiliates that it was
00:02:21using to train its AI product.
00:02:25Just this year, the FTC sued a mobile app developer for tracking consumers' precise
00:02:29location through software embedded in a grocery list and shopping rewards app.
00:02:34The company used this data to sort consumers into precise audience segments.
00:02:40Consumers whose use of this app helped them remember when to buy peanut butter didn't
00:02:45expect to be profiled and categorized into a precise audience segment like, quote, parents
00:02:51of preschoolers.
00:02:53These privacy abuses and millions of others that are happening every day are bad enough,
00:02:58but now AI is an accelerant, and that is why we need to speed up passage of
00:03:03our privacy law.
00:03:05AI is built on data, lots of it.
00:03:08Tech companies can't get enough of it to train their AI models: your shopping habits, your
00:03:13favorite videos, who your kids' friends are, all of that.
00:03:17And we're going to hear testimony today from Professor Calo about this issue, about how
00:03:23AI gives the capacity to derive sensitive insights about individuals.
00:03:30So it is not just the data that is being collected.
00:03:33It is the ability of the system to derive sensitive insights about individuals.
00:03:39This, as you say in your testimony, is creating an inference
00:03:45economy that could become very challenging.
00:03:49That is why I think you also point out, Dr. Calo, that a privacy law helps offset the
00:03:57power of these corporations and why we need to act.
00:04:01I also want to thank Ms. Kak for her testimony, because she is clearly talking about that
00:04:06same corporate power and the unfair and deceptive practices, which we've already given to the
00:04:11FTC as their main authority.
00:04:14But the lack of transparency about what is going on with prompts and AI means
00:04:23that companies are no longer just taking personal data and sending us cookie ads.
00:04:33They are taking that data and actually putting it into prompt information.
00:04:38This is a very challenging situation.
00:04:42And I think your question, whether we are going to allow our personal data to train AI models,
00:04:48is a very important question for our hearing today.
00:04:52We know that they want this data to feed their AI models to make the most amount of
00:04:58money.
00:04:59These incentives are really a race to the bottom, where the most privacy-protective companies
00:05:05are at a competitive disadvantage.
00:05:09Researchers project that if current trends continue, companies training large language
00:05:13models may run out of new publicly available high-quality data to train AI systems as early
00:05:19as 2026.
00:05:21So without a strong privacy law, when the public data runs out, nothing's stopping it
00:05:26from using our private data.
00:05:28I am very concerned that the ability of AI to collect vast amounts of personal data about
00:05:34individuals and create inferences about them quickly at very low cost can be used in harmful
00:05:43ways like charging consumers different prices for the same product.
00:05:48I talked to a young developer in my state.
00:05:50I said, what is going on?
00:05:52And he said, well, I know one country is using AI to basically give it to their businesses.
00:05:56I said, well, why would they do that?
00:05:58Well, because they want to know when a person calls up for a reservation at a restaurant
00:06:04how much income they really have.
00:06:07If they don't really have enough money to buy a bottle of wine, they're giving the reservation
00:06:11to someone else.
00:06:12So the notion is that discriminatory practices can already exist with just a small amount
00:06:18of data about consumers.
00:06:20AI in the wrong hands is also, though, a weapon.
00:06:24Deepfake phone scams are already plaguing my state.
00:06:27Scammers used AI to clone voices to defraud consumers by posing as a loved one in need
00:06:35of money.
00:06:36These systems can recreate a person's voice in just minutes, taking the familiar grandparent
00:06:41scam and putting it on steroids.
00:06:44More alarming, earlier this month, the director of national intelligence reported that Russian
00:06:48influence actors are planning to covertly use social media to subvert our elections.
00:06:53The ODNI called AI, quote, a malign influence accelerant, end quote, saying that it was
00:07:00being used to more convincingly tailor video and other content ahead of the November election.
00:07:07Just two days ago, the Department of Justice reported that it dismantled a Russian bot
00:07:11farm intended to sow discord in the United States.
00:07:15Using AI, Russia created scores of fictitious user profiles on X and generated posts, and
00:07:25then used those bots to repost and like comments on the posts, further amplifying the original
00:07:32fake posts.
00:07:33So this was possible at tremendous scale, given AI.
00:07:37I'm not saying that misinformation did not exist before or was never placed
00:07:42in a chat group, but now, with the use of bots and AI as an accelerant, that information can be
00:07:49more broadly disseminated very, very quickly.
00:07:53So privacy is not a partisan issue. According to Pew Research, a majority of Americans
00:07:58across the political spectrum support more regulation.
00:08:01I believe our most important private data should not be bought or sold without our approval,
00:08:07and tech companies should make sure that they implement these laws and help stop
00:08:14this kind of interference.
00:08:16The legislation that Representative McMorris Rodgers and I have worked on, I think, does
00:08:22just that, and I want to say I very much appreciate the legislation that Senator Blackburn
00:08:29and I will be introducing this morning, called the COPIED Act, which provides much needed transparency
00:08:34around AI-generated content.
00:08:37The COPIED Act will also put creators, including local journalists, artists, and musicians,
00:08:41back in control of their content with a watermark process that I think is very much needed.
00:08:47I'll now turn to the ranking member, Senator Cruz, for his opening statement.
00:08:52Thank you, Madam Chair.
00:08:55American prosperity depends upon entrepreneurs.
00:08:58These are ambitious and optimistic men and women who are willing to take risks, pursue
00:09:04their dreams, and try to change the world.
00:09:07They mortgage their own homes, they put everything on the line to build a business that fills
00:09:11an unmet need or does something better than what's offered today.
00:09:16But throughout history, prosperity and human flourishing have been stymied or delayed by
00:09:22governments that impose regulatory policies to address supposed harms but in actuality
00:09:29overstated risk in order to protect incumbent operators, often large and powerful companies
00:09:37that didn't want to compete and that just happened to give big campaign checks to the
00:09:41politicians in power.
00:09:44The United States has mostly chosen a different path.
00:09:50One where a free enterprise system governed by the rule of law allows Americans to freely
00:09:56pursue their ideas, grow their own businesses, and compete without having to obtain permission
00:10:02from all-knowing bureaucrats.
00:10:05Today's hearing on data privacy and artificial intelligence is a debate about which regulatory
00:10:10path we will take.
00:10:12Do we embrace our proven history, one with entrepreneurial freedom and technological
00:10:18innovation?
00:10:20Or will we adopt the European model, where government technocrats get to second-guess
00:10:25and manage perceived risks with economic activity, ultimately creating an environment where only
00:10:32big tech, with its armies of lawyers and lobbyists, exists?
00:10:38Consider this.
00:10:39In 1993, at the dawn of the tech age, the economies of the United States and the European
00:10:45Union were roughly equal in size.
00:10:48Today, the American economy is nearly 50% larger than the EU's.
00:10:56The tech boom happened in America in part because Congress and the Clinton administration
00:11:01deliberately took a hands-off approach to the nascent internet.
00:11:07The result was millions of jobs and a much higher standard of living for Americans.
00:11:14Unfortunately, the Biden administration and many of my colleagues are suggesting the European
00:11:21model for AI, based heavily on hysterical doomsday prophecies to justify a command-and-control
00:11:29federal regulatory scheme that will cause the United States to lose our technological
00:11:35edge over China.
00:11:37The Biden administration's AI executive actions, as well as many of the AI legislative proposals,
00:11:44call for a new regulatory order that protects giant incumbent operators and discourages
00:11:52innovation with supposedly optional best practices or guidance written by all-knowing bureaucrats,
00:12:02some of whom were recently employed by the same big tech firms they seek to regulate
00:12:07and some of whom hope to be employed again by those same big tech firms right after they
00:12:12write the rules that benefit those giant big tech firms.
00:12:16We already see federal AI regulators and Biden allies talking about the need to stop quote
00:12:22bias, quote misinformation, quote discrimination in AI systems and algorithms.
00:12:31That's code for speech police.
00:12:34If they don't like what you say, they want to silence it.
00:12:39Now, AI can certainly be used for nefarious purposes, just like any other technology.
00:12:46But to address specific harms or issues, we should craft appropriate and targeted responses.
00:12:54For example, Senator Klobuchar and I have introduced the bipartisan Take It Down Act,
00:13:00which targets bad actors who use AI to create and publish fake, lifelike, explicit images
00:13:07of real people.
00:13:10Our bill, which is sponsored by many Republican and Democrat members of this committee, would
00:13:16also require big tech to follow a notice and take down process so ordinary Americans who
00:13:22are victimized by these disturbing images can get them offline immediately.
00:13:28The bipartisan Take It Down Act is a tailored solution to a real problem.
00:13:33On behalf of the teenage girls and others who've been victimized by deepfake explicit
00:13:39imagery, I hope that this committee will take up the Take It Down Act soon, pass it,
00:13:45move it to the floor, and get it signed into law.
00:13:49As I conclude, I'd like to address a related matter, the American Privacy Rights Act, APRA.
00:13:58I support Congress, not the FTC or any federal agency, but Congress setting a nationwide
00:14:04data privacy standard.
00:14:06Not only is it good for Americans to be empowered with privacy protections, but it's good for
00:14:11American businesses that desperately need legal certainty given the increasingly complex
00:14:16patchwork of state laws.
00:14:19But our goal shouldn't be to pass any uniform privacy standard, but rather the right standard
00:14:26that protects privacy without preventing U.S. technological innovation.
00:14:31I've discussed APRA with Chairwoman McMorris Rodgers and will continue my offer to work with her.
00:14:37But right now, this bill is not the solution.
00:14:41It delegates far too much power to unelected commissioners at the FTC.
00:14:47It focuses on algorithmic regulations under the guise of civil rights, which would directly
00:14:53empower the DEI speech police efforts underway at the Biden White House, harming the free
00:15:00speech rights of all Americans.
00:15:02As currently constructed, APRA is more about federal regulatory control of the Internet
00:15:08than personal privacy.
00:15:10In the end, it's the giant companies with vast resources that ultimately benefit from
00:15:15bills like APRA at the expense of small businesses.
00:15:20The path that Congress needs to take is to put individuals in control of the privacy
00:15:25of their own data and give them transparency to make decisions in the marketplace.
00:15:30And I look forward to working with my colleagues to do exactly that.
00:15:34Thank you, Senator Cruz.
00:15:35We'll now turn to our panel, starting with Dr. Calo.
00:15:39Thank you so much.
00:15:40We're really proud of the work that the University of Washington has done.
00:15:45I think we're on both fronts.
00:15:47Senator Cruz mentioned the innovation economy.
00:15:49I think we have that down in the Northwest, but we also want to make sure we have
00:15:54the important protections that go along with it.
00:15:57So thank you, Dr. Calo, for your presence.
00:16:02Chair Cantwell, Ranking Member Cruz, and members of the committee, thank you for the opportunity
00:16:07to share my research and views on this important topic.
00:16:10I'm a law professor and information scientist at the University of Washington, where I co-founded
00:16:15the Tech Policy Lab and Center for an Informed Public.
00:16:19The views I express today are entirely my own.
00:16:23Americans are not receiving- Dr. Calo, could you just bring that microphone
00:16:26a little closer?
00:16:27Of course.
00:16:28Thank you.
00:16:31Americans are not receiving the privacy protections they demand or deserve.
00:16:36Not when Cambridge Analytica tricked them into revealing personal details of 87 million
00:16:41people through a poorly vetted Facebook app.
00:16:45Not when car companies share their driving habits with insurance companies without their
00:16:50consent, sometimes leading to higher premiums, as the senator mentioned.
00:16:55Privacy rules are long overdue, but the acceleration of artificial intelligence in recent years
00:17:01threatens to turn a bad situation into a dire one.
00:17:05AI exacerbates consumer privacy concerns in at least three ways.
00:17:10First, AI fuels an insatiable demand for consumer data.
00:17:15Sources of data include what is available online, which incentivizes companies to scour
00:17:20and scrape every corner of the internet, as well as the company's own internal data, which
00:17:27incentivizes them to collect as much data as possible and store it indefinitely.
00:17:33AI's insatiable appetite for data alone deeply exacerbates the American consumer privacy
00:17:39crisis.
00:17:40Second, AI is increasingly able to derive the intimate from the available.
00:17:47Many AI techniques boil down to recognizing patterns in large data sets.
00:17:52Even so-called generative AI works by guessing the next word, pixel, or sound in order to
00:17:57produce new text, art, or music.
00:18:01Companies increasingly leverage this capability of AI to derive sensitive insights about individual
00:18:06consumers from seemingly innocuous information.
00:18:10The famous detective Sherlock Holmes, with the power to deduce whodunit by observing
00:18:15a string of facts most people would overlook as irrelevant, is the stuff of literary fiction.
00:18:22But companies really can determine who is pregnant based on subtle changes to their
00:18:27shopping habits, as Target reportedly did in 2012.
00:18:32And finally, AI deepens the asymmetries of information and power between consumers and
00:18:37companies that consumer protection law exists to arrest.
00:18:43The American consumer is a mediated consumer.
00:18:46We increasingly work, play, and shop through digital technology.
00:18:51And a mediated consumer is a vulnerable one.
00:18:55Our market choices, what we see, choose, and click, are increasingly scripted and arranged
00:19:00in advance.
00:19:02Companies have an incentive to use what they know about individual and collective psychology,
00:19:08plus the power of design, to extract as much money and attention as they can from everyone
00:19:13else.
00:19:16The question is not whether America should have rules governing privacy.
00:19:20The question is why we still do not.
00:19:24Few believe that the internet, social media, or AI are ideal as configured.
00:19:29A recent survey by the Pew Research Center suggests that an astonishing 81% of Americans
00:19:36assume that companies will use AI in ways for which they are not comfortable.
00:19:4181%.
00:19:44Just for context, something between 30% and 40% of Americans identify as Taylor Swift fans.
00:19:50Meanwhile, the EU, among our largest trading partners, refuses to certify America as adequate
00:19:58on privacy, and does not allow consumer data to flow freely between our economies.
00:20:04What is the point of American innovation if no one trusts our inventions?
00:20:11More and more individual states, from California to Colorado, Texas to Washington, are passing
00:20:16privacy or AI laws to address their residents' concerns.
00:20:21Congress can and should look to such laws as a model, yet it would be unwise to leave
00:20:26privacy legislation entirely to the states.
00:20:30The internet, social media, and AI are global phenomena.
00:20:34They do not respect state boundaries.
00:20:37And the prospect that some states will pass privacy rules is small comfort to the millions
00:20:41of Americans who reside in states that have not.
00:20:46Congress should pass comprehensive privacy legislation that protects American consumers,
00:20:51reassures our trading partners, and gives clear, achievable guidelines to industry.
00:20:56Data minimization rules, which obligate companies to limit the data they collect and maintain
00:21:01about consumers, could help address AI's insatiable appetites.
00:21:07Broader definitions of covered data could clarify that inferring sensitive information
00:21:11about consumers carries the same obligations as collecting it.
00:21:15And rules against data misuse could help address consumer vulnerability in the face of a growing
00:21:21asymmetry.
00:21:23Thank you for the opportunity to testify before the committee.
00:21:25I look forward to a robust discussion.
00:21:28Thank you, Dr. Calo.
00:21:29Ms. Kak, thank you so much.
00:21:30Welcome.
00:21:31We look forward to your testimony.
00:21:33Thank you, Chair Cantwell, Ranking Member Cruz, and esteemed members of this committee.
00:21:37Thank you for inviting me to testify.
00:21:39We're at a very clear inflection point in the trajectory of AI.
00:21:44Without guardrails to set the rules of the road, we're committing ourselves to more of
00:21:48the same, to carrying forward extractive, invasive, and often predatory data practices
00:21:54and business models that have characterized the past decade of the tech industry.
00:22:00We are also committing ourselves to the seamless transition of big tech from surveillance monopolies
00:22:05to AI monopolies.
00:22:07A federal data privacy law could break this cycle, especially one with strong data minimization,
00:22:14which could challenge the culture of impunity and recklessness in the AI industry that's
00:22:18already hurting both consumers and competition.
00:22:22If there's a single point I want to make in today's testimony, it is that now is the moment
00:22:26when passing such a law actually matters before the trajectory has been set.
00:22:32Data privacy regulation is AI regulation, and it provides many of the tools that we
00:22:36need to protect the public from harm.
00:22:39But let's get concrete.
00:22:40How might the AI market shape up differently in the presence of a strong data privacy law?
00:22:46First, with data minimization rules, firms would need to put reason in place of recklessness
00:22:53when it comes to decisions about what data to collect, the purposes to which it can be
00:22:58used, and for how long it can be stored.
00:23:01These requirements would empower lawmakers but also the public to demand very basic accountability.
00:23:07So before Microsoft, for example, rushes to release its new Recall AI feature, which
00:23:12by the way takes continuous screenshots of everything you see or do on your computer,
00:23:17the company might need to ask itself and importantly document its answer.
00:23:22Does the utility of this feature outweigh the honeypot it's creating for bad actors?
00:23:27As a security researcher quickly discovered with Microsoft Recall, it's actually scarily
00:23:32trivial for an attacker to use malware to extract a record of everything you've ever
00:23:37viewed on your PC.
00:23:39Now a strong data minimization mandate would have nipped this in the bud.
00:23:43It would have likely disincentivized the development of such a patently insecure feature to begin with.
00:23:50Second, we would have transparency about the data decisions that affect us all.
00:23:56Meta and Google recently announced updates unilaterally, of course, to their terms, which
00:24:01allow AI training from user data.
00:24:04Now we know about this because European users were alerted by Meta.
00:24:08Without a legal mandate to require it, of course, American users received no such notification.
00:24:13And users of Reddit are also similarly up in arms because their content was just sold
00:24:19to the highest bidder, in this case Google, for use of training its AI.
00:24:24But it's important to remember that a privacy law would offer much more than just transparency
00:24:29in every single instance I mentioned.
00:24:32Purpose limitation rules would prevent big tech from using AI as its catch-all justification
00:24:38to use and combine data across contexts and store it forever.
00:24:43The FTC has already penalized Amazon for storing the voice data of children indefinitely using
00:24:49AI as its excuse.
00:24:51But we can't rely on this kind of one-off enforcement.
00:24:54There needs to be rules of the road.
00:24:57And these moves, it's important to remember, would not just safeguard our privacy.
00:25:01They would also act as a powerful check on the data advantages currently being consolidated
00:25:06by big tech to build a moat around them and stave off competition in AI.
00:25:11Thirdly, in a world with a data privacy mandate, AI developers would need to make data choices
00:25:17that deliberately prevent discriminatory outcomes.
00:25:20So we shouldn't be surprised when we see, you know, that women are seeing far fewer ads
00:25:24for high-paying jobs on Google Ads.
00:25:26That is 100% a feature of data decisions that have already been made upstream.
00:25:30I mean, the good news here is that these are avoidable problems.
00:25:34And it's not just in scope for a privacy law.
00:25:37I would say it is integral to protecting people from the most serious abuses of this data.
00:25:43And where specific AI practices have sort of inherent, well-established harms, such as emotion
00:25:47recognition systems that have faulty scientific foundations or pernicious forms of ad targeting,
00:25:53the law would hold them entirely off limits.
00:25:56Finally, and here's the thing about very large-scale AI, it is not only computationally, ecologically,
00:26:04and data intensive, it's also very, very expensive to develop and run.
00:26:09Now these eye-watering costs will need a path to profit.
00:26:13And by every account, a viable business model still does not exist.
00:26:17Now it is precisely in this kind of an environment with a few incumbent firms feeling the pressure
00:26:23to turn a profit that predatory business models emerge.
00:26:26Meanwhile, we're hearing new research that LLM systems are capable of hyper-personalized
00:26:31inferences about us, even from the most general prompts.
00:26:35You do not need to be a clairvoyant to see that all roads might well be leading us right
00:26:39back to the surveillance advertising business model that got us here.
00:26:43So to conclude, there is nothing about the current trajectory of AI that is inevitable.
00:26:48And as a democracy, the U.S. has a huge opportunity to take global leadership and shape this next
00:26:54era of tech so that it reflects the public interests and not just the bottom lines of
00:26:58very few companies.
00:27:00This is the moment for action.
00:27:02Thank you.
00:27:04Thank you, Ms. Kak.
00:27:05Thank you for that testimony.
00:27:06And the notion in your testimony that just the sound of our voice
00:27:13could be used to project different detections and different outcomes of what people are
00:27:18doing is just very disturbing, very disturbing.
00:27:22So Mr. Tiwari, welcome.
00:27:27Chair Cantwell, Ranking Member Cruz, and esteemed members of the committee, thank you for the
00:27:32opportunity to testify on the critical issue of protecting Americans' privacy in the age
00:27:37of artificial intelligence.
00:27:40My name is Udbhav Tiwari, and I'm the Director of Global Product Policy at Mozilla.
00:27:44Today, I will discuss the urgent need for comprehensive privacy legislation, the importance
00:27:50of data minimization, and the role of privacy-enhancing technologies in fostering responsible AI development.
00:27:58At Mozilla, we approach tech policy issues from a unique vantage point.
00:28:02As a nonprofit foundation, an open source community, and a tech company, we build the
00:28:09open source Firefox web browser, Mozilla VPN, and products like Solo, an AI-powered
00:28:14website builder for micro SMBs and solopreneurs.
00:28:18These products are used by hundreds of millions of people around the world, and Mozilla's
00:28:23mission is to ensure that the Internet is a global public resource, open and accessible
00:28:28to all.
00:28:29We believe that comprehensive privacy legislation is foundational to any AI framework, and that
00:28:35maintaining U.S. leadership in AI requires America to lead on privacy and user rights.
00:28:43Privacy is a critical component of AI policy, not just because AI has the potential to accelerate
00:28:49privacy-related harms, but because AI systems thrive on data, and the drive to develop advanced
00:28:55AI models has also intensified the demand for vast amounts of personal information.
00:29:02Advanced data collection, often done without adequate consent or via deceptive choices,
00:29:07poses significant risks to individual privacy and security.
00:29:12By championing policies that promote innovation, create clear rules of the road for companies,
00:29:18and protect fundamental user rights, we can create both a competitive and a level playing
00:29:23field for the American AI industry, and prepare domestic champions for global leadership.
00:29:29At the core of these principles and policies should be data minimization.
00:29:34Data minimization is a crucial principle that ensures only the necessary data is collected
00:29:39and used for specific purposes.
00:29:42In the context of AI, data minimization can be achieved through several strategies, including
00:29:47informed consent and ensuring there is privacy by design.
00:29:52While legislation is essential, technical advances must work hand in hand with legislative
00:29:57solutions to create a more safe and private future.
00:30:01In the context of what the ecosystem currently needs, we need a significant investment in
00:30:07privacy-enhancing technologies to develop AI systems that respect and protect individual
00:30:13privacy.
00:30:14Openness and open source are also essential ingredients for improving verifiable and meaningful
00:30:20privacy in AI technologies.
00:30:23Open approaches play a vital role in promoting innovation and preventing the concentration
00:30:28of power in the hands of a few companies that my fellow panelists have spoken about.
00:30:34They enable the economic benefit of AI to be more widely shared amongst businesses of
00:30:39different sizes and capabilities.
00:30:42This leads to increased investment and job creation.
00:30:46We also have clear evidence from open source development practices that openness allows
00:30:51for diverse input and collaboration, fostering the development of privacy-preserving techniques
00:30:57that can benefit everyone rather than relying on security through obscurity.
00:31:02Turning to the potential risks of AI amplifying privacy violations, we know that online manipulation,
00:31:09targeted scams, and online surveillance are not new risks in our digital lives.
00:31:13However, AI technologies can supercharge such harms, creating risks like profiling and manipulation,
00:31:20bias and discrimination, and deep fakes and identity misuse.
00:31:24To mitigate these risks, we need comprehensive federal privacy legislation.
00:31:29This should be accompanied by strong regulatory oversight and continued investments in privacy-enhancing
00:31:35technologies.
00:31:36We must also ensure that AI systems are transparent and accountable, with mechanisms in place
00:31:42to address privacy violations and provide recourse for affected individuals, underpinned
00:31:47by disclosure and accountability.
00:31:50When it comes to AI's potential to impact civil liberties, the risk cannot be overstated.
00:31:56The same technologies that drive innovation can also be used to infringe upon fundamental
00:32:00rights and be used by big tech companies to trample on individual privacy.
00:32:06It is therefore imperative that AI development and deployment are guided by principles that
00:32:11protect civil liberties.
00:32:13This includes safeguarding freedom of expression, preventing unlawful surveillance, and ensuring
00:32:18that AI systems do not perpetuate discrimination or bias.
00:32:22In conclusion, protecting Americans' privacy in the age of AI is a critical challenge that
00:32:27requires comprehensive legislation.
00:32:30As we navigate the complexities of AI and privacy, it is crucial to strike a balance
00:32:34between innovation and protection.
00:32:37Thank you for the opportunity to testify today.
00:32:39I look forward to your questions and to working with you to protect Americans' privacy in
00:32:43the AI era.
00:32:44Thank you, Mr. Tiwari.
00:32:45I very much appreciate you being here.
00:32:47And Mr. Reid, thank you so much for being here.
00:32:49I'm not trying to ask you a question in advance, but I'm pretty sure you have thousands of
00:32:54members of your organization, and I'm pretty sure you have quite a few in the Pacific Northwest.
00:32:59So thank you for being here.
00:33:00We do.
00:33:01Many in the great state of Washington.
00:33:05Chair Cantwell, Ranking Member Cruz, and members of the committee, my name is Morgan
00:33:08Reid, President of ACT, the App Association, a trade association representing small and
00:33:13medium-sized app developers and connected device manufacturers.
00:33:16Thank you for the opportunity to testify today on two significant and linked subjects, privacy
00:33:21and AI.
00:33:22Let me say very clearly that the U.S. absolutely needs a federal privacy law.
00:33:28For years, we have supported the creation of a balanced bipartisan framework that gives
00:33:32consumers certain protections and businesses clear rules of the road.
00:33:37Instead, what we have now is a global array of mismatched and occasionally conflicting
00:33:41laws, including here in the U.S. with either 19 or 20, depending on who you ask, state-level
00:33:46comprehensive privacy laws, and more coming every year.
00:33:50To prevent this morass of confusion and paperwork, preemption must also be strong and without
00:33:55vague exceptions.
00:33:57And it must include small businesses so that customers can trust the data is being protected
00:34:02when they do business with a company of any size.
00:34:05Unfortunately, the American Privacy Rights Act, or APRA, falls short of both of these
00:34:10objectives.
00:34:11Carving out small businesses from the definition of covered entities, as APRA does, is a non-starter
00:34:16because it would deny us the benefits of the bill's preemption provisions.
00:34:20Instead, small business would be required to comply separately with 19 state laws and,
00:34:26more importantly, the not-yet-passed laws in 31 states, exposing us to costly state-by-state
00:34:32compliance and unreasonably high litigation costs.
00:34:35And it isn't just small tech impacted by this.
00:34:39In today's economy, everyone uses customers' data, even brick-and-mortar businesses.
00:34:43A local bike shop in Nevada likely has customers coming from Utah, Arizona, California, Colorado,
00:34:49and Idaho.
00:34:50An alert reminding these customers about a tire sale with free shipping or that it's
00:34:54time to get their bike in for a tune-up requires at least a passing familiarity with each state's
00:35:00small business carve-out.
00:35:02In the ultimate irony, APRA may even incent small businesses to sell customers' data in
00:35:07order to gain the benefit of preemption.
00:35:10Congress must instead move forward with a framework that incorporates small businesses
00:35:14and creates a path to compliance for them.
00:35:17And this acceptance of small businesses' role in the tech ecosystem becomes even more pronounced
00:35:21when we turn to AI.
00:35:24Mainstream media is all abuzz about big companies like Amazon, Microsoft, Google, and Apple
00:35:28moving into AI.
00:35:29But the reality is small business has been the faster adopter.
00:35:34More than 90% of my members use generative AI tools today, with an average 80% increase
00:35:41in productivity.
00:35:42And our members who develop those AI solutions are more nimble than larger rivals,
00:35:47developing, deploying, and adapting AI tools in weeks rather than months.
00:35:52Their experiences should play a major role in informing policymakers on how any new laws
00:35:57should apply to AI's development and use.
00:36:00Here are two examples of how our members are using AI in innovative ways.
00:36:04First up is our Iowa-based SwineTech.
00:36:06It's reshaping the management of hog farms.
00:36:09Farmers use their product, PigFlow, that's really the name, to manage the entire process
00:36:14of using sensors to track the health of sows and piglets, and then market analytics and
00:36:18public information to build AI-powered holistic solutions.
00:36:22But as Paul Simon said, "big and fat, pig's supposed to look like that."
00:36:26For our member MetricMate in Atlanta, the goal is the opposite.
00:36:30This Atlanta-based startup uses a combination of off-the-shelf and custom fitness trackers
00:36:34to help individuals and physical therapists track and refine fitness goals.
00:36:38AI helped MetricMate respond to the progress being made, both over time and instantaneously,
00:36:43while their patented tap sensor gives the user the ability to track their workouts and
00:36:48seamlessly transmit data to the MetricMate app.
00:36:52These examples show how AI is useful for innovative yet normal activities.
00:36:56So rather than limit AI as a whole, policymakers must target regulation to situations where
00:37:01a substantial risk of concrete harm exists.
00:37:05The risk of AI use on a pig farm should not be treated the same as risks in sectors like
00:37:10healthcare or wellness, and yet both of them might be covered by future laws.
00:37:15Finally, I want to stress the importance of standards to the future of AI.
00:37:20Standards are a valuable way for new innovators to make interoperable products that compete
00:37:24with the biggest market players.
00:37:26As the committee considers bills on the subject, we need NIST to remain a supporter rather
00:37:31than an arbiter of voluntary industry-led standards.
00:37:35The committee should also be aware of the threat to small business through standard
00:37:37essential patent abuse.
00:37:40With non-U.S.-based companies holding the crown for the most U.S. patents every year,
00:37:45federal policy must combat abuse of patent licensing in standards by ensuring that licenses
00:37:50are available to any willing licensee, including small businesses, on fair, reasonable, and
00:37:55non-discriminatory terms.
00:37:57If we're not capable of doing that, the next generation of AI standards will not be
00:38:01owned or run through U.S. companies, but through those outside of the U.S. with different perspectives
00:38:07on human rights and our expectations.
00:38:10So thank you for your time, and I look forward to answering your questions.
00:38:13Thank you, Mr. Reid, and on that last point, we'll have you expand today, or more generally
00:38:19for the record, on what we need to do about it.
00:38:23I definitely think in the past we have sided too much with the big companies on the patent
00:38:29side and not enough empowerment of the inventors, the smaller companies.
00:38:34So I want to make sure we get that part right in addition to your recommendations.
00:38:39You've made a couple of very good recommendations.
00:38:41Thank you.
00:38:42I'd like to go to a couple of charts here if we could.
00:38:44One, when we first started working on the privacy law, I'm going to direct this to you,
00:38:50Professor Calo, but what got me going was the transfer of wealth to online advertising.
00:38:58I don't think people really quite understood how much the television, newsprint, radio,
00:39:07magazine, the entire advertising revenue shift went online.
00:39:14Now we're just talking about the internet.
00:39:15Could you bring that a little closer, please?
00:39:18Put that up a little closer.
00:39:20So we are now at 68%.
00:39:33I don't know if people can see that, but we're now at 68%: two-thirds of all advertising
00:39:40spending now takes place online, powered by data and information.
00:39:48So that trend is just going to keep continuing.
00:39:51Now you and I have had many conversations about the effect of that on the news media,
00:39:57having community voices.
00:40:00Our community paper in Seattle, the Seattle Times, couldn't exist if it had misinformation.
00:40:06It just wouldn't exist.
00:40:09But in the online world, you can have misinformation.
00:40:11There's no corrective force for that.
00:40:13But all the revenue is now gone to the online world.
00:40:18And the second chart describes, I think, a little bit of your testimony that I want
00:40:22to ask a question about, and that is the amount of information that is now being derived about
00:40:29you; AI has this capacity to derive sensitive insights.
00:40:36So that trend that I just described, where two-thirds of all advertising revenue, I mean,
00:40:41somebody said data is like the new oil, right?
00:40:44It's just where everybody is going to go and make the money.
00:40:47So that's a lot of money already in that shift over those years that I mentioned on the chart.
00:40:54But now you're saying they're going to take that information, and they're going to derive
00:41:00sensitive information about us.
00:41:02Ms. Kak said it's the way your voice sounds.
00:41:05You've described it as various features.
00:41:08So could you tell me how protecting us against that in the AI model, why that's so important?
00:41:16And I just want to point out, we're very proud of what the Allen Institute is doing on AI.
00:41:22We think we're the leaders in AI applications.
00:41:25We're very proud of that, both in health care, farming, energy.
00:41:32We have an agreement today between the United States and Canada in principle on the Columbia
00:41:36River Treaty.
00:41:37I think water AI will be a big issue of the future.
00:41:41How do you manage your natural resources to the most effective possible use?
00:41:45So we're all big on the AI implications in the Pacific Northwest.
00:41:50We're very worried about the capacity to derive sensitive insights.
00:41:56And then as you mentioned, an insurance company or somebody using that information against
00:42:01you.
00:42:02Could you expound on that, please?
00:42:05Absolutely.
00:42:07I was talking to my sister who's on the board of the Breast Cancer Alliance about my testimony.
00:42:12And she said, Ryan, just make sure that people know how important it is for AI to be able
00:42:18to spot patterns in medical records to ensure that people get better treatment, for example,
00:42:24for breast cancer.
00:42:25And I agree with that.
00:42:26And I'm also proud of all the work we're doing at the University of Washington and the Allen Institute.
00:42:30The problem is that the ability to derive sensitive insights is being used in ways that
00:42:36disadvantage consumers.
00:42:39And they're not able to figure out what's going on and fight back.
00:42:44So for example,
00:42:45Thereby driving up costs.
00:42:47For example, we know why everything costs $9.99.
00:42:52It's because your brain thinks of it as being a little bit further away from $10 than it
00:42:55really is.
00:42:57But the future we're headed to, and even the present, is a situation where you're charged
00:43:02exactly as much as you're willing to pay in the moment.
00:43:06Say I'm trying to make dinner for my kids, and I'm just desperately trying to find a
00:43:12movie for them to stream that they both can agree on.
00:43:19If Amazon can figure that out, or Apple can figure that out, they can charge me more in
00:43:26the moment when I'm flustered and frustrated.
00:43:29Because they can tell.
00:43:30And if that sounds far-fetched, Senator, Uber once experimented with whether or not people
00:43:36would be more willing to pay surge pricing when their batteries were really low on their
00:43:42phone.
00:43:43Because they'd be desperate to catch a ride, right?
00:43:47Amazon itself has gotten into trouble for beginning to charge returning customers more
00:43:53because they know that they have you in their ecosystem.
00:43:57This is the world of using AI to extract consumer surplus.
00:44:02And it's not a good world.
00:44:04And it's one that data minimization could help address.
00:44:11Senator Wicker.
00:44:12Well, thank you very much, Madam Chair.
00:44:20Mr. Reid, let me go to you.
00:44:21You've testified before on this topic.
00:44:26Where are most of the jobs created in the United States economy?
00:44:31What size business?
00:44:32Right.
00:44:33Small businesses are the single largest source of new jobs in the United
00:44:39States.
00:44:40Okay.
00:44:41And so let's talk about how the current situation affects small and medium-sized businesses
00:44:48and how legislation would affect them positively or negatively.
00:44:53Let's pretend I am a small business owner in the state of Washington or the state of
00:45:00Mississippi.
00:45:01And I use the Internet to sell products in multiple states.
00:45:07That would be a very typical situation, would it not?
00:45:11Yes.
00:45:12In fact, one of the hardest things that I think has been transformative, but yet it's
00:45:17hard for legislatures to understand, is our smallest members are global actors.
00:45:22My smallest developers are selling products around the world and are developing a customer
00:45:26base around the world.
00:45:27And in many ways, you can think of it this way.
00:45:30Maybe there's 5,000 people in your hometown who want to buy your product, but there's
00:45:34500,000 people who want to buy your product everywhere.
00:45:38So as a small business, the Internet allows you to reach all of those customers.
00:45:41The problem with APRA carving out small business is all of a sudden now, a small business that
00:45:47wants to grow from 10,000 customers in the great state of Mississippi to 500,000 customers
00:45:52throughout the United States has to comply.
00:45:59The law needs to apply evenly to all actors.
00:46:04Let's talk about preemption.
00:46:07If I'm this small business person in the state of Washington and Congress passes a new data
00:46:14privacy law, but the preemption clause has enumerated exceptions for various reasons,
00:46:21how is that going to affect me?
00:46:23Once again, the people who are best equipped to deal with complex compliance regimes are
00:46:29very large companies that hire large numbers of lawyers.
00:46:32So we really need a compliance regime and a preemption regime that's easy to understand
00:46:38and is applicable when my businesses don't have a compliance team.
00:46:42Heck, they don't even have a compliance officer.
00:46:44Probably the chief dishwasher is also the chief compliance officer.
00:46:48And I think that's an important thing to understand about small businesses is they don't have
00:46:51teams of lawyers.
00:46:52How about a broad private right of action?
00:47:00How is that going to affect me and my business?
00:47:05Well, I think it's interesting that you bring it up.
00:47:07When I testified before you on this same topic, we had great discussions about the fact that
00:47:11there may be needs occasionally for private right of action, but they should be limited
00:47:15in scope.
00:47:16And I think that's more true now than ever before.
00:47:19If we have a privacy regime that exposes small business and everyone else to individual
00:47:25private right of action from multiple states, it sets up small businesses to be the best
00:47:30victim for sue and settle.
00:47:32Nobody wants to go to war with Microsoft.
00:47:34But I can send you, a small business, a "pay me now" note for $50K.
00:47:38I'm going to go to my local small town lawyer and say, can you fight this?
00:47:42He's going to say it's going to be $150,000 to fight it.
00:47:45So you're going to stroke a check, and you're going to not pay an employee or not hire someone
00:47:49with that $50K.
00:47:50So we want to avoid a private right of action that leads to the ability for unscrupulous
00:47:55lawyers to make a killing off of sue and settle, particularly against small businesses, where the
00:48:01cost of fighting is greater than the cost of the check they're asking for.
00:48:05Okay.
00:48:06You know, we're going to run out of time real quick.
00:48:09Mr. Calo, on page six of your testimony, you say expecting tech companies to comply
00:48:21with a patchwork of laws, and that would be small and large tech companies, a patchwork
00:48:27of laws depending on what state a consumer happens to access their services is unrealistic
00:48:32and wasteful.
00:48:34I hear people say this.
00:48:37Let's pass a nationwide data privacy act.
00:48:41But if a state like California wants to really, really protect even better, let's let them
00:48:47do so.
00:48:48Let's give states like that an opt-out ability.
00:48:57Doesn't that start the patchwork all over again?
00:49:03Senator, it's an excellent question.
00:49:06My view is that in a perfect world, the federal government would set a floor, and then the
00:49:13individual states, if they wanted to be more protective of their own residents, right,
00:49:19would be able to raise that floor for their own citizens, right?
00:49:25In this particular context, it is very difficult to deploy these large global systems in ways
00:49:33that differentiate depending on what state you're in.
00:49:37Okay.
00:49:38I'm sorry.
00:49:39I have to go.
00:49:40Mr. Reid, that's getting back to the patchwork, isn't it?
00:49:42So now you've got to comply with two.
00:49:45And then who's going to decide which is more protective?
00:49:50Absolutely.
00:49:51That's it in a nutshell.
00:49:52I think Mr. Calo just said it's not realistic to have the patchwork.
00:49:58I think his testimony is quite clear.
00:49:59He says you have to have a federal preemption.
00:50:02Yes, it does.
00:50:04I do.
00:50:05I think it does.
00:50:06And I think, Mr. Reid, I think you're- Okay, well, Madam Chair, since there are not
00:50:10dozens of us, but only a couple- We have one of our colleagues who's waiting,
00:50:16Senator Rosen, who actually has coded before.
00:50:18So I feel like we need to defer to her.
00:50:20So Senator Rosen.
00:50:22Well, thank you.
00:50:25You know, writing that computer code, it was a great experience, a great living, and
00:50:31I loved every minute of it.
00:50:32Now it prepares me for a lot of things I do here today.
00:50:35So thank you for holding this hearing.
00:50:37These issues are so important.
00:50:39I really appreciate the witnesses.
00:50:41I'm going to talk first a little bit about AI scams because Americans are generating
00:50:46more data online than ever before.
00:50:49And we know with advances in AI, data can be used in many ways, depending on the algorithms
00:50:55that are written.
00:50:56And of course, bad actors are going to use AI to generate more believable scams using
00:51:02the deep fake, cloning, and all the pictures.
00:51:07You've seen it.
00:51:08Everyone has seen it everywhere.
00:51:09And these particularly target our seniors and they target our veterans.
00:51:13And so I'd like to ask Ms. Kak and Professor Calo, both of you: how can enacting a federal
00:51:21data privacy law better protect individuals from these AI enabled cyber attacks and scams
00:51:27that we know are happening every single day in every community?
00:51:33Thank you.
00:51:34We'll start with you and then we'll go to the professor.
00:51:37Thank you, Senator.
00:51:38It's interesting because when ChatGPT was released and there was all of this excitement
00:51:42about whether we were one step, who knows if it's tomorrow or in 50 years, but one step
00:51:46away from these sort of fantastical Terminator-like scenarios, what we were really more concerned
00:51:51about was whether we had just seen the creation of the most effective spam generator that history
00:51:56had ever seen.
00:51:57And so I think what is at the heart of these issues is that we are creating these technologies,
00:52:01they are moving to market clearly before they're ready for commercial release, and they're
00:52:07sort of unleashing these diffused harms, including, as you mentioned, the risk of
00:52:11sort of exacerbating concerns of deceptive and spam content.
00:52:17To your question on how would a federal data privacy law sort of nip these kinds of problems
00:52:22in the bud, I think it would do so in a very sort of structural sense.
00:52:25It would say that there need to be certain rules of the road, there need to be limits
00:52:29on what companies are able to collect, the ways in which they're training their models,
00:52:34so that these companies are not creating inaccurate sort of misleading AI tools, which
00:52:40are then being integrated into our most sort of sensitive social domains, whether that's
00:52:45banking or healthcare or hiring.
00:52:48So I think the sort of final thing I'll say on the kind of spam generation point is that
00:52:54we have known for the last decade, we have evidence like garbage in, garbage out.
00:52:58So when we see these failures, we should see them not as output failures, but as failures
00:53:03that go back to very crucial data decisions that are made right from the training and
00:53:08development stage of these models, all the way through to the output stage.
00:53:13And so the privacy risks-
00:53:14Thank you.
00:53:15Thank you.
00:53:16I only have a few minutes.
00:53:17So Professor Calo, if you could be really brief, because I want to get to a question
00:53:21about data ownership, who owns your data and who has a right to your data.
00:53:25So if you'd be really brief so I could get that question in, I'd surely appreciate it.
00:53:31I think that we need to empower federal regulators such as the Federal Trade Commission to go
00:53:36after all kinds of abuses that involve artificial intelligence, including such scams.
00:53:43Investing in the FTC and its expertise is the correct call in my opinion.
00:53:48Thank you.
00:53:49I appreciate that because I want to talk about something that everyone talks to me about
00:53:54is data ownership, AI transparency, because Nevadans today, people across this country
00:54:01do not have the right to access, correct, or delete their data.
00:54:05Who owns your data?
00:54:06What do they do with it?
00:54:07How do they use it?
00:54:08It matters to people.
00:54:10And so it's impossible for many to even understand, like I said, who holds their data and what
00:54:15they're going to do with it.
00:54:16So the supply chain of consumer data is full of loopholes, where third-party resellers
00:54:22can just sell your data to the highest bidder.
00:54:25So Mr. Tiwari, can transparent AI systems exist without strong federal privacy regulations,
00:54:32including consumer control over your own personal data?
00:54:38Thank you, Senator.
00:54:39And in short, the answer is no.
00:54:41It is impossible for users to be able to effectively exercise the rights that they have, not only
00:54:46over their own data, but also their social experiences without knowing what companies
00:54:51are collecting, how that data is being used, and more importantly, what rights do they
00:54:55have if that harm is occurring in the real world.
00:54:58We've already in this hearing so far discussed various examples of harms that we have seen
00:55:03occur in the real world, but yet the actions that regulators and governments have been
00:55:07able to take to limit some of those harms have been constrained by the lack of effective
00:55:12and comprehensive federal privacy legislation.
00:55:16Thank you.
00:55:17I appreciate it.
00:55:18Madam Chair, I'm going to yield back with eight seconds to go.
00:55:21Thank you.
00:55:22Senator Blackburn.
00:55:23Thank you so much, Madam Chairman.
00:55:25I am so delighted we're doing this hearing today.
00:55:30And as we have talked so many times when I was in the House, and our Senate colleague
00:55:36Peter Welch was in the House, we introduced the first legislation to make
00:55:43businesses take steps to protect the security of our data, to require data breach notifications,
00:55:52and allow the FTC and the State Attorneys General to hold companies accountable for
00:56:00violations.
00:56:01And as Senator Rosen just said, it is so vitally important to know who owns the virtual you,
00:56:10who has that, and what are they doing with it.
00:56:14And now as we're looking at AI, I think federal privacy legislation is more important than
00:56:20ever because you've got to put that foundational legislation in place in order to be able to
00:56:27legislate to a privacy standard.
00:56:32And that is why we're working so carefully on two pieces of legislation.
00:56:38The NO FAKES Act that Senators Coons, Klobuchar, Tillis, and I are working on to protect the
00:56:46voice and visual likeness of individuals from unauthorized use by generative AI.
00:56:54And then, Madam Chairman, you've mentioned the COPIED Act that you and I are working
00:57:00on, which would require consent to use material with content provenance to train AI systems.
00:57:08And I want to come to each of you, and let's just go down the dais as we're looking at
00:57:16this.
00:57:17We're talking about ways that I'd like to hear from you all very quickly, ways that
00:57:24Congress is going to be limited in legislating if we do not have a privacy standard.
00:57:32And then how do we find the balance between that privacy and data security component so
00:57:41that people know they have the firewalls to protect their information and keep it from
00:57:48being used by open source and large language models.
00:57:53So let's just run down the dais on this.
00:57:59Thank you, Senator.
00:58:00I was really struck in my research for this hearing by the Pew Research Center's findings.
00:58:07Their surveys suggest that overwhelming percentages of Americans are worried that their data is
00:58:15out there and is going to be used in ways that are concerning and surprising to them.
00:58:21I think without passing laws that, per Senator Cantwell's remarks, define sensitive information
00:58:33to cover not merely already sensitive information like health status, but cover the inferences
00:58:42that can be made on top of that data, Americans are not going to feel comfortable and safe.
00:58:48I think that security and privacy go hand in hand.
00:58:52We could sit here and talk about the myriad horrible data breaches that have been occurring
00:58:57across our country.
00:59:00We can talk about ransomware and the way that hospital systems are being shut down.
00:59:05But ultimately, you know, it all boils down to the fact that the American consumer is
00:59:12vulnerable and needs its government to step in and set some clear rules.
00:59:19Thank you.
00:59:21Thank you, Senator.
00:59:22I'll say that the sort of incentives for irresponsible data surveillance have existed for the last
00:59:28decade.
00:59:29What AI does is it pours gasoline on these incentives.
00:59:31So if anything, we have a situation where all of our known privacy and security harms
00:59:36and risks are getting exacerbated.
00:59:38To the second part of your question, which is what is the connection between privacy
00:59:42and security, I would say those are sort of two sides of the same coin.
00:59:45So data never collected is data that is never at risk.
00:59:49And data that is deleted after it is no longer needed is also data that is no longer at risk.
00:59:54So I think having a strong data minimization mandate that puts what we would consider very
00:59:59basic data hygiene in place is absolutely essential, especially as you're seeing more
01:00:05kind of concerning bad actors use this information in nefarious ways, some of which the legislation
01:00:11you mentioned is going to clamp down on.
01:00:15Thank you.
01:00:16Thank you, Senator.
01:00:17Mozilla is an organization that has hundreds of millions of people that use its products
01:00:22because of its privacy properties.
01:00:24Without providing a consistent standard that allows American companies to compete globally,
01:00:30just like they do on innovation, but also on privacy, the Congress will be unable to
01:00:34ensure that Americans not only get the privacy rights they deserve, but also that American
01:00:38companies can have a high baseline standard with which they can compete with organizations
01:00:43around the world.
01:00:44Thank you.
01:00:45Thank you.
01:00:46I understand I'm over time here, but very quickly, privacy laws should have a data security
01:00:51provision.
01:00:52It's one of our four Ps of privacy.
01:00:53I think data hygiene is absolutely critical, but it's different than a prohibition on processing.
01:00:59So let's be careful on how we use the term data hygiene rather than no collection at
01:01:03all.
01:01:04So let's be smart about how we do that.
01:01:05Thank you.
01:01:07Thank you, Madam Chairman.
01:01:09Thank you.
01:00:54And thank you for your leadership on the COPIED Act.
01:01:12I think your understanding of creators, artists, and musicians and being able to stick up for
01:01:17them has been really critical.
01:01:18So thank you.
01:01:21Senator Hickenlooper.
01:01:22Thank you, Madam Chair.
01:01:27State privacy laws and federal proposals agree that consumers should have control over
01:01:32their personal data, including the right to have their data deleted, as has
01:01:38been pointed out.
01:01:40However, consumers really don't have the time or the expertise to effectively
01:01:46manage all their data online; filling out forms, cookie notices, and
01:01:52lengthy privacy agreements become more of a distraction.
01:01:58So, Mr. Calo, let's start with you.
01:02:01The American Privacy Rights Act proposes, A, as has been discussed, minimizing personal
01:02:06data collection, but in addition, offering consumer-facing controls like data deletion
01:02:12requests.
01:02:13And how do these two approaches work in tandem to protect consumers, rather than either one alone?
01:02:23That's a great question.
01:02:25In other contexts where we're trying to protect consumers, we do give them information and
01:02:30choices.
01:02:31And, you know, we know that privacy preferences are not completely homogeneous.
01:02:36However, in other contexts, not only do we give people choice, for example, how much
01:02:42fat content do they want in their milk, but we also place substantive limits, like there
01:02:47can't be so much arsenic, you know.
01:02:49And so I think that needs to be a combination of both.
01:02:52People should have control over their data, and they should be asked before that data
01:02:56is used in a separate context, like to set their insurance premiums.
01:03:01But there also have to be baseline rules, because as you point out, consumers do not
01:03:06have the time or the wherewithal to police the market and protect themselves on their
01:03:10own.
01:03:11Right.
01:03:12Good answer.
01:03:13And, Ms. Kak, you can opine on that as well, but I've got another question.
01:03:16So let me start with the question.
01:03:20You guys have discussed a little bit the advances in generative and traditional AI and how those
01:03:28advances really are fueled by data now.
01:03:31I mean, we recognize that, but training AI systems can't be, it really can't be at the
01:03:40expense of people's privacy.
01:03:43And reducing the amount of personal, sensitive data, as you have all discussed the notion
01:03:47of minimization, does really reduce the likelihood that data privacy and data security harm could
01:03:57happen.
01:03:58And Senator Blackburn, among all the other bills she listed, these issues are covered
01:04:07extensively in various hearings we've had on our subcommittee that she and I chair,
01:04:11consumer protection, product safety, and data security.
01:04:14So, Ms. Kak, I wanted to say, how would you quantify, how often, or what types of
01:04:20quantification do you say when you say how often is the data of consumers unnecessarily
01:04:29exposed within the AI model, and do you believe that the strict data minimization requirements
01:04:37can significantly help control this, let's call it data leakage?
01:04:43Senator, there are two parts, two ways in which I'll answer that question.
01:04:49The first is to say we don't know, and that's sort of part of why we're here today, which
01:04:52is that we don't have basic transparency about whether our data is being used, how it's being
01:04:58protected.
01:04:59And what we do know is that companies like Meta and Google are sort of at will changing
01:05:03their terms of service to say, you know, heads up, we're now using your data for AI,
01:05:09at the same time as we're seeing these chatbots routinely leak the personal information they
01:05:14were trained on.
01:05:15But even in the absence of clear data from these companies and a regulatory
01:05:22mandate that allows us to ask them for it, I think what we can already see just from
01:05:27the most obvious lapses is that this irresponsible data collection and use is happening everywhere;
01:05:32it's happening all the time.
01:05:34And I think one of the ways in which the U.S. might benefit from being somewhat of a laggard
01:05:38when it comes to data privacy law is to look at what hasn't worked elsewhere.
01:05:42What hasn't worked is a consent-only based regime, that's why we need accountability
01:05:47in addition to consent.
01:05:48What has worked is that the Brazilian data protection regulator recently banned Meta
01:05:53from using user data to train their AI because they found that there were children's images
01:05:58in the training data and it was being leaked on the other side, right?
01:06:01So there's a lot to learn and there's a lot of, I think, a foundation from which to act from.
01:06:06Great.
01:06:07And Mr. Tiwary, just quickly, the General Data Protection Regulation, the GDPR of the
01:06:15European Union, it's been in effect since 2018.
01:06:22Since then, it's been amended.
01:06:24They're still sorting it out, I think, in terms of governance structure.
01:06:29Without a U.S. data privacy law, how can we have some leadership on the global
01:06:34stage on these issues?
01:06:37Thank you, Senator.
01:06:38By most accounts, there are currently at least 140 countries around the world that have a
01:06:43national federal privacy law.
01:06:45By not having a privacy law, the United States of America is not allowing its companies to
01:06:51be able to effectively compete with the companies from these countries.
01:06:55And that's because privacy is now a competitive differentiator.
01:07:00Like I mentioned earlier, people use the Firefox product because they believe in the privacy
01:07:03properties of the Firefox product.
01:07:06And without a baseline of such standards, the small and medium companies that we've
01:07:10been talking about will be unable to compete with other small and medium companies around
01:07:14the world.
01:07:15Great.
01:07:16Thank you very much.
01:07:17I yield back to the chair.
01:07:18Thank you.
01:07:19Senator Moran.
01:07:20Chairwoman, thank you very much and thanks to our panelists.
01:07:23Very important hearing.
01:07:24Thank you for holding it.
01:07:26The passage of federal data privacy legislation is long overdue.
01:07:30I chaired the Subcommittee on Data Privacy that Senator Hickenlooper now chairs and Senator
01:07:35Blackburn is a ranking member.
01:07:37Senator Blumenthal was the ranking member.
01:07:39We have come so close so many times, but never just quite across the finish line.
01:07:45And the problems and challenges with our lack of success continue to mount.
01:07:51I don't know that the issues get more difficult to resolve, but we still haven't found the
01:07:57will to overcome the differences of the things that each of us think is pretty important.
01:08:03I reintroduced my Comprehensive Consumer Data Privacy and Security Act again in this Congress.
01:08:10Gives Americans control over their own data, establishes a single, clear federal standard
01:08:13for data privacy, and provides for robust enforcement of data privacy protections that
01:08:17does not lead to frivolous legal actions that would harm small business.
01:08:21I think these are common sense policies, and again, I think there's a path forward utilizing
01:08:25those.
01:08:26And I would, again, say in this hearing, as I've said numerous times over the years, I'm
01:08:30willing to work with any and all to try to find that path forward.
01:08:35I have a few questions for the panel.
01:08:37I think I start with you, Mr. Reed.
01:08:41Important that data privacy requirements, in my view, established by federal law are
01:08:46shared by consumer-facing entities, service providers, and third parties, all of which
01:08:51may collect or process consumer data.
01:08:55Exempting segments of this value chain from requirements or enforcement under the data
01:09:00privacy law, I think, places an unfair burden on consumer-facing entities, including particularly
01:09:06small business.
01:09:07Mr. Reed, is that something you agree with when it comes to data privacy?
01:09:11Unfair burden should be shared across each entity that collects or processes data?
01:09:17I do, but I want to be careful about one thing.
01:09:19I don't want to, I know this sounds weird coming from the business side, so to speak,
01:09:24but I want to be careful that we don't say that shared responsibility becomes everybody
01:09:27touching their nose and saying, I'm not it.
01:09:30So I think the point at which you give your data over, the point at which you have that
01:09:33first contact, whether it's my members through an application or through a store that you
01:09:38visit, is the most logical point for the consumer to begin to say, hey, I want my
01:09:42data to be retrieved, or I don't want my data used in this way.
01:09:45So I think, yes, there is shared responsibility across the chain, but I don't want a situation
01:09:50where the front person who interacts with the customer can then say, hey, that's down
01:09:55the food chain three third parties from there.
01:09:57And so I think that's important.
01:09:58You avoid that by?
01:10:00Well, you avoid that by actually having a clear and concrete conversation, so to speak,
01:10:04with the customer when they provide the data.
01:10:07Here's what you're getting in exchange.
01:10:08Here's what the rules of the road are.
01:10:09And that's why a federal comprehensive privacy bill, and we appreciate the bill that you've
01:10:13worked on already on data, moves us in a direction of having consumers nationally have an understanding
01:10:21of what their rights and what our responsibilities are with their data.
01:10:25So absolutely.
01:10:26But it has to start with an understanding at the first place and then the shared responsibility
01:10:30throughout the chain.
01:10:31One more for you, Mr. Reed.
01:10:32Nineteen states, I think, is the number that have passed their own data privacy laws.
01:10:36It's a patchwork of state data privacy laws, increasing compliance costs.
01:10:41One estimate projects that compliance costs for business could reach $239 billion annually
01:10:46if Congress does not implement a data privacy law.
01:10:50Incidentally, I came to Congress believing and still believe that government closest
01:10:54to home is better than government far away.
01:10:56But it seems like I spend a bunch of my time trying to figure out how we take care of a
01:11:00complex situation with 50 different states and 50 different standards.
01:11:04Kansas doesn't have a data privacy law, but borders two states that do.
01:11:09Would you describe the challenges for small business associated with working in states
01:11:13with different data privacy standards?
01:11:15Well, your state is one of the ones in particular that we find most interesting because you
01:11:21literally have businesses in your state where a large number of their customers are
01:11:25actually from the bordering state because you have that crossroads that exist.
01:11:29So for Kansas and Oklahoma, you have customers across the border literally every day.
01:11:34So having a mixture of these privacy bills, now to be clear, a lot of states have some
01:11:40sort of small business carve out, but it's different and the definitions are different
01:11:44in every state.
01:11:45So any business that participates is going to have to figure out who's going to tell
01:11:49them what they do.
01:11:50And if that customer walks in and says, my zip code is this, oh, sorry, I have to treat
01:11:54your data this way.
01:11:55If it's this, then I have to do it another way.
01:11:57It is soul-crushing for a small business to have to interview each purchaser and
01:12:04ask them what county they live in or what state they live in.
01:12:07So it's a huge burden on small business.
01:12:09And unfortunately, it's one that's really a lot easier for a multinational company to
01:12:13handle.
01:12:14Let me quickly ask Mr. Tiwary and Ms. Kak, it's important for Americans to understand
01:12:22when their data is collected and processed by companies.
01:12:25This belief is reflected in our data privacy legislation, which requires covered entities
01:12:29to publish their privacy policies in easy to understand language and to provide easy
01:12:34to use means to exercise their right to control over data.
01:13:39How can a federal policy ensure consumers are aware of how their data is being used, even
01:12:45as AI potentially increases the complexity of how consumer data is processed?
01:12:49Thank you, Senator.
01:12:51It is essential for us to recognize that purely relying on consent has proven to be
01:12:56ineffective to protect the privacy of the average consumer.
01:13:01Technology has now become such a complex endeavor that to expect an average consumer to understand
01:13:06everything that can happen when their data is collected is a burden that they will never
01:13:10be able to meaningfully fulfill.
01:13:12And therefore, any effective federal privacy legislation must include accountability measures
01:13:18that also place obligations upon these entities for what they cannot do regardless of whether
01:13:23the consumer has consented to that behavior or not.
01:13:27Only then can consumers be sure that the government is working effectively to protect their privacy
01:13:33rather than hoping that they understand everything that may happen to their data.
01:13:36It's a sad state of affairs, actually.
01:13:38We can't understand it well enough to protect ourselves.
01:13:42Anything to add?
01:13:43Yeah, the only thing I would add, Senator, is sort of what do we do as consumers with
01:13:46that information?
01:13:47And that's really why transparency isn't enough and why proposals like the bipartisan
01:13:52ADPPA and the APRA are so important because they're putting down sort of rules of the
01:13:57road that apply regardless of consent and what consumers choose.
01:14:02Thank you both.
01:14:04Yes, I think Senator...
01:14:13Senator Budd.
01:14:14Thank you, Chairman.
01:14:16So, thank you all for being here.
01:14:20Technological leadership is foundational to American dominance in the 21st century.
01:14:24You know, in this country, we innovate great products and services that consumers want,
01:14:30which increase productivity and it sharpens competition.
01:14:33This spurs a positive cycle of further innovation and, you know, the Internet's a perfect example
01:14:37of that.
01:14:38And one of the factors that differentiates America, it's a regulatory environment that
01:14:43protects public interest and safety, while at the same time giving talented entrepreneurs
01:14:48and specialists the space to try new things.
01:14:52I think that AI should follow this tried and true path.
01:14:55But Mr. Reed, thanks again to all of you for being here.
01:14:58Mr. Reed, thank you for your opening example a few moments back in your opening comments
01:15:02about hog farms.
01:15:05Being from North Carolina, I really appreciate that.
01:15:08And it makes me only wish I'd had a little more bacon for breakfast.
01:15:12Always a good call.
01:15:13Right.
01:15:14Well, you know, in your testimony, you talked about the different ways that App Association
01:15:18members are creating and deploying AI tools.
01:15:21In fact, you said that 75% of surveyed members report using generative AI.
01:15:26Do you believe that there is currently a healthy amount of competition in this space?
01:15:31Well, I think what's been most amazing is the level of competition against bigger players
01:15:37is profound.
01:15:39The news is always covering Microsoft's moves and Meta's moves and other players' moves.
01:15:46But I'm telling you right now that we're seeing more moves by small and medium-sized companies
01:15:49to use AI in important ways.
01:15:52And one quick and critical example, if you've got a small construction business in a town,
01:15:58right now to bid on an RFP that gets let, it's a lot of nights of coffee and desperate
01:16:02typing on your computer to try to get one RFP out.
01:16:05But if I can look at all of your RFPs with AI, look at what you do best, what you claim
01:16:10you do best, and help you write that RFP so that instead of filing one, you file ten.
01:16:16Maybe you win two bids.
01:16:17Now you hire three people.
01:16:19I used your own data.
01:16:21A lot of my panelists are talking about using consumer data and sucking in consumer data.
01:16:26What my members are doing with a lot of this is actually using your private data stores
01:16:30to look at what you're doing well and help you improve upon it.
01:16:34And I think when we talk about AI and privacy, we have to remember that the ability to do
01:16:39something like that, simply help you write ten RFPs, is a huge advantage that AI provides.
01:16:45And it's something, frankly, small businesses are doing better than the large businesses
01:16:48right now.
01:16:49I appreciate that example, especially the advantages for small businesses.
01:16:53I read about four newsletters a day on this very topic, so I'm fascinated with it.
01:17:00The question, Mr. Reed, is about the FTC's antitrust policy.
01:17:05It seems to be focused against vertical integration, and we think it may have a chilling effect
01:17:10on your members' ability to develop and roll out new and better AI services.
01:17:14Even with some of the bigs we see, like with OpenAI, of course, people read news about
01:17:18that every day, but Microsoft and Apple not even having a board presence there.
01:17:23I don't know if that's fear against antitrust legislation or what, but we see that there's
01:17:28potentially a chilling effect.
01:17:29Do you have any thoughts on that?
01:17:31Yes.
01:17:32Unfortunately, the Federal Trade Commission's proposed rule on Hart-Scott-Rodino is terrible.
01:17:38I probably should use better words here in the committee, but it really puts small businesses
01:17:43at a huge disadvantage because it essentially establishes a floor for the potential for
01:17:49an acquisition.
01:17:51And what that does for small AI companies that are trying to figure out how to make
01:17:54their way in the world is to seek venture capital.
01:17:57Venture capital, or even your parents, they have to believe in your dream.
01:18:01And part of that dream is that you can have a huge success.
01:18:05When venture capitalists are putting money in, they're looking at a portfolio, and they
01:18:09know nine out of ten are going to fail.
01:18:11But if the FTC is essentially saying that they're capping at $119 million, anybody's
01:18:16success level, then that tells the venture capitalists to change the nine companies they're
01:18:20investing in because they can't dream bigger than $119 million.
01:18:24We also think that the Hart-Scott-Rodino proposal, as put forth, violates the Regulatory
01:18:30Flexibility Act because they actually didn't take into consideration the impact on small and
01:18:34medium-sized businesses.
01:18:35So, yes, it has a huge impact on competition for small, new AI startups, and we're incredibly
01:18:41disappointed and we look forward to seeing if we can convince the FTC to do the right
01:18:45thing and change their way.
01:18:46I appreciate your thoughts on that.
01:18:49Back to your comments on small business, I don't know if that weighs into your answer
01:18:52on the next question, Mr. Reed.
01:18:54But how should this committee weigh the need for firms to be able to use responsibly collected
01:19:00consumer data with the serious concerns that consumers have about the fact that their sensitive
01:19:06data may be breached or improperly used?
01:19:09How should we look at that as a committee?
01:19:10Well, as a committee, the first thing to do, which you've heard from everyone about two
01:19:14dozen times, is pass comprehensive privacy reform in a bipartisan way because that gives
01:19:18businesses the rules of the road on how to behave with consumer data.
01:19:22And figuring out how we balance data hygiene, data minimization, and all those questions
01:19:27are going to be part of the hard work that the Senate has to do on this topic.
01:19:31But overall, I would anchor it in the concept of harms.
01:19:34What harm is being done, what's the demonstrable harm, and how do you use the existing law
01:19:39enforcement mechanisms to go after those specific harms?
01:19:43I think it's a lot easier to empower existing law enforcement action than it is to try to
01:19:49create a new one out of whole cloth and hope it works.
01:19:52I appreciate your thoughts.
01:19:54Thank the panel for being here.
01:19:55Chairwoman, I yield back.
01:19:57Thank you so much.
01:19:58Senator Klobuchar.
01:19:59Thank you very much.
01:20:01Appropriate that we're doing this remotely for a tech hearing.
01:20:05I wanted to, first of all, talk about the fact that AI, and I've said this many times,
01:20:11quoted David Brooks.
01:20:13He says he has a hard time writing about it because he doesn't know if it's going to take
01:20:16us to heaven or hell.
01:20:18I think there are so many potential incredible benefits coming from the state of the Mayo
01:20:22Clinic that we're going to see here, but we have to put some guardrails in place if we're
01:20:27going to realize those benefits and not have them overwhelmed by potential harm.
01:20:32I know a lot of the companies involved in this agree.
01:20:36We do have to remember when it comes to privacy, these are no longer just little companies
01:20:40in a garage that resisted any rules for too long.
01:20:44Whether it's competition policy, children's privacy, that we need to put some guardrails
01:20:51in place and it will be better for everyone.
01:20:55I want to start, Professor Calo, your testimony discussed how companies can collect or buy
01:21:01vast amounts of a person's non-sensitive information from what's in their shopping cart to their
01:21:08posts on social media and use AI to process all that data and make sophisticated inferences
01:21:14about their private health information, such as pregnancy status, whatever, with alarming
01:21:20accuracy.
01:21:21Can you speak to how data minimization can ease the burden on consumers trying to protect
01:21:27their privacy?
01:21:31Thank you, Senator.
01:21:32Yes, it's a critical issue.
01:21:34So many privacy laws, almost all of them, differentiate between sensitive categories
01:21:42of information, such as health status, and less sensitive information, even public information.
01:21:50But the problem with these AI systems is that they're extremely good at recognizing patterns
01:21:57in large data sets, and so they can infer sensitive things.
01:22:03And so how would data minimization help?
01:22:05Well, data minimization would, of course, restrict the overall amount of information
01:22:11and the categories for which it could be used.
01:22:14I think the most effective tool is to define categories of sensitive information to include
01:22:20not just sensitive information itself that's collected from the consumer as sensitive or
01:22:25observed somewhere, but also those sensitive inferences that are derived from AI.
01:22:32I think that's the clearest way.
01:22:34That way you know, as a corporation, please.
01:22:38Very good.
01:22:39Good answer.
01:22:41Mr. Tiwary, any comments on that?
01:22:44We know that demand for data is at an all-time high. Will comprehensive privacy
01:22:50legislation perhaps shape what developers and deployers do, pushing them to adopt more privacy-preserving
01:22:59systems as they develop things, which they're already doing?
01:23:03Just quickly.
01:23:04Thank you, Senator.
01:23:05Very quickly: Mozilla very recently acquired an organization called Anonym.
01:23:10And what Anonym does is it takes privacy-preserving technologies and within trusted execution
01:23:15environments performs the operations that would take place in large distributed systems
01:23:21in a way that nobody can see the data, not the entity that's giving the data.