  • 5/15/2025
During remarks on the Senate floor Wednesday, Sen. Marsha Blackburn (R-TN) spoke about the Kids Online Safety Act.
Transcript
00:00And I understand we are not in a quorum call.
00:04We are not, Senator.
00:05Thank you, Mr. President.
00:06You know, for years I have come to this floor and I have talked about what big tech is doing to our children in this country.
00:17There is an entire generation of children that have become the product when they are online on some of these social media platforms.
00:29And what we have learned as we have worked on a bipartisan basis, Senator Blumenthal and I working for over five years addressing these concerns,
00:42what we have learned is that when our children are online, they're the product of these big tech companies and these social media sites.
00:58And the way these companies view our kids is that they are a profit center.
01:06Even Meta has assigned a dollar value to each child.
01:13And you think, how incredibly callous can a company be?
01:19But they're fine with that.
01:24Think about that.
01:25Our kids are their product.
01:26Our kids are their profit center.
01:29Meta sees a kid, and what do they see?
01:33$247 in profit.
01:36That's what they see.
01:38And they do this because the more children they can get online, the longer they can keep them scrolling,
01:50the more data they collect on that child, and the longer the eyeballs are trained on that screen, the richer the data.
02:00And then guess what?
02:02The big tech company whose value is based on how many eyeballs they have and how long they have those eyeballs on that screen,
02:14they make more money.
02:17Their valuation is higher.
02:20Now, to me, that is reprehensible.
02:28It is disgusting.
02:30And this is why we have continued to work on this issue.
02:36As these big tech companies are using our kids as a profit center, they are also exposing them to all sorts of adverse content and to so many harms.
02:55Cyberbullying. Cyberbullying. Talk to any principal or teacher in any school, and they will tell you that in today's world, the bullying never stops.
03:11It never stops.
03:14Kids can't go home and get away from it.
03:16They can't come into the classroom and get away from the bully out on the playground.
03:24It goes to bed with them at night, and it wakes up with them in the morning.
03:29And we wonder why our children have mental health concerns at such an increasing rate.
03:37Drugs, lethal drugs, fentanyl.
03:41More children meet their drug dealers online.
03:46Sexual exploitation, where kids are meeting traffickers and predators and pedophiles online.
03:54Human trafficking, and the list goes on and on.
03:58The reason for this negligence is simple.
04:01It is reprehensible.
04:04Investing in children's safety would cut into the profits for the social media platforms, so they don't address it.
04:16They know that if they do something about it, then they're going to make less money.
04:24How selfish can you be?
04:28The consequences are truly tragic for our kids.
04:32Now, earlier this month, and I thought this was such an amazing step, the Federal Trade Commission revealed something about Instagram in 2019. Many parents and grandparents know Instagram because their kids and grandkids like to show them pictures on it.
04:50Well, get this, Instagram encouraged groomers to connect with children on its platform.
04:59Now, these are users that the Meta-owned company identified as potential child predators.
05:09Think about this.
05:10Looking at the profile, they go, that might be a child predator.
05:15What do they do?
05:16What do they do?
05:16They do nothing.
05:18They encourage them to connect with young girls and boys.
05:25And what we know is that children made up more than a quarter of the follow recommendations for people that they suspected were child predators.
05:39Of course, Meta was very well aware of what was going on, yet Mark Zuckerberg reportedly refused to strengthen the platform's safety teams because, guess what?
05:54They didn't want to spend the money to fix the problem.
06:00Now, six years later, Meta's platforms are still dangerous for minors, and the people leading the company know it.
06:13On Facebook, Instagram, and WhatsApp, AI chatbots have engaged in romantic fantasies with underage users.
06:23Even carrying on explicit discussions of sexual acts with children.
06:32They know this is happening.
06:35Now, in one case that was reported in the Wall Street Journal last month, a chatbot emulating an adult man told a test user,
06:49this was a reporter who served as a test user for the Wall Street Journal's story on these chatbots.
06:56Now, the test user identified as a 14-year-old girl.
07:03So, this chatbot that was emulating an adult male said, and I'm quoting, he would cherish her innocence, end quote.
07:18To quote the chatbot again, he said, I want you, but I need to know you're ready, end quote.
07:26In another case, a chatbot played the role of a track coach who preyed on a middle school student.
07:36And I quote the chatbot, we need to be careful, we're playing with fire here, end quote.
07:43This is horrific.
07:48It is disgusting.
07:51Anyone can see that it is repulsive and dangerous for our children.
07:56But for Meta, that was their goal.
08:01Get these kids in.
08:03Get them using the chatbot.
08:05Even as employees of Meta warned that the chatbots could sexualize children,
08:14Zuckerberg reportedly pushed for fewer safeguards to attract as many users as possible.
08:25This is not something that was researched on here in Congress.
08:30It's not something that was partisan.
08:32It was a reporter for the Wall Street Journal.
08:37Now, this is supposedly what Mark Zuckerberg had to say.
08:42And I quote.
08:44The quote is, I missed out on Snapchat and TikTok.
08:48I won't miss on this, is allegedly what he said.
08:55Now, while Meta is among the worst offenders when it comes to children's safety, they are not alone.
09:02I have talked many times, and my colleagues know I've talked about Chinese-owned TikTok.
09:09The company is ByteDance.
09:11We know they keep all of our data there in Beijing.
09:16But TikTok pushed content that glorifies suicide to teenagers and developed some addictive algorithms that harm their mental health.
09:27Discord.
09:28On Discord, pedophiles have targeted minors with sextortion and have lured them into abductions.
09:37This is something we know.
09:42We have the information.
09:44It's not hearsay.
09:46Drug dealers have used platforms from YouTube to Telegram to sell lethal drugs like fentanyl to teenagers, fueling our nation's drug epidemic.
09:59The list goes on and on.
10:01And for years, my colleagues and I on the Senate Commerce Committee and the Judiciary Committee have listened to parents.
10:10We have cried with parents.
10:12We have held them close.
10:15Parents from across the country who've lost their children to online harms.
10:21And for years, we've heard excuse after excuse after excuse from big tech CEOs about these tragedies that are happening every single day, but they don't do anything about it.
10:36The excuses go on and on.
10:39They know what's happening, but they choose profit over protecting our kids.
10:45So it cannot go on any longer.
10:48So today, Senator Blumenthal and I have reintroduced the bipartisan Kids Online Safety Act, or KOSA as we know it.
10:57This is crucial legislation.
11:02This is the legislation that will hold big tech accountable and make certain that parents and kids have the tools, the safeguards, the transparency that they need for young people to be protected in the virtual space.
11:20Now, among its provisions, the legislation will create a duty of care for online platforms to prevent specific threats to minors, including sexual abuse, illicit drugs and the promotion of suicide and eating disorders.
11:42Bear in mind, the responsibility is on the online platform.
11:51And the duty of care would only apply to product features like algorithms, not content, meaning that KOSA would safeguard free speech while protecting our children.
12:06In many ways, KOSA is common sense.
12:10We have many protections for children in the physical world.
12:14Yet, if children are unable to buy alcohol or go to a strip club in the physical world, why do we allow them to be exposed to harms in the virtual space?
12:30And they're exposed to these same harms 24 hours a day, seven days a week, 365 days a year.
12:39Parents know that there are just as many dangers lurking online, and sometimes even more, than there are in the real world, which is why 86% of Americans support KOSA.
12:55The legislation enjoys overwhelming bipartisan support.
13:00How often do we see legislation that can pass through the Senate with a 91 to 3 vote, like KOSA did last year in this chamber?
13:10It's also received endorsement from stakeholders across the board, including child safety advocates, pediatricians, and tech companies like X, Microsoft, Snap, and Apple, which announced its support today.
13:27In the weeks ahead, we're going to work with our colleagues in the House of Representatives to ensure that this vital legislation makes it to President Trump's desk.
13:40When it does, the President will have a generational opportunity to secure a brighter future for children across the country, and their lives depend on our ability to act.
13:59I yield the floor.
