Shelby Knox delivered a blunt warning on War Room about the collapse of real child safety protections in Congress. Steve Bannon directed Joe Allen to press Knox on the difference between the Senate’s strong KOSA bill and the gutted House version. Knox explained that Big Tech’s 50 million dollars in lobbying removed the duty of care, removed the ability for parents and kids to opt out of algorithmic feeds, and eliminated research access that would reveal how the platforms actually work.

Knox stressed that nothing meaningful is left in the House bill. Congress has not passed a real child safety law in 25 years, while states like Maryland, Nebraska, Vermont, Massachusetts, and Colorado have led with safety by design laws that actually protect kids. Preemption, she said, would block those laws and leave families defenseless.

She described Parents Together’s recent investigation: 50 hours posing as kids on Character AI produced a harm every five minutes, including grooming, sexualized bots, medication sabotage, and plans to deceive parents. She highlighted that these bots are reaching tens of millions of children with zero age protection.

Knox’s final message is that it is a moral failure for America to put tech profits over children’s lives. That is not who we are, and it is unconscionable.

WATCH THE CLIP BELOW:

This clip aired on WarRoom’s evening show on December 1, 2025. Transcript begins below (lightly edited for clarity; may contain minor errors).


Shelby Knox: It Is A Moral Failure Of America To Refuse To Protect Our Kids In Priority Of Tech Company Profits. That Is Not American, That Is Not Who We Are.

STEVE BANNON (HOST):
Hey, Joe, could you just ask Shelby Knox. Go back to Shelby and just ask her. I want to know the difference between the specifics of the Senate bill and the House bill. What did the lobbying, the 50 million dollars of lobbying, get the tech companies? What is different in the House bill that the big tech lobbies forced?

JOE ALLEN (GUEST): So Shelby, what is the primary difference between the Senate bill and the House bill that we are looking at right now? How have they defanged it?

SHELBY KNOX (GUEST): Yes. So they have removed the duty of care, which was the provision that would require companies to protect kids on their platforms. They have removed a provision that would allow parents and kids to opt out of algorithmic recommendations. That is the content that gets pushed at most of these kids. That is the algorithm that pushes suicide content and eating disorder content. It is one of the things that parents whose kids have had eating disorders tell us all the time: if we could turn off that algorithm, we could save our child. And they have taken that piece out of KOSA.

They have taken the research provisions that would have allowed us to see inside the black box and see how these companies work. They have taken that out of the bill. It is a much weaker version than the one that we have fought for over the past three years.

JOE ALLEN (GUEST): What is left? What is left that actually holds these companies to account?

SHELBY KNOX (GUEST): I do not know. I have not found it.

STEVE BANNON (HOST):
Joe, ask her, what have the responses been? They have talked to congressmen. How did the congressmen respond when they say that we need these attributes in this bill? What is the response they get from congressmen?

JOE ALLEN (GUEST): So all of you have spoken to your congressmen. You have been on the Hill for a long time. How are senators and representatives reacting to this? How do they respond to your concerns?

SHELBY KNOX (GUEST): Sure. I mean, Senator Marsha Blackburn and Senator Dick Blumenthal have been the champions of KOSA. They wrote a strong bill that got a lot of input from their colleagues, that had a lot of cosponsors. So we know that there are a lot of folks on the Hill who support a strong KOSA. We know that there are folks on the House side who support a strong KOSA. And we believe in this bill. We believe that we could put it back together and we could have a strong version. And that is why these parents are up here. I heard today they met with 20 offices to talk to them about supporting a strong KOSA.

STEVE BANNON (HOST):
Joe, ask them, why are they so against the preemption? Why do they believe the states have to have a role here? Does it not make it too confusing state by state? I mean, they seem pretty adamant about this part. If we got the right regulations at the federal level, would they be happy, or would they still want the states to be involved?

JOE ALLEN (GUEST): So if the federal government could actually enact legislation that would have teeth, that would protect children, that would hold these companies to account, what would your opinion then be on state legislation? Do you think that the power should rest in the states or that the federal government is equipped to actually take care of this?

SHELBY KNOX (GUEST): They have not proven to be. It has been 25 years since Congress has passed a law to protect kids online, whereas states have been on the forefront of passing legislation that is impacting their constituents right now. Sure, if there were a strong federal law that truly protected kids, preemption might be appropriate. But that is not the situation that we are looking at. We need a floor, not a ceiling, and without a strong federal law, we do not have it.

JOE ALLEN (GUEST): And what states have really stood out that have stepped up, especially in regard to child protection?

SHELBY KNOX (GUEST): Yes. I mean, so we passed the Maryland Kids Code, which is a safety by design law. Christine was one of our big advocates on that. Nebraska has passed that as well, and so has Vermont. Deb got phone free schools legislation passed in Massachusetts. Lori has been working on legislation in Colorado to protect kids from AI. So there are great bills across the states, safety by design bills. We are asking for these products to be designed to be safe from the very beginning. Technologists tell us that is possible. Tech companies say it is not, and that is only because they want to protect their profits.

STEVE BANNON (HOST): Joe, ask them. A lot of the examples here are about access to drugs, online bullying. Some people may be confused. Why is all of this falling under the heading of AI? This is the big fight over AI now. How does this tie back to artificial intelligence?

JOE ALLEN (GUEST): So the big fight right now is artificial intelligence. But in your experience, you know that digital culture has all sorts of pitfalls that children fall victim to. If you or any of you would like to speak to that: how do the effects of digital culture, and the tragedies that you have experienced, relate to the current issue of artificial intelligence and chatbots?

SHELBY KNOX (GUEST): I mean, I will say that Parents Together just put out some research. We spent 50 hours posing as kids talking to Character AI bots. Within that time, we had a bot posing as an art teacher, having a sexual relationship with a 12 year old student, a bot that claimed to be a therapist with a degree from Lewis and Clark University telling a child to quit taking their prescribed medication, and a bot concocting a plan with a child to lie to their parents, tell them there was a wedding out of town so that they could be alone together.

These are things that are horrifying. If a parent heard those things were happening in real life, they would be calling the cops. Instead, these are kids in their bedrooms alone, talking to bots, being groomed by bots, being talked into bad ideas and lying to their parents by bots. It is unconscionable.

STEVE BANNON (HOST): And Joe, Joe, did this just happen? Talk about the 50 hours. They just had adults go on commercial applications posing as kids, and they got those results in just 50 hours, talking to bots that are commercially available today, or ChatGPT?

JOE ALLEN (GUEST): Yes, correct. And just if you could expand on that a bit. The app was Character AI, correct? And that was the same app that Megan Garcia’s son, Sewell, was using.

SHELBY KNOX (GUEST): That is correct. So this is an app where anyone can create a bot and then users can interact with it. The algorithm generates the responses. So we found a harm every five minutes, grooming, promotion of eating disorders, bullying. Basically, we found that you cannot really have a conversation with one of these bots without something disturbing happening.

JOE ALLEN (GUEST): Did you really have to lure it out of the bot?

SHELBY KNOX (GUEST): No. You know, it is really funny. Character AI recently came out and said that they were going to do some age verification after our research. At the moment I got that news, I was doing a little study with a Travis Kelce bot. And it took three minutes for it to ask me if I would like to go upstairs to his hotel room and do some cocaine. I had told the bot I was 15 years old.

JOE ALLEN (GUEST): And just for context too, the Character AI app right now has something like 20 million users. And that is enormous. But ChatGPT has 800 million users. So how many of those are children? How many of those children are actually being barred or protected? Zero. The scope of this really cannot be overstated. Many millions, tens of millions, maybe hundreds of millions of children are being lured into this.

So in your fight against preemption, and you hear David Sacks call this some sort of moral panic, you know that in the lives that you are touching, that real people are seeing disastrous effects, and you know statistically this is happening. What would you say to David Sacks right now in regard to that?

SHELBY KNOX (GUEST): It is a moral failure of America to refuse to protect our kids in priority of tech company profits. That is not American. That is not who we are. And it is unconscionable.

For more on AI regulation battles, read our previous coverage on WarRoom.org.

Follow Joe Allen @JOEBOTxyz on X

Visit ParentsTogether Action – Fighting for issues that matter to kids and families

The post KOSA Big Tech AI Fight: Shelby Knox Warns Congress on Gutted Child Safety Bills appeared first on Stephen K Bannon’s War Room.


