Summary:
- Lawmakers are pushing the Kids Online Safety Act (KOSA) to rein in social media harms to children, reviving debate over how online speech is regulated.
- The bill would require major platforms to change how they design and operate products used by minors.
- Supporters call the legislation overdue; critics warn of unintended consequences for online speech and content moderation.
Lawmakers have spent years searching for a way to rein in social media companies over the harms faced by children online. Now, with renewed momentum behind the Kids Online Safety Act, they may be closer than ever to doing so.
Whether they are also opening the door to a broader shift in how online speech is regulated remains an open question.
The bill, known as KOSA, would require major online platforms to change how they design and operate products used by minors. That includes limiting certain recommendation features, offering stronger parental controls, and taking steps to reduce exposure to content tied to self-harm, eating disorders, and substance abuse. The Senate version of the bill would also allow state attorneys general to enforce those requirements.
"KILL IT!!!!!! CALL YOUR REPRESENTATIVE TODAY and tell them:
— NO to KOSA
— NO to Section 230 reform
— NO to the SCREEN Act
— NO to the App Store Accountability Act
— NO age verification laws
— NO to digital ID systems" https://t.co/aQzyx9GOwO
— Taylor Lorenz (@TaylorLorenz) December 17, 2025
Supporters describe the legislation as overdue. Sen. Richard Blumenthal, a Connecticut Democrat and one of the bill’s sponsors, has said current laws were written before algorithm-driven platforms became central to daily life and that companies should not be allowed to ignore foreseeable risks to young users.
Technology policy groups see something else.
KOSA does not directly amend Section 230 of the Communications Decency Act, the 1996 law that shields online platforms from liability for user-generated content. But critics argue the bill creates a parallel legal framework that could weaken those protections in practice.
Section 230 has long allowed platforms to host massive volumes of user posts without being treated as the publisher of that content. Courts have also interpreted it to protect moderation decisions, even when those decisions are controversial. Legal scholars often credit the law with enabling everything from comment sections to social media feeds.
The concern around KOSA centers on its duty-of-care standard. Under the bill, platforms could face legal exposure not for what users say, but for how product features and algorithms surface content to minors.
That distinction matters less in the real world, critics say, than it does on paper.
The Electronic Frontier Foundation has warned that companies facing vague liability standards are likely to restrict lawful content to avoid risk. In practice, that could mean limiting discussions around mental health, sexuality, or identity, topics that are often flagged by automated systems even when presented in educational or supportive contexts.
Advocacy groups focused on LGBTQ youth have raised similar alarms. Researchers at the Brookings Institution have noted that online safety policies can unintentionally cut off access to communities and information for young people who rely on the internet for support, particularly in states with hostile political environments.
The debate has already reshaped the bill.
Earlier this month, House Republicans stripped the duty-of-care language from their version of KOSA during committee negotiations, narrowing the bill’s scope. Blumenthal criticized the move, saying it removed the accountability mechanisms that gave the legislation teeth.
Other lawmakers defended the change, arguing that the original language posed constitutional risks and would invite litigation. They said a narrower bill stood a better chance of surviving court challenges and gaining bipartisan support.
Behind the scenes, technology companies have intensified lobbying efforts. Industry representatives have warned lawmakers that unclear standards could force platforms to disable personalization features or impose aggressive age-gating measures that raise privacy concerns.
After Congress carved out exceptions to Section 230 in 2018 with the passage of FOSTA-SESTA, several platforms removed entire categories of content and shut down forums rather than risk liability. Civil liberties groups later documented the loss of online spaces used by marginalized communities.
if Congress repeals Section 230 or passes KOSA, the internet is screwed
— advaith (@advaithj1) December 14, 2025
KOSA’s supporters say the comparison is misplaced and insist the bill targets product design, not speech. Whether courts see it that way remains uncertain.
As Congress continues to debate how to regulate technology companies, KOSA has become a test case. Lawmakers are trying to address real harms to children without unraveling the legal framework that has governed the internet for nearly three decades.
So far, they have not agreed on where that line should be drawn.
