Controversial Kids Online Safety Act Reintroduced, Raising New Questions About Free Speech and Big Tech

A sweeping online safety bill designed to shield minors from harm on the internet is making its way back through Congress, reigniting a fierce debate over privacy, speech, and corporate accountability in the digital age.

The Kids Online Safety Act (KOSA), first introduced in 2022, was recently reintroduced in the U.S. Senate with bipartisan support. If passed, it would mark the most significant federal regulation of internet content since the Children’s Online Privacy Protection Act (COPPA) was enacted in 1998.

Backed by Sen. Richard Blumenthal (D-Conn.) and Sen. Marsha Blackburn (R-Tenn.), the legislation would impose a “duty of care” on major tech platforms, requiring them to take more aggressive action to protect children from online harms such as eating disorders, suicide, sexual exploitation, substance abuse, and exposure to harmful content.

The bill passed the Senate last year with overwhelming support but stalled in the House. Now, with mounting public concern about teen mental health and the influence of social media, lawmakers are taking another run at turning KOSA into law.

A Growing Push for Online Safety

Supporters of KOSA say the bill is long overdue. They argue that social media companies and digital platforms have failed to adequately protect minors, despite knowing the risks posed by their algorithms and data practices.

“Everyone has a part to play in keeping kids safe online,” Timothy Powderly, Apple’s senior director of Government Affairs, said in a statement released Tuesday. “Apple is pleased to offer our support for the Kids Online Safety Act. We believe this legislation will have a meaningful impact on children’s online safety.”

Apple joined a growing list of major tech companies that have expressed support for the measure. Microsoft, Snap, and X (formerly Twitter) have all voiced approval, with X CEO Linda Yaccarino reportedly contributing input to the latest draft.

Meanwhile, Google and Meta, the parent companies of YouTube, Instagram, and Facebook, remain opposed to the legislation, citing concerns about operational burdens and vague regulatory standards.

Concerns From Civil Liberties Advocates

Despite high-profile endorsements and political momentum, KOSA continues to face resistance from digital rights advocates and civil liberties organizations, many of whom say the bill opens the door to censorship, surveillance, and undue government influence over online speech.

Groups including the American Civil Liberties Union (ACLU), the Electronic Frontier Foundation (EFF), and Fight for the Future have publicly condemned the bill, even after several amendments were introduced to address constitutional concerns.

“The bill’s authors have claimed over and over that this bill doesn’t impact speech,” Fight for the Future said in a statement. “But the Duty of Care is about speech: it’s about blocking speech that the government believes is bad for kids. And the people who will be determining what speech is harmful? They are the same ones using every tool to silence marginalized communities and attack those they perceive as enemies.”

Critics argue that the bill could give state attorneys general sweeping power to demand content removal or moderation, especially in more politically conservative states. They fear that marginalized communities—particularly LGBTQ+ youth—could be disproportionately affected if certain content is deemed “harmful” under subjective or ideologically driven standards.

What’s in the Bill?

Under the current draft, KOSA would:

  • Require online platforms to implement default safety settings for users under 17

  • Establish an obligation to prevent and mitigate harm caused by content related to suicide, self-harm, eating disorders, addiction, and sexual abuse

  • Provide parents and guardians with more control over a child’s privacy settings, time limits, and data tracking

  • Allow state attorneys general to enforce compliance and seek penalties for violations

Supporters say these provisions are common-sense steps to create safer digital spaces for young users. Opponents, however, say the definitions of “harm” and “duty of care” are too broad and could be weaponized.

Public Pressure and Political Stakes

The reintroduction of KOSA comes at a time of heightened scrutiny for Big Tech. Multiple congressional hearings have spotlighted the mental health effects of social media on children and teens, and whistleblower reports have alleged that companies such as Meta were aware of the risks posed to young users by their algorithms but failed to act.

This renewed public pressure is bolstering bipartisan efforts to regulate tech, though lawmakers remain divided on how far the government should go.

“The powerful algorithms these companies use are not neutral,” Sen. Blumenthal said during a recent committee session. “They are designed to keep young users online longer, regardless of the mental or emotional toll. KOSA is a step toward putting responsibility where it belongs—on the platforms profiting from children’s vulnerability.”

As the bill moves forward, its fate remains uncertain. While it enjoys support in the Senate, where it previously passed with strong bipartisan backing, the House remains a tougher sell—especially given free speech concerns raised by civil liberties groups and some Republican lawmakers.

The current draft has undergone revisions to clarify that platforms are not required to remove specific speech, but critics say the threat of legal action could still result in over-moderation or preemptive censorship.

Even so, KOSA’s supporters believe that the tide is turning.

“Tech companies have had years to regulate themselves, and they’ve failed,” said one Senate aide involved in the bill’s drafting. “If we don’t act now, we’re only enabling further harm.”

Whether KOSA becomes law or not, the debate highlights a deeper question that society has yet to fully answer: Who should decide what’s safe for kids online—and at what cost to free expression?
