The Kids Online Safety Act (KOSA) has been one of the most debated pieces of tech legislation in years, and in mid-2024 it finally went to a Senate vote, passing 91-3 with overwhelming bipartisan support. That kind of margin is almost unheard of for tech regulation. But the legislation’s broad language has privacy advocates, digital rights groups, and tech professionals deeply concerned about what it actually means in practice — particularly for anonymous speech online.
The bill’s stated goal is straightforward: protect minors from harmful content on the internet. It would impose a “duty of care” on covered platforms, requiring them to act in the best interest of minors and to mitigate specific harms like promotion of suicide, eating disorders, bullying, sexual exploitation, and drug use. Platforms would need to enable the “strongest privacy settings” by default for users under 17 and provide parents with tools to supervise their children’s online experience. The Federal Trade Commission would enforce it.
Sounds reasonable on its face. The problem is implementation.
As CNET reported, critics argue that KOSA could effectively end internet anonymity. To determine who’s a minor and who isn’t, platforms would almost certainly need to verify users’ ages — and meaningful age verification generally requires some form of identity documentation. That means government IDs, facial recognition scans, or third-party verification services that collect sensitive personal data. For everyone, not just kids.
The Electronic Frontier Foundation has been one of the bill’s loudest opponents. The EFF has called KOSA a “censorship bill” dressed up in child-safety language, warning that it would pressure platforms to over-filter content and suppress speech on topics like LGBTQ+ identity, reproductive health, and mental health resources — the very information vulnerable teens often need most. The organization argues that giving government officials the power to define what’s “harmful to minors” creates an obvious tool for politically motivated censorship.
And that’s not a hypothetical concern. Multiple state attorneys general would have enforcement authority under KOSA. In states where officials have explicitly targeted LGBTQ+ content, this provision hands them a legal mechanism to pressure platforms into removing or restricting that material.
Senator Ron Wyden, one of only three senators to vote against the bill, didn’t mince words. He said KOSA would “silence the voices of marginalized young people” and described it as a vehicle for conservative officials to target content they find objectionable.
So what about the age verification piece specifically? This is where things get technically messy. There’s no widely deployed, privacy-preserving method for verifying age online at scale. The most common approaches — uploading a driver’s license, submitting to a facial age-estimation scan, or using a credit card — all involve handing over personal information that creates new attack surfaces for data breaches and surveillance. A centralized database linking real identities to online accounts is a privacy nightmare waiting to happen.
Some proponents point to emerging standards like zero-knowledge proofs or tokenized age credentials that could theoretically verify age without revealing identity. But these technologies aren’t mature, aren’t standardized, and aren’t deployed at anything close to the scale the internet requires. Building that infrastructure would take years and enormous investment, with no guarantee it would actually protect privacy in practice.
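To make the tokenized-credential idea concrete, here is a minimal sketch of how such a scheme could separate age assertion from identity. This is a hypothetical illustration, not any deployed standard: a trusted issuer checks a user's age once, then signs a bare "over 17" claim with no identifying data attached, and the platform verifies only that signature. HMAC with a shared key stands in for the asymmetric signature (e.g. Ed25519) a real system would use, purely to keep the example self-contained.

```python
# Hypothetical tokenized age credential, sketched for illustration only.
# Issuer: verifies age out-of-band, signs a minimal claim bound to a nonce.
# Platform: checks the signature and learns only the boolean claim,
# never the user's identity documents.
import hashlib
import hmac
import json
import secrets

# In a real deployment the issuer would hold a private signing key and the
# platform only a public verification key; a shared HMAC key is a stand-in.
ISSUER_KEY = secrets.token_bytes(32)

def issue_age_token(over_17: bool) -> dict:
    """Issuer side: sign an age claim carrying no identity information."""
    claim = {"over_17": over_17, "nonce": secrets.token_hex(16)}
    payload = json.dumps(claim, sort_keys=True).encode()
    sig = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return {"claim": claim, "sig": sig}

def verify_age_token(token: dict) -> bool:
    """Platform side: accept only if the signature is valid and the claim holds."""
    payload = json.dumps(token["claim"], sort_keys=True).encode()
    expected = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, token["sig"]) and bool(token["claim"]["over_17"])

token = issue_age_token(over_17=True)
print(verify_age_token(token))  # True: age asserted, identity never shared
```

Even this toy version shows why the infrastructure problem is hard: someone still has to run the issuer, users need a way to obtain tokens without building a linkable paper trail, and a production scheme would need unlinkable or zero-knowledge credentials so the issuer cannot correlate token issuance with token use.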
The tech industry’s response has been mixed. Major platforms have publicly supported the general concept of kids’ safety legislation — it’s politically toxic to oppose protecting children. But behind the scenes, companies have raised concerns about the compliance burden and the vagueness of the “duty of care” standard. What exactly constitutes a platform “acting in the best interest” of a minor? The bill doesn’t define that with precision, which means the FTC and state AGs would fill in the blanks through enforcement actions. That uncertainty is a liability headache for any company operating at scale.
KOSA’s companion bill, the Children and Teens’ Online Privacy Protection Act (COPPA 2.0), passed the Senate alongside it. COPPA 2.0 extends existing children’s privacy protections to teenagers up to age 16 and bans targeted advertising to minors. Together, the two bills represent the most aggressive federal attempt to regulate kids’ online experiences since the original COPPA in 1998.
Neither bill made it through the House before the end of the 118th Congress. The legislation stalled, partly due to opposition from digital rights groups and partly because House leadership didn’t prioritize it. But the bills have been reintroduced, and pressure from parents’ groups and bipartisan momentum mean they aren’t going away.
On X (formerly Twitter), discussion around KOSA has remained heated. Digital rights advocates continue to flag the anonymity implications, while supporters — including many parent advocacy organizations — argue that the status quo is failing kids and that perfect shouldn’t be the enemy of good. Senator Marsha Blackburn, one of KOSA’s co-sponsors, has repeatedly posted in support of the bill, framing opposition as siding with Big Tech over children’s welfare.
That framing is effective politically. It’s also reductive.
The real tension here isn’t between protecting kids and protecting Big Tech. It’s between two legitimate concerns: children’s safety online and the right to anonymous speech that has been foundational to internet culture and, in many cases, to the physical safety of vulnerable users. Whistleblowers, abuse survivors, political dissidents, LGBTQ+ youth in hostile environments — these people depend on anonymity. Any legislation that undermines it needs to reckon with those stakes honestly.
For industry professionals, the practical takeaway is this: if KOSA or something like it eventually passes, companies will face significant new compliance obligations around age verification, content moderation, and default privacy settings for minors. The technical and legal costs will be substantial. And the broader implications for how identity works online could reshape the internet in ways that go far beyond kids’ safety.
Worth watching closely. Not just as policy, but as precedent.
The Kids Online Safety Act Could Kill Internet Anonymity — Here’s What You Need to Know first appeared on Web and IT News.