Youth Online Safety: Four Bills to Watch in Congress


6 minute read | October 3, 2023

Online safety for children and teens is squarely on the Congressional agenda. U.S. laws aimed at protecting kids online predate the age of social media and the creator economy: the Children’s Online Privacy Protection Act (COPPA), for example, is more than 20 years old and applies only to children under 13. Many state and federal legislators have accordingly been pushing for additional protections for minors.

Federal legislators have introduced (and reintroduced) a plethora of bills with the stated goal of addressing online risks to kids and teens. The four worth keeping an eye on this session are the:

  • Kids Online Safety Act (KOSA);
  • Children and Teens’ Online Privacy Protection Act (COPPA 2.0);
  • Strengthening Transparency and Obligations to Protect Children Suffering from Abuse and Mistreatment Act (STOP CSAM Act); and
  • Eliminating Abusive and Rampant Neglect of Interactive Technologies Act (EARN IT Act).

Two of the Bills Enjoy Broad Bipartisan Support: KOSA and COPPA 2.0

Of these four bills, KOSA and COPPA 2.0 have gained the most traction and broad bipartisan support in the U.S. Senate.

The Senate Commerce, Science, and Transportation Committee unanimously approved both bills in July. That means the full Senate can now consider the bills, although the timing remains unclear. We expect the Senate to make additional changes to both bills, perhaps in an effort to broaden support and facilitate passage through the House.

In its current form, KOSA would require “covered platforms” to provide parents with more disclosures and tools to monitor their children’s internet use. Among other things, covered platforms would have to:

  • Disclose information about personalized recommendation systems[1] and targeted advertising;
  • Publish annual reports about “foreseeable risks” of harm to minors from using the platform; and
  • Refrain from promoting harmful behaviors (e.g., suicide, substance use disorders, eating disorders, bullying), including through content moderation methods (e.g., filtering and blocking), where the covered platform knows or reasonably should know a user is under 17.

KOSA would also empower state attorneys general to sue operators of covered platforms, including websites and apps, based on their subjective belief that hosted material may harm young people. Opponents argue that such a provision would perversely empower states to engage in viewpoint discrimination, including by censoring content on politically controversial topics (e.g., LGBTQIA+ content).

If passed, the law would have consequences far beyond the tech giants. It would likely require a wide array of interactive video games, streaming services, and other websites and applications to age-gate their services.

In its current form, COPPA 2.0 would apply to any “operator of a website, online service, online application, mobile application, or connected device” and would:

  • Extend the law’s privacy protections to teens ages 13 through 16 (the current law only covers minors under 13);
  • Prohibit operators from collecting minors’ and teens’ personal information without authorization from the minor’s parent or from the teen, respectively;[2]
  • Ban targeted advertising directed at minors; and
  • Create an “eraser button” allowing users, and parents of minor users, to request the removal of a minor’s personal information when technologically feasible.

Two Bills Remain Stalled: STOP CSAM and EARN IT

The STOP CSAM and EARN IT bills, by contrast, presently lack bipartisan support and remain stalled in the Senate Judiciary Committee. That said, the language of these bills provides an indication of the areas of perceived online risk on which legislators are focusing.

The STOP CSAM Act would require service providers subject to 18 U.S.C. § 2258A—i.e., providers with a statutory duty to report child sexual abuse material (CSAM) to the National Center for Missing and Exploited Children (NCMEC)—to submit annual reports describing efforts to promote a culture of safety for children on their services.

It would also broaden the CSAM reporting regime by requiring providers to report actual knowledge of any facts or circumstances of potential child exploitation offenses that are planned or imminent. (Under current law, this reporting is discretionary.) The bill would also provide a variety of tools to promote compliance, including civil penalties for failing to report, remove, or preserve CSAM or to comply with the bill’s annual reporting requirement.

The STOP CSAM Act also would create a criminal provision prohibiting the use of online services to “promote or facilitate” online child sexual exploitation. (The bill, however, does not define what it means to “promote or facilitate.”) Opponents argue that expanding the scope of potential liability gives online service providers strong incentives to over-moderate and remove user-generated content, which is likely to block lawful, constitutionally protected speech.

The EARN IT Act would amend Section 230 of the Communications Decency Act to remove its protections when the content at issue is CSAM. Under the EARN IT Act, a provider’s use of encrypted services cannot serve as an independent basis for liability, but courts may still consider evidence about whether and how a provider employs end-to-end encryption as part of the analysis. Opponents argue that the risk of encryption being used as evidence against providers would discourage them from offering it altogether.

The EARN IT Act would also add particularity to Section 2258A’s reporting obligations, specifically requiring identification of available facts and circumstances “sufficient to identify and locate each minor and other involved individual(s).” The bill would also impose longer preservation duties on providers for reported material. A company’s knowing failure to abide by these reporting requirements could result in a civil penalty of up to $150,000 for a first-time violation and up to $300,000 for each subsequent violation.

The Bottom Line

Congress is clearly focused on online privacy and safety for children, and changes to the civil (and likely criminal) legal landscape seem inevitable. Should any of these bills pass, their competing requirements to collect and verify minors’ age, location, and consent data while protecting minors’ privacy (and, in some cases, refraining from collecting their data at all) would raise thorny implementation questions for tech companies.

Companies with any sort of online presence, particularly those with users under 18, should stay vigilant and factor these legislative headwinds into how they design, execute, and scale their services so they can adjust quickly if needed. Orrick’s U.S. Online Safety Working Group is tracking proposed federal and state legislation, as well as relevant litigation, in real time.

Want to know more? Contact one of the authors.



[1] Defined as a “fully or partially automated system used to suggest, promote, or rank information based on the personal data of users.”

[2] Under the bill, before personal information is collected, the minor’s parent or the teen must “freely and unambiguously authorize” the collection. Operators must make “any reasonable effort (taking into consideration available technology)” to ensure that the minor’s parent or the teen authorizes collection.