Online Safety Concerns Pushing Likely Changes to Section 230


4 minute read | March 21, 2023

Since its inception in 1996, Section 230 of the Communications Decency Act has protected internet platforms from liability for third-party content posted on their platforms. However, changes to Section 230's protections are brewing as the United States (U.S.) government grapples with critical issues raised by this longstanding immunity. Orrick is tracking these developments closely and will provide updates as they unfold.

Congress

On Tuesday, February 14, the U.S. Senate Judiciary Committee convened a hearing on "Protecting Our Children Online." Chairman Dick Durbin and others argued that social media and smartphones can harm children's mental and physical health, with many referencing an uptick in cyberbullying and hate speech online, coupled with the use of algorithms purportedly designed to keep young users logged in. The Committee urged tech companies to "be part of the solution, instead of the problem."

The witness panel was heavily weighted toward those who stressed the importance of tech companies cooperating with efforts to protect children online.

  • Kristin Bride, whose son died by suicide after being cyberbullied through anonymous, harassing social media messages, filed a lawsuit against social media platforms that was dismissed under Section 230. She testified that it should not take parents filing lawsuits to hold companies accountable for their platform design.
  • Michelle DeLaune, President and CEO of the National Center for Missing & Exploited Children (NCMEC), asserted that "it is no longer feasible to rely solely on online platforms to adopt voluntary measures, especially given their near complete immunity for activity on their sites, or to hope that they will design their platforms to avoid precipitating dangers to children from sexual exploitation, enticement, and revictimization."
  • John Pizzuro, CEO of Raven, testified that children remain vulnerable on social media platforms due to poor moderation, the absence of age or identity verification, and inadequate or missing safety mechanisms. But he highlighted that even with those controls, law enforcement lacks sufficient resources, so tech companies, which have them, need to do what they can to keep children safe.
  • Josh Golin, Executive Director at Fairplay, testified that tech companies design algorithms to maximize engagement and profit without, in his view, any regard for users' well-being or potentially harmful consequences. He urged the Committee to enact legislation curbing dangerous and unfair platform design practices.

The Committee also discussed potential upcoming legislation to crack down on Child Sexual Abuse Material (CSAM), including the End Child Exploitation Act, the EARN IT Act, the Kids Online Safety Act, and others.

The Supreme Court

One week later, on Tuesday, February 21, the Supreme Court heard argument in Gonzalez v. Google, a case stemming from the death of 23-year-old Nohemi Gonzalez, an American student killed in an ISIS attack in Paris in 2015.[1] The day after Gonzalez was killed, ISIS claimed responsibility for the attack in a written statement and a video released on an internet platform.

Gonzalez’s father sued three major internet platforms claiming, among other things, that they should be liable for aiding and abetting terrorism by failing to prevent ISIS from using their platforms.

During oral argument in Gonzalez, the Court grappled with the boundaries of Section 230 and whether the statute shields an internet service provider from liability for its algorithm's recommendations of videos created by ISIS. The Justices raised concerns about whether the Internet could be "destroyed" if Section 230 immunity were narrowed and balked at the potential influx of litigation if the Court ruled against the defendant, Google. The Court focused on a potential legal distinction between merely hosting and actively amplifying user content when assessing liability.

The Department of Justice

The Department of Justice (DOJ) has also signaled a need for Section 230 reform. In September 2020, DOJ sent a letter to Congress outlining proposed areas for reform, including carve-outs for bad actors, child abuse, terrorism, and cyber-stalking, as well as for platforms with actual knowledge or notice that content posted on the platform violates federal criminal law.



[1] On February 22, 2023, the Supreme Court heard argument in a similar case, Twitter v. Taamneh, brought by the family of a Jordanian citizen, Nawras Alassaf, who died in an ISIS attack in Istanbul. The main issue in Taamneh was whether the internet platform knowingly provided "substantial assistance" to ISIS under the Anti-Terrorism Act because it arguably could have taken more "meaningful" or "aggressive" action to prevent terrorist acts on its platform. The arguments explored the limits of "substantial assistance," which the company argued should be limited to facilitating the use of the platform to plan, coordinate, and carry out a terrorist attack. The more conservative Justices seemed amenable to the company's arguments, while the liberal Justices pushed back on the idea that the company should be liable only if the help it gave ISIS could be linked to the specific attack that killed Alassaf.