Shifting Trends in the Interpretation of Section 230 – New Strides in Social Media Liability

Written by Hannah Hapeman

Credited as “the law that enabled the rise of social media,” Section 230 of the Communications Decency Act (47 U.S.C. § 230) protects online platforms from liability for content posted by their users.

The first subsection guarantees that a provider of an interactive computer service will not be treated as the “publisher or speaker” of information provided by another content provider. Another subsection shields providers from liability for decisions to “restrict access to or availability of material” that the provider deems “obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable,” whether or not that material is constitutionally protected. The law contains some exceptions, such as a carve-out for intellectual property claims, but courts have generally interpreted its protections broadly.

In recent years, the 1996 law has come under increasing scrutiny from both sides of the political spectrum. Democrats have alleged that by immunizing online companies, the law allows platforms to ignore the spread of inaccurate and dangerous information. Republicans, on the other hand, take issue with the provision that allows social media companies to remove content they deem objectionable. Much of the debate around Section 230 focuses on social media companies. Notably, at the time of the Act’s passage, social media did not exist. Lawmakers appear to be growing more open to revising Section 230. In March, the heads of Facebook, Google, and Twitter testified before the House Energy and Commerce Committee about their content oversight practices and were met with strong pushback from both sides of the aisle.

In 2020, the Supreme Court declined an opportunity to hear a Communications Decency Act case. Section 230 has returned to the headlines this month, however, following a Ninth Circuit decision. On May 4, the United States Court of Appeals for the Ninth Circuit held that the Communications Decency Act did not protect Snap, Inc., the creator of “Snapchat,” from a negligence action. The case was brought by the parents of two boys who died in a high-speed car accident. At one point during the drive, the car reached a speed of 123 mph. The boy in the passenger seat had opened the Snapchat app shortly before the fatal crash.

The parents sued the social media company, alleging that the app’s “Speed Filter,” which allows users to record how fast they are moving, incentivized young drivers to drive at high speeds. The parents alleged that Snap knew or should have known of this danger. The U.S. District Court for the Central District of California dismissed the case, holding that the Communications Decency Act barred the claim. In reversing that decision, the Ninth Circuit emphasized that the parents’ claim sought to hold the app maker responsible solely for its own architecture: “that the app’s Speed Filter and reward system worked together to encourage users to drive at dangerous speeds.” The court concluded that granting immunity under Section 230 was improper, as the claim neither treated Snap as a “publisher or speaker” nor sought to hold it responsible for content provided by another user.

Following the decision, law professor Jeff Kosseff noted a “growing divergence in how courts treat [Section 230] challenges” and predicted that the Supreme Court might grant certiorari in a Communications Decency Act case if given another opportunity. In the meantime, other California cases will test the application of the ruling.

One such case was brought by the mother of a teen who took his own life after months of bullying in the form of anonymous messages sent via the “Yolo” app. The teen’s mother alleges that the service providers “violated consumer protection law by failing to live up to their own terms of service and policies.” The complaint, filed in the Northern District of California, further alleges that apps with anonymous messaging “facilitate bullying to such a degree that they should be considered dangerous products.”

Sources

Ryan Tracy, Social Media’s Liability Shield Is Under Assault, THE WALL STREET JOURNAL, Nov. 27, 2020.

47 U.S.C. § 230 (1996).

Ashley Johnson and Daniel Castro, The Exceptions to Section 230: How Have the Courts Interpreted Section 230?, INFORMATION TECHNOLOGY & INNOVATION FOUNDATION, Feb. 22, 2021.

Anna Salvatore, Supreme Court Declines to Review Case on Section 230 (For Now), LAWFARE, Oct. 14, 2020.

Lemmon v. Snap, Inc., 995 F.3d 1085 (9th Cir. 2021).

Bobby Allyn, Snapchat Can Be Sued Over Role In Fatal Car Crash, Court Rules, NPR, May 4, 2021.

Sam Dean, A Teen Who Was Bullied on Snapchat Died. His Mom is Suing to Hold Social Media Liable, LA TIMES, May 10, 2021.
