Five Section 230 Cases That Made Online Communities Better

from the you-need-230-on-that-wall dept

The House Energy and Commerce Committee is holding a hearing tomorrow on “sunsetting” Section 230.
Despite facing criticism, Section 230 has undeniably been a cornerstone in the architecture of the modern web, fostering a robust market for new services, and enabling a rich diversity of ideas and expressions to flourish. Crucially, Section 230 empowers platforms to maintain community integrity through the moderation of harmful content.
Given that, it’s somewhat surprising that the proposal to sunset Section 230 has garnered Democratic support: Section 230 has historically empowered social media services to actively remove content that perpetuates racism and bigotry, thereby protecting marginalized communities, including individuals identifying as LGBTQ+ and people of color.
As the hearing approaches, I wanted to highlight five instances where Section 230 swiftly and effectively shielded social media platforms from lawsuits that demanded they host harmful content contrary to their community standards. Without Section 230, online services would face prolonged and costlier legal battles to uphold their right to moderate content — a right guaranteed by the First Amendment.

Section 230 Empowered Vimeo to Remove ‘Conversion Therapy’ Content

Christian Pastor James Domen and Church United sued Vimeo after the platform terminated their account for posting videos promoting Sexual Orientation Change Efforts (SOCE) (i.e. ‘conversion therapy’), which Vimeo argued violated its content policies.
Plaintiffs argued that Vimeo’s actions were not in good faith and discriminated based on sexual orientation and religion. However, the court found that the plaintiffs failed to demonstrate Vimeo acted in bad faith or targeted them discriminatorily.
The District Court initially dismissed the lawsuit, ruling that Vimeo was protected under Section 230 for its content moderation decisions. On appeal, the Second Circuit upheld the lower court’s dismissal. The appellate court emphasized that Vimeo’s actions fell within the protections of Section 230, particularly noting that decisions about content moderation are at the platform’s discretion when conducted in good faith. [Note: a third revision of the Court’s opinion omitted the Section 230 analysis; even so, the case remains a prominent example of how Section 230 secures the early dismissal of content-removal cases.]
In upholding Vimeo’s decision to remove content promoting conversion therapy, the Court reinforced that Section 230 protects platforms when they choose to enforce community standards that aim to maintain a safe and inclusive environment for all users, including individuals who identify with LGBTQ+ communities.
Notably, the case also illustrates how platforms can be safeguarded against lawsuits that may attempt to reinforce the privilege of majority groups under the guise of discrimination claims.
Case: Domen v. Vimeo, Inc., No. 20-616-cv (2d Cir. Sept. 24, 2021).

Section 230 Empowered Twitter to Remove Intentional Dead-Naming & Mis-Gendering

Meghan Murphy, a self-proclaimed feminist writer from Vancouver, ignited controversy with a series of tweets in January 2018 targeting Hailey Heartless, a transgender woman. Murphy’s posts, which referred to Heartless as a “white man” and labeled her a “trans-identified male/misogynist,” clearly violated Twitter’s guidelines at the time against mis-gendering.
Twitter responded by temporarily suspending Murphy’s account, citing violations of its Hateful Conduct Policy. Despite this, Murphy persisted in her discriminatory rhetoric, posting additional tweets that mocked transgender identities. This pattern of behavior led to a permanent ban in November 2018, after Murphy repeatedly engaged in what Twitter identified as hateful conduct, including dead-naming and mis-gendering other transgender individuals.
In response, Murphy sued Twitter alleging, among other claims, that Twitter had engaged in viewpoint discrimination. Both the district and appellate courts held that the actions taken by Twitter to enforce its policies against hateful conduct were consistent with Section 230.
The case of Meghan Murphy underscores the pivotal role of Section 230 in empowering platforms like Twitter to maintain safe and inclusive environments for all users, including those identifying as LGBTQ+.
Case: Murphy v. Twitter, Inc., 2021 WL 221489 (Cal. App. Ct. Jan. 22, 2021).

Section 230 Empowered Twitter to Remove Hateful & Derogatory Content

In 2018, Robert M. Cox tweeted a highly controversial statement criticizing Islam, which led to Twitter suspending his account.
To regain access, Cox was required to delete the offending tweet and others similar in nature. Cox then sued Twitter, seeking reinstatement and damages, claiming that Twitter had unfairly targeted his speech. The South Carolina District Court, however, upheld the suspension under Section 230. In other words, actions taken on third-party content, such as content removal and account termination, fall wholly within the scope of Section 230 protection.
Like the Murphy case, Cox v. Twitter emphasizes the importance of Section 230 in empowering platforms like Twitter to decisively and swiftly remove hateful content, maintaining a healthier online environment without getting bogged down in lengthy legal disputes.
Case: Cox v. Twitter, Inc., No. 2:18-2573-DCN-BM (D.S.C.).

Section 230 Empowered Facebook to Remove Election Disinformation

In April 2018, Facebook took action against the Federal Agency of News (FAN) by shutting down their Facebook account and page. Facebook cited violations of its community guidelines, emphasizing that the closures were part of a broader initiative against accounts controlled by the Internet Research Agency (IRA), a group accused of manipulating public discourse during the 2016 U.S. presidential elections. This action was part of Facebook’s ongoing efforts to enhance its security protocols to prevent similar types of interference in the future.
In response, FAN filed a lawsuit against Facebook which led to a legal battle that centered on whether Facebook’s actions violated the First Amendment or other legal rights of FAN. The Court, however, determined that Facebook was not a state actor nor had it engaged in any joint action with the government that would make it subject to First Amendment constraints. The court also dismissed FAN’s claims for damages under Section 230.
In an attempt to avoid Section 230, FAN argued that Facebook’s promotion of FAN’s content via Facebook’s recommendation algorithms converted FAN’s content into Facebook’s own content. The Court didn’t buy it.
The FAN case illustrates the critical role Section 230 plays in empowering platforms like Facebook to decisively address and mitigate election-related disinformation. By shielding platforms that act swiftly against entities that violate their terms of service, particularly those involved in spreading divisive or manipulative content, Section 230 ensures that social media services can remain vigilant guardians against the corruption of public discourse.
Case: Federal Agency of News LLC v. Facebook, Inc., 2020 WL 137154 (N.D. Cal. Jan. 13, 2020).

Section 230 Empowered Facebook to Ban Hateful Content

Laura Loomer, an alt-right activist, filed lawsuits against Facebook (and Twitter) after her account was permanently banned. Facebook labeled Loomer as “dangerous,” a designation she argued was both wrongful and harmful to her professional and personal reputation. Facebook’s classification of Loomer under this term was based on its assessment that her activities and statements online aligned with behaviors that promote or engage in violence and hate.
Loomer challenged Facebook’s decision on the grounds of censorship and discrimination against her political viewpoints. However, the Court ruled in favor of Facebook, citing Section 230 among other reasons. The Court’s decision emphasized that as a private company, Facebook has the right to enforce its community standards and policies, including the removal of users it deems as violating these policies.
Case: Loomer v. Zuckerberg, 2023 WL 6464133 (N.D. Cal. Sept. 30, 2023).
Jess Miers is Senior Counsel to the Chamber of Progress and a Section 230 expert. This post originally appeared on Medium and is republished here with permission.

from the not-found-league dept

Are you an NFL fan? If you are, are there particular teams or games you want to watch? The obvious answer to that second question would be “yes,” though the answer to whether you’ll actually be able to watch those games is much less obvious and much more convoluted. It depends on which team, which game, which day, and which device. That’s because the NFL product has become so impossibly fractured across all kinds of broadcast and streaming partners that you have to wade through a labyrinth just to figure out if you’re subscribed to the right product for a particular game.
We talked about this earlier this year when the NFL put a single game exclusively on Peacock. That brand-new experiment didn’t go terribly, but it didn’t go great either. By streaming-event standards, the numbers it drew were huge. It was also surprising to learn that more people than I would have expected kept their brand-new Peacock subscriptions for weeks and weeks after the game. On the other hand, the game didn’t draw anything like the numbers NFL games typically do on broadcast television or on streaming services that closely resemble it, such as YouTube TV.
Meanwhile, you can currently watch NFL games on cable (if you’re in the right geographic area), on YouTube TV, on Amazon for Thursday night games, on Peacock, on ESPN Plus, or on the NFL’s own NFL+ (but only if you’re okay watching on a phone or tablet). And now, on Christmas Day for the next several years, you’ll have to be a Netflix subscriber to watch those games as well.
When it comes to televised live events, the NFL is obviously king. If you look at the top 30 televised events for 2023, you will notice that 29 of them are NFL games. So it’s obvious why any streaming partner would want a piece of the game to attract those eyeballs to their platforms.
But at some point the product has to become so fractured that it hurts viewership, right? This cannot go on forever. And for all the fanfare around the Peacock exclusive game last year, there was also a lot of anger and frustration from folks who thought it was quite shitty that they had to sign up for yet another streaming service just to watch a single game due to the NFL’s insatiable appetite for cash.
But if the NFL is hoping to eventually unify its streaming rights, it won’t be able to do so until 2029.
While true, at some point NFL fans are just going to want to know where they can watch the damned games. Having seven to ten services to subscribe to and/or navigate just to find a particular game is an opportunity cost that will eventually have an effect.