Can You Kick the Trolls Out Of Your Online Forum? U.S. Supreme Court to Decide - Internet Society

Should the governments of Texas and Florida decide whether and how online discussion sites can moderate their posts?
Let’s say you run an online community about the town you live in, and someone starts posting off-topic messages about some other town or city. Should you be able to remove those messages? Or if you have a Facebook Group discussing cars, and someone starts posting about politics against your community rules, should you be able to remove those posts? Or if you set up a Mastodon instance specifically for people who want to post about dogs, and someone starts posting about cats, again against the community rules, should you be able to remove those messages?
Today this kind of content moderation is possible—and is critical to building strong online communities and providing a positive user experience online.
But if the governments of Texas and Florida get their way, those states—and probably 48 other U.S. state governments—will be able to impose their own moderation rules on online discussion sites.
A case before the United States Supreme Court, combining two cases known as Moody v. NetChoice and NetChoice v. Paxton, targets the laws passed in Texas and Florida. Today the Internet Society filed an amicus brief supporting the request to have the two state laws overturned.

Content Moderation Enables a Better Internet for All

We firmly believe that the Internet empowers individuals globally to speak, share ideas, learn, and connect. Online discussion sites—including social media platforms and other online communities—are among the important ways this happens.
Content moderation provides a better user experience, enables safe discussions, removes harmful posts, and prevents threats. All of these things encourage individual participation on the Internet. Almost all online communities, forums, subreddits, or groups establish some kind of community guidelines, conventions, or rules that provide boundaries for discussions.
Without this kind of moderation, online conversation sites have in the past—and would in the future—become less useful and less productive places to engage in conversation. We have seen that completely unmoderated forums rapidly fill up with posts from scammers, spammers, trolls, and those seeking to feed into outrage and chaos. We’ve also seen forums fill up with malware and fraudulent posts that endanger people’s security. Very often, all of this leads users to give up on the sites.
Without some form of moderation, online communities die.

The Far-Reaching Impact of the Texas and Florida Laws

The Texas and Florida laws were passed because their governments believed that online platforms were censoring certain political points of view. The laws seek to eliminate or strictly limit a platform’s ability to moderate content.
However, while the laws are written to target large platforms such as Meta and X, the statutory language reaches much farther than the largest commercial platforms. All platforms generally seek to grow, and smaller platforms such as Mastodon or Bluesky might suddenly find themselves needing to comply. In fact, the broad definitions in these laws could mean they wind up applying to sites such as Wikipedia and StackExchange, which most people would not consider social media platforms at all.
We join with many other organizations and companies across the Internet in asking the U.S. Supreme Court to invalidate these laws.
A decision from the Court upholding the Texas and Florida laws would have global implications, because millions of users around the world rely on platforms run by U.S.-based companies. If the laws are allowed to stand, other countries may view the ruling as license for more extensive regulation, and it would weaken the United States’ authority to speak out against extreme regulation of speech on social media platforms elsewhere. It would also embolden other states and regions, in the U.S. and in other countries, to enact their own regulations, creating a nightmare of compliance and ultimately stifling and suppressing online conversations.
The Internet is at its best when individuals worldwide can connect with each other and safely and usefully share information and ideas. Content moderation is one key piece of creating thriving online communities.

Please share this post and our amicus brief with others so that they are aware of these cases. We do not yet know when in 2024 the U.S. Supreme Court will hear arguments about these cases. We will provide updates as soon as we know more.

Image credit: Mark Fischer
Disclaimer: Viewpoints expressed in this post are those of the author and may or may not reflect official Internet Society positions.