Reddit’s I.P.O. Is a Content Moderation Success Story

Credit: Fran Caballero
A decade ago, no one in their right mind would have put “Reddit” and “publicly traded company” in the same sentence.
At the time, Reddit was known as one of the darkest parts of the internet — an anything-goes forum where trolls, extremists and edgelords reigned. Light on rules and overseen by an army of volunteer moderators, Reddit — which once hosted popular communities devoted to nonconsensual pornography, overt racism and violent misogyny, among other topics — was often spoken of in the same breath as online cesspools like 4chan and Something Awful.
Few could have predicted back then that Reddit would eventually clean up its act, shed its reputation for toxicity and go public, as it is expected to on Thursday at a $6.4 billion valuation.
Today, Reddit is a gem of the internet, and a trusted source of news and entertainment for millions of people. It’s one of the last big platforms that feel unmistakably human — messy and rough around the edges, sure, but a place where real people gather to talk about real things, unmediated by algorithms and largely free of mindless engagement bait. Many people, myself included, have gotten into the habit of appending “Reddit.com” to our Google searches to make sure we actually get something useful.
There are a lot of lessons in Reddit’s turnaround. But one of the clearest is that content moderation — the messy business of deciding what users are and aren’t allowed to post on social media, and enforcing those rules day to day — actually works.
Content moderation gets a bad rap these days. Partisans on the right, including former President Donald J. Trump and Elon Musk, the owner of X, deride it as liberal censorship. Tech C.E.O.s don’t like that it costs them money, gets them yelled at by regulators and doesn’t provide an immediate return on investment. Governments don’t want Silicon Valley doing it, mostly because they want to do it themselves. And no one likes a hall monitor.
But Reddit’s turnaround proves that content moderation is not an empty buzzword or a partisan plot. It’s a business necessity, a prerequisite for growth and something every social media company has to embrace eventually, if it wants to succeed.
In Reddit’s case, it’s no exaggeration to say content moderation saved the company.
In its early years, Reddit — much like a certain Musk-owned social network today — styled itself as a free-speech paradise. Its chief executive from 2012 to 2014, Yishan Wong, proudly defended the site’s commitment to hosting even gross or offensive content, as long as it was legal.
But eventually, amid growing scrutiny, Reddit decided that it needed to police its platform after all. The company put in place rules banning harassment and nonconsensual nude images, nuked thousands of noxious communities and signaled it would no longer allow trolls to run the place.
Redditors howled at these changes — and Mr. Wong’s successor as C.E.O., Ellen Pao, was chased out by a horde of angry users — but the company’s pivot to respectability was an undeniable success. Reddit’s image has gradually improved under a co-founder, Steve Huffman, who came back in 2015 to run the site as chief executive, and Reddit was able to build the ad-based business model that sustains it today.
In particular, I want to single out three steps Reddit took to clean up its platform, all of which were instrumental in paving the way for the company’s public debut.
First, the company took aim at bad spaces, rather than bad individuals or bad posts.
Reddit, unlike other social media sites, is organized by topic; users can join “subreddits” devoted to gardening, anime or dad jokes. That meant that once the company made new rules banning hate speech, harassment and extremism, it faced an important question: Should we enforce the new rules user by user or post by post, as new violations are reported, or should we proactively shut down entire subreddits where these rules have been consistently broken?
Reddit, to its credit, chose the harder, more sweeping option. It shut down thousands of offensive and hateful subreddits, attaching culpability not to individual posts or users but to the spaces where toxic things frequently happen, on the theory that online spaces, like offline ones, often develop customs and norms that are hard to dislodge.
Harsh as it was, the approach worked. Years later, when researchers studied these changes, they found that Reddit’s subreddit bans had led to a measurable reduction in overall toxicity on the site. Users who had frequented the banned communities largely either left Reddit entirely or changed their behavior. The toxic spaces didn’t reconstitute themselves, and rule-abiding Redditors got the benefits of a cleaner, less hateful platform.
The second good decision Reddit made on content moderation was to empower an army of volunteer moderators, rather than trying to do it all itself.
Most social media sites are policed in a centralized, top-down way, with either employees or paid contractors doing the daily janitorial work. But Reddit already had thousands of unpaid moderators who were often experts in their forums’ subjects and were passionate about keeping their communities clean and safe.
It tapped those existing moderators to help enforce its new, stricter rules, and built tools to help them root out bad behavior, such as an automated tool known as AutoModerator, which moderators can customize to take certain actions on their behalf. It also allowed them to set rules for their forums that went beyond Reddit’s baseline policies.
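To make the idea concrete: an AutoModerator rule pairs a condition (what a post must match) with an action (what to do about it). Below is a minimal, hypothetical sketch of that pattern in Python; the Rule fields and the check_post function are invented for this illustration, and real AutoModerator rules are written as YAML-style configuration in a subreddit’s settings, not as code.

```python
# Hypothetical sketch of a rule-based moderation check, in the spirit of
# Reddit's AutoModerator. The rule format and names here are invented for
# illustration; actual AutoModerator rules are YAML configuration.
from dataclasses import dataclass


@dataclass
class Rule:
    """A moderator-defined rule: if a post matches, take an action."""
    banned_phrases: list[str]           # condition: phrases to look for
    action: str = "remove"              # action: "remove", "report", etc.
    reason: str = "matched a banned phrase"


def check_post(body: str, rules: list[Rule]) -> str | None:
    """Return the action prescribed by the first matching rule, if any."""
    text = body.lower()
    for rule in rules:
        if any(phrase.lower() in text for phrase in rule.banned_phrases):
            return f"{rule.action}: {rule.reason}"
    return None  # no rule matched; leave the post alone


# Each subreddit's moderators can layer their own rules on top of
# Reddit's site-wide policies.
rules = [
    Rule(banned_phrases=["buy followers", "crypto giveaway"],
         action="remove", reason="spam"),
    Rule(banned_phrases=["kill yourself"],
         action="report", reason="harassment"),
]

print(check_post("Free crypto giveaway, click here!", rules))
# -> remove: spam
```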
At times, Reddit’s decision to empower moderators has been a double-edged sword. Last year, the company faced a revolt from moderators after it made changes to its pricing structure for third-party apps, charging developers more if they wanted access to the company’s data. (Reddit stood its ground, and most of the moderators eventually relented.)
But on the whole, the company has benefited enormously from giving its volunteer moderators wide latitude to create and enforce their own rules.
Finally, Reddit policed behavior rather than morality, and it did so without worrying too much about being seen as capricious or biased.
One impressive thing about Reddit’s approach to content moderation is that — rather than tying itself in knots trying to maintain the appearance of political neutrality and avoid angering right-wing partisans, as executives at Facebook did for years under heavy pressure from the company’s lobbying and policy division — Reddit focused on getting rid of users who were making things worse for other users, regardless of their politics.
This approach was memorialized in a 2018 article in The New Yorker that is required reading for anyone interested in the story of Reddit’s revival. The article showed Reddit employees grappling with tough decisions about barring Nazis, racists and violent ideologues from their platform. As they contemplated these moves, they weighed trade-offs and considered the implications. But they didn’t feel the need to, say, ban a left-wing subreddit for every right-wing subreddit they banned, or hide the fact that they were making what amounted to judgment calls.
“I have biases, like everyone else,” Mr. Huffman, Reddit’s chief executive, told the magazine. “I just work really hard to make sure that they don’t prevent me from doing what’s right.”
I don’t want to paper over Reddit’s flaws. Its users still have plenty of complaints, and the site itself isn’t exactly Disneyland. (Among other things, it hosts one of the web’s largest repositories of porn, a fact that has hurt it with advertisers.) Reddit has also struggled to make money and lags behind larger social media networks when it comes to introducing new features.
But by taking content moderation seriously earlier than many of its rivals, and coming up with a sensible, scalable plan to root out despicable behavior, Reddit was able to shed its image as the sewage pit of the internet and become a respectable (if not yet profitable) public company.
Other social media companies should take note.