Thriving is a continuous effort, not a one-and-done achievement. Moderation is how we continue to reaffirm our values and cultivate an ongoing positive experience for our community.
Big takeaways
- As a discipline, moderation encompasses not only content moderation but also conduct and user moderation, and it can be approached in different ways.
- Effective moderation must follow public, transparent standards (a Code of Conduct) in order to reaffirm your community values.
- To make the most of moderation, consider the different ways you can identify misbehavior as well as how you’ll notify your userbase when it occurs.
- For success, moderators should be invited into conversations from the early stages of game development to align values, policies, and game design.
In trust and safety, we often talk about “content moderation” as a key part of our efforts. But these efforts might look extremely different depending on the game or online platform, so it’s critical to have a clear understanding of the types of moderation work we might find ourselves doing.
What is moderation?
In general, moderation is the act of regulating in-game behaviors in order to give your players the kind of experience they originally signed up for. Doing any kind of moderation requires a clear Code of Conduct — not just for your users, but also for your internal use, so that your whole team is aligned on what kind of experience you want to offer. For example, you may want to consider:
- Is your game family friendly?
- How do you feel about intense — but still mutually accepted — trash talk?
- Are any topics off limits, like politics or mature themes?
- If your game involves violence, where do you draw the line on in-game threats that might be scary outside the game?
Types of moderation
Within the space of moderation are several different types of behaviors you may need to concern yourself with. While most people equate “moderation” with “content moderation,” in truth content moderation is just one of a few subtypes of overall moderation.
Content moderation
Content moderation involves looking at individual materials generated by your users and analyzing them in isolation. In video games, this can include things like:
- Individual text or voice messages
- User-created stages (e.g., in Super Smash Bros)
- Custom-built experiences (e.g., on Roblox)
- User-generated content (e.g., a castle built in Minecraft)
All of these can constitute “content.” As a result, effective content moderation can be quite complex, requiring unique tools for each type of media (text, voice, UGC, etc.). But the common thread is that you are reviewing individual pieces of content in isolation — you’re determining whether the content is inherently good or bad.
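To make this concrete, below is a minimal sketch (in Python, with hypothetical type and function names) of how a content moderation pipeline might route each item to media-specific tooling while still asking the same isolated question about every piece of content.

```python
# A minimal sketch, assuming hypothetical check_* tools for each media type.
from dataclasses import dataclass
from enum import Enum


class MediaType(Enum):
    TEXT = "text"
    VOICE = "voice"
    UGC = "ugc"  # e.g., a user-built stage or structure


class Verdict(Enum):
    ALLOW = "allow"
    FLAG = "flag"      # escalate to a human moderator
    REMOVE = "remove"


@dataclass
class ContentItem:
    media_type: MediaType
    payload: bytes  # raw text, an audio clip, or a serialized asset


def moderate_content(item: ContentItem) -> Verdict:
    """Each media type needs its own tooling, but every branch is asking
    the same question: is this single item, in isolation, acceptable?"""
    if item.media_type is MediaType.TEXT:
        return check_text(item.payload)   # e.g., keyword lists or a text classifier
    if item.media_type is MediaType.VOICE:
        return check_voice(item.payload)  # e.g., transcription plus audio models
    return check_ugc(item.payload)        # e.g., asset or image scanning


# Placeholders for whatever text, voice, and asset tools your studio actually uses.
def check_text(payload: bytes) -> Verdict: ...
def check_voice(payload: bytes) -> Verdict: ...
def check_ugc(payload: bytes) -> Verdict: ...
```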
Often, it’s easy enough to make that good / bad determination — adult content is never acceptable in a children’s game, for instance. But in other cases it’s more nuanced — some words that are often used as slurs can have a more positive meaning in certain groups or situations, so defining them as unilaterally bad can actually strip away autonomy from groups that have reclaimed those slurs.
Conduct moderation
With conduct moderation, we’re concerned not with the specifics of what someone did / created / shared, but about how the act of distributing that content impacted the community. Consider:
- Conduct is not inherently good or bad, but becomes good or bad in a given context. For example, was that sexual comment interpreted as flirting (which is acceptable), or as sexual harassment (which certainly isn’t)?
- Conduct is harder to manage than content, because conduct requires you to deeply understand the cultural norms and dynamics within your community. Is someone “boosting” another user because they are genuinely supportive, or as part of a setup for grooming or a complex bullying campaign?
- Detecting conduct issues tends to rely on your users’ responses — either through explicit reporting or by monitoring how they react. For example:
  - Scanning for signs that one of your users is feeling harmed can help you distinguish harassment from welcomed flirting.
  - Scanning for signs that someone is feeling seen and appreciated by their teammates can help you distinguish genuine camaraderie from more sinister relationship-building such as grooming or radicalization.
Both conduct and content moderation ultimately result in you deciding whether to penalize the content (e.g., take it down) and / or the user (e.g., temporarily suspend their account). But they can both inform more nuanced interventions as well. When matchmaking, for example, you may wish to pair your new users with others who have a track record of avoiding offensive behavior (or better yet, who have a history of coaching newbies). In order to do this well, you usually need to start forming an opinion not just about a single piece of content or someone’s conduct in certain scenarios, but about what kind of user, overall, you have on your hands.
User moderation
User moderation involves building an overall user profile (often including a “reputation score” which might be directly influenced by, e.g., user reports of this individual’s misconduct) to understand what sort of player they are, and whether you even want them in your community in the first place.
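As an illustration, here is a minimal sketch of what such a reputation score might look like; the signals, weights, and thresholds are purely hypothetical rather than a recommended formula.

```python
# A minimal sketch of a user reputation score; all weights and thresholds
# here are illustrative assumptions, not a recommended formula.
from dataclasses import dataclass, field


@dataclass
class UserProfile:
    user_id: str
    reputation: float = 100.0  # start each player at a neutral baseline
    validated_reports: list[float] = field(default_factory=list)


def apply_validated_report(profile: UserProfile, severity: float) -> None:
    """Lower the score when a report against this user is reviewed and upheld."""
    profile.validated_reports.append(severity)
    profile.reputation -= 10.0 * severity  # illustrative weight


def apply_positive_signal(profile: UserProfile, weight: float = 1.0) -> None:
    """Raise the score for commendations, coaching newbies, and similar signals."""
    profile.reputation += weight


def is_welcome(profile: UserProfile, threshold: float = 50.0) -> bool:
    """The user-moderation question: do we still want this player in our space?"""
    return profile.reputation >= threshold
```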
Note that this is not necessarily the same question as “Is this user a bad person?” In the physical world, spaces like sports bars exist and “exclude” children, most non-drinking adults, and even sometimes fans of the opposing team. But nobody really argues that these venues are suggesting those groups are “bad people,” and nobody gets angry at the sports bar for making that decision. (Especially because it’s transparent from the get-go what kind of space the sports bar is — if it were pretending to be a kids’ playground, and turned out to be a sports bar on the inside, folks would be much more upset.)
As with the sports bar, the goal for any game is not necessarily to be a space for every single possible user. You might decide your game is meant only for kids, or for adults. Perhaps you want to tell a story or make a space that will speak more meaningfully to those of a certain identity group. Maybe you just want to create a reprieve for those sick of talking politics; or alternately, you want to gamify political debate to bring folks from both sides of the aisle together.
Whatever it is, you’ll need to be prepared to make clear decisions, on an ongoing basis, about whether the sorts of people in your game’s community are the ones you were searching for. A group of kids cannot raid a sports bar and demand it install a swing set; neither should you feel an obligation to cater to anyone who steps into your space.
Defining your moderation methodology
Content, conduct, and user moderation can be approached in different ways (a short configuration sketch follows this list):
- Some games focus on punitive responses to problematic behavior, while others prefer supportive interventions that promote and reward good behavior.
- Some games lean on a reactive approach, where they won’t intervene unless a player submits a report to bring something to their attention, while others focus on proactive techniques to identify unreported harms faster and more consistently.
- Some games prefer to take silent actions (or at least only alert the offending individual), while others choose to notify the harmed users once they’ve taken action, especially after receiving a report from one of those harmed users.
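The sketch below, using entirely hypothetical field names, shows how these three axes could be written down as explicit configuration so your whole team is working from the same methodology.

```python
# A hypothetical configuration capturing the three axes described above;
# the field names and the example values are illustrative assumptions.
from dataclasses import dataclass


@dataclass(frozen=True)
class ModerationPolicy:
    # Punitive vs. supportive: do we only penalize bad behavior,
    # or also promote and reward good behavior?
    reward_positive_behavior: bool
    # Reactive vs. proactive: wait for player reports, or also scan
    # for unreported harms?
    proactive_detection: bool
    # Silent vs. transparent: alert only the offending individual,
    # or also notify the harmed users once action has been taken?
    notify_reporters: bool


# Example: a family-friendly game that leans supportive, proactive,
# and transparent about the actions it takes.
FAMILY_GAME_POLICY = ModerationPolicy(
    reward_positive_behavior=True,
    proactive_detection=True,
    notify_reporters=True,
)
```

Pinning the choices down in one place like this makes it easier for moderators, designers, and community managers to point to the same source of truth.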
Collaborating with other teams for success
All of this moderation work is, itself, dependent on your work more broadly across the T&S landscape. In particular, while moderation takes place after the game has been deployed, it needs to be paired with techniques like:
- Safety by Design — Building games with careful attention to how certain mechanics or options might be misused.
- Expectation-setting — Publishing a clear Code of Conduct.
- Community management — Cultivating a sense of unity and trust between the platform and its players, both in-game and more broadly across game-adjacent spaces.
Let’s talk about how moderation teams can collaborate with each of the above efforts:
Safety by Design
Safety by Design is usually a mentality promoted by product managers, executive producers, and (at larger studios) perhaps a Trust and Safety lead or compliance coordinator. By definition, it comes into play well before the game is live and being experienced by real users. Even in those early days, however, it’s useful for game teams to consider moderation by design as well. For example:
- Building a robust user-reporting flow (which, crucially, is easy for players to utilize) should be handled at the design stage; a sketch of what such a report might capture follows this list.
- For games contemplating user-generated content (UGC), part of Safety by Design should involve considering how easy it will be to moderate problematic content created with those UGC tools.
- Moderators looking to support product teams on Safety-by-Design initiatives can particularly assist by sharing real-world examples of how some functionality can be misused by bad or ignorant actors, as well as advising on the feasibility of things like report and appeal flows.
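As an example of the kind of design-stage thinking described above, here is a minimal sketch of the data a user report might capture; every field is an illustrative assumption about what moderators typically need in order to act quickly.

```python
# A minimal sketch of a user report; all fields are illustrative assumptions.
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class UserReport:
    reporter_id: str
    reported_id: str
    category: str   # e.g., "harassment", "cheating", "inappropriate_ugc"
    match_id: str   # lets moderators pull chat logs or replays for context
    free_text: str = ""  # optional detail from the reporter
    created_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )


def submit_report(report: UserReport) -> None:
    """Designed-in reporting: capture context automatically so the player
    only has to pick a category and, optionally, add a sentence."""
    # In a real game this would enqueue the report for triage; a print
    # statement stands in for that queue here.
    print(f"Report queued: {report.reporter_id} -> {report.reported_id} ({report.category})")
```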
Expectation-setting
Expectation-setting is often handled by legal teams (for the formal Code of Conduct) and marketing teams (for other messaging), but this frequently results in language that’s either too opaque for users to understand, or too generic and showy to really tell them whether this is the kind of space they are looking for. Moderation teams should be looped into discussions around this messaging (as should community managers!), as nobody wants a player to be subjected to a penalty for behavior they genuinely didn’t realize was problematic.
On the flip side, moderation teams can benefit from both legal and marketing expertise when composing explanations for moderated users, detailing why they’ve received a penalty and how to do better next time — games like Apex Legends have found profound benefits from offering such explanations.
Community management
Finally, community management is often a separate team focused on managing social media and community forums, speaking directly to the player community outside the game. These teams are a key complement to moderation teams — while moderation teams provide “source of truth” feedback about what the game wants from its players in-game, community management offers the same kind of guidance outside the game. As a result, it’s critical for community management and moderation teams to work closely together to ensure the guidance each is offering doesn’t contradict the other’s.
What does good moderation look like?
The goal of any moderation effort is to ensure each of your users has a consistent, positive experience in line with your stated values as a platform. So, if you want to know how successful your moderation is, ask how consistently your users are enjoying themselves.
| | Success signs | Warning signs |
| --- | --- | --- |
| Individual | Knows what to expect when logging in; understands why their behavior was a violation when they are punished | Is surprised by the behaviors / norms in your space; feels your moderation is arbitrary or unfairly targets them |
| Group | Trust in the platform to matchmake and moderate efficiently; reinforced norms between players (e.g., “Hey man, that’s not what this space is about”) | Distrust of matchmaking, or heavy use of muting rather than reporting; constant disagreement / debate about who or what the platform is for |
| Community | Organic growth of playerbase and player connectedness; positive user sentiment about the overall experience and about the platform’s moderation practices | Players siloed into small groups of pre-existing friends; continuous complaints about the other players or about the trustworthiness of the moderation |
Now what?
For a deeper discussion of all things content moderation, check out Tarleton Gillespie’s Custodians of the Internet.

Mike Pappas
CEO
Mike Pappas (he / him) is the CEO of Modulate, which works with studios including Activision, Rockstar, Riot, and Rec Room to foster safer and more inclusive online voice chat. Mike also serves on the board of the Family Online Safety Institute and has contributed to several working groups focused on aligning safety regulation with pragmatic best practices for games. Outside of work, his passions include creating themed cocktail lists, long-distance running, and playing Nintendo games.
