The ROI on T&S

What is the return on investment for trust and safety? In 2024, “The ROI on T&S” was the name of my GDC talk, and in 2025 it was the theme of the workshop I held at DICE. At the upcoming Trust and Safety Summit in the UK, there is a keynote panel and multiple other panels on this topic. Why are we seeing an uptick in these conversations? Why are we now seemingly united in our efforts to make a clear monetary argument for why we should be prioritizing making the (gaming) world a better place? And does this mean I’m a trendsetter?
I first erected the ROI on T&S shaped soapbox back in 2023, when Dr. Elizabeth Kilmer and I held a series of Roundtables at GDC as part of a two-year research project seeking to better understand the social dynamics, community building, and moderation efforts in online games. What we discovered from those roundtables was unexpected:
The good: Nearly everyone we spoke to expressed a want to make online games safer spaces with lower rates of what we often refer to as “toxicity.”
The bad: The social leaders in the space (community managers, moderators, social media folks) were in dire need of resources to better foster resilient communities, mitigate harm, and protect their own mental health.
The reality: Most of the people we interviewed shared stories where they had previously tried and failed at getting buy-in for innovation or prioritization of trust and safety initiatives due to a lack of a business case.
And thus, my quest appeared. Challenge accepted. How do we make a business case for trust and safety innovation?
Fast forward two years later… and I have some answers (but not all of them).

What we know about the ROI on T&S

Games are not safe and the lack of safety in games translates into behavioral shifts for players that have bottom line consequences for companies.

The social spaces in and around games are not safe

In December 2024, an editorial was published on Slate entitled “Silent but Deadly.” In this piece, the author talks about how we all used to play online games for the community. He says, “I met some of my closest friends through multiplayer games… then a strange thing happened… everyone turned (literally) speechless.” He continues, “With online games, we wanted to conquer the world, and we wanted to do it together”… but now everyone is muted because they don’t want to engage at all, because the social communities can be kind of awful (paraphrasing that last part).
And he’s not wrong. We know that hate and harassment are seen as normalized experiences in games, because they are normalized. In my own research, I have consistently found it is more the norm than the exception to be a direct target and/or witness of various forms of toxicity in gaming spaces, including sexual harassment, hate speech, and threats of violence. And while that is an objectively awful set of statistics to throw out there, it is even worse when you see what that actually looks like and feels like for players (for a glimpse, I highly recommend checking out the “Through Her Eyes” campaign from Maybelline… yes, the cosmetics company).

A lack of safety negatively impacts company reputation, player engagement, and direct revenue streams

If players do not feel safe, they lose trust. And if they lose trust, you lose engagement.
In this context, player trust refers to the trust they have in the publisher, platform, community manager, or moderator, to effectively and reliably protect them from hate and harassment (or address it quickly and consistently when it does happen).
By player engagement, I mean both play time (the amount of time players invest in your game or on your platform) and brand evangelism. Brand evangelism is the extent to which someone evangelizes about your brand, which is operationalized as purchase intentions (“I’m going to buy all the things”) and positive referrals (“I’m going to make everyone else buy all the things too”).
Here is an example of brand evangelism: Me and the Witcher.
[Image: my friend Sarah at her wedding, holding up the Witcher book]
I won’t shut up about this brand. I won’t shut up about this show. I will buy all the things and make other people buy all the things too. Yes, that is a picture of my friend Sarah at her wedding holding up the Witcher book I published. My love for it knows no bounds.
However, when you lose trust, you lose engagement, and that includes losing evangelism. Am I going to buy the Liam Hemsworth Funko Pop for the new season of the Witcher? Unlikely. Because trust was broken (Doug and Henry will always be Geralt to me!). That is an example of lost money.
However, lost trust from players translates into a range of losses beyond broken fangirl hearts.

Loss of Reputation

7 out of 10 players report they avoid playing certain games because of the reputation of the community.
That is, 70% of players report they will not even give your game or platform a chance if it has a reputation for having a toxic community. Reputational damage in and of itself is an important loss to consider, but it is also important to understand that it is not necessarily a prerequisite for a loss of revenue. Historically, there has been an assumption that financial loss due to toxicity came only after reputational damage, but new research shows that is not the case. Even if your brand isn’t seen as particularly toxic, you could still be suffering financial loss because of the nature of the community. It is worth noting that this paper also found that both the financial and reputational damage driven by toxicity can be mitigated by the presence of a game moderation system. When players felt supported by the platform and had the policy, tools, and knowledge to use them effectively to mitigate harm (to be resilient), the impact of toxicity on reputational and financial damage was reduced.

Loss of Engagement

6 out of 10 players report they have quit a match or game permanently because they were subjected to hate and harassment in that community.
So seven out of ten won’t play your game, but if they do play it and experience hate and harassment, six out of ten are going to leave.
Interestingly, we also see this 6 out of 10 reflected in the out-of-game community building around games with toxic social spaces.
[Chart: Player XP data showing toxicity (yellow) and community size (orange) over time for APEX]
This comes from Player XP, an organization that uses machine learning to monitor and analyze player feedback and sentiment through community posts. Toxicity is shown in yellow, indicating the scope and intensity of toxicity in the chatter around the game; this is based on keyword and sentiment analysis, along with categories like obscenity, sexism, abuse, and homophobia. In orange, we see community size in terms of the number of active users discussing APEX on the third-party forums they monitor (e.g., Discord, Reddit, YouTube, X, Steam). You can see that when toxicity increases in scope and intensity, community size decreases in terms of the number of people discussing the game. Looking specifically at May 2023, as toxicity jumps, we see community size decrease by almost half. Some may say close to six out of ten?
What I find really interesting about this example is that it shows objective data – server-driven data – corroborating the subjective data we are generating from player responses: a significant number of people are disengaging from gaming communities – both in and out of game – because of toxicity.
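As an illustration of the general technique (and emphatically not Player XP’s actual, proprietary pipeline), a keyword-based toxicity signal over community posts can be sketched in a few lines of Python. The categories and keywords here are purely hypothetical placeholders:

```python
# A toy sketch of keyword-based toxicity scoring over community posts.
# Real systems like Player XP's combine this with sentiment models and ML;
# these categories and keywords are illustrative placeholders only.

TOXIC_CATEGORIES = {
    "obscenity": {"trash", "garbage"},
    "abuse": {"uninstall", "loser"},
}

def toxicity_score(posts):
    """Fraction of posts containing at least one flagged keyword."""
    flagged = 0
    for post in posts:
        words = set(post.lower().split())
        if any(words & keywords for keywords in TOXIC_CATEGORIES.values()):
            flagged += 1
    return flagged / len(posts) if posts else 0.0

posts = ["gg well played", "uninstall the game loser", "nice clutch"]
print(round(toxicity_score(posts), 2))  # 0.33
```

A production system would layer sentiment analysis, context, and human review on top of anything this naive; the point is only that “scope and intensity of toxicity in the chatter” is a measurable quantity you can track against community size over time.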

Loss of Revenue

6 out of 10 players report that they choose not to spend money in a game because of how other players have treated them in that community.
In addition to avoiding communities and leaving communities, six out of ten players also said they chose not to spend money AT ALL because of how other players treated them in that community. Given that in-game purchases are projected to drive a significant amount of games’ revenue (97% of mobile gaming revenue and 27% of console revenue), lost spending directly undermines core monetization models.
How much money are they not spending? A lot.
In 2023, Constance Steinkuehler published a paper looking at spending patterns across “toxic” and “non-toxic” gaming communities and found quite a disparity. Monthly spend on toxic games averaged $12.09. For non-toxic games? $21.10. That is a 54% difference. So when we talk about players spending less money and what companies are losing (or what companies stand to gain from better mitigating toxicity in their spaces), we are talking about a significant amount of revenue.
It is also worth noting that Constance would be the first to tell you that this might not be the best metric, but we have to start somewhere.
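For the curious, the 54% figure checks out if you read it as the gap relative to the mean of the two spend values (that reading is my inference, not a formula stated in the paper). A quick sanity check of the arithmetic:

```python
# Sanity-checking the spend gap between "toxic" and "non-toxic" communities.
# Reading 54% as the gap relative to the mean of the two values is my
# inference about how the figure was computed, not the paper's stated method.
toxic_spend = 12.09      # average monthly spend, "toxic" games ($)
non_toxic_spend = 21.10  # average monthly spend, "non-toxic" games ($)

gap = non_toxic_spend - toxic_spend
pct_diff_vs_mean = gap / ((toxic_spend + non_toxic_spend) / 2) * 100
uplift_vs_toxic = gap / toxic_spend * 100

print(f"gap: ${gap:.2f}")                        # gap: $9.01
print(f"vs. mean: {pct_diff_vs_mean:.0f}%")      # vs. mean: 54%
print(f"uplift over toxic: {uplift_vs_toxic:.0f}%")  # uplift over toxic: 75%
```

However you slice it, non-toxic communities saw roughly nine more dollars per player per month – about 75% more spend than their toxic counterparts.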

…and no it isn’t just targets of harassment who are leaving.

Now it is at this point where people usually say to me, “Yeah, but it is just the women who are leaving.” Which, first of all - rude - we need to care about the women; we make up about half the game-playing population. But second of all, actually, no. Male players and players under the age of 18 are the most likely to take action against hate and harassment by reducing spending and engagement due to a toxic social environment.

What we don’t know about the ROI on T&S: Lingering Questions

While the last ten minutes of your life may have led you to believe we have the ROI on T&S all figured out, that is not the case. There are a lot of lingering questions. While we can quantify some of the bottom-line costs, we still do not know how best to measure trust and safety success in games, the roadblocks to prioritization of these initiatives, or what existing innovation we should be drawing inspiration from.
These are questions that I tried to answer this year at DICE alongside Dr. Elizabeth Kilmer, where we hosted a workshop on trust and safety innovation. Specifically, we were looking to unravel the mysteries behind successfully measuring and operationalizing the ROI on T&S and community resiliency efforts, the roadblocks teams face in this process, and the successful strategies for overcoming those roadblocks. While we did not discover all the answers in our 90 minutes together, we did glean some insight into these lingering questions about all the things we don’t yet know about the business case for trust and safety.

What are the KPIs we should be looking at to better operationalize T&S efforts and quantify success?

KPIs for the ROI centered largely on engagement and player sentiment, but brand reputation also came up from quite a few groups. Measurements of community resilience were less clear, likely speaking to the lack of clarity around how to both foster and empower communities to be resilient to disruptions. There was a lot of chatter around community rewards and reprimands, some discussion of engagement (AFK players versus community champions/high-engagement players), and suggestions around surveying moderators and community managers about their subjective experiences and assessments of the community.
Determining the metrics for how to operationalize and measure T&S success is the first (very necessary) step to making a clear business case for the importance of these initiatives.
An external source of inspiration for what these KPIs might look like can be drawn from the International State of Safety Tech Report from late last year, which mapped the ROI on T&S onto five key metrics: improved operational performance, reduced operational expenses, increased revenue and engagement, network impact, and risk mitigation. While these might not map 1:1 onto every gaming space or platform, they are a great starting point for articulating what it is we want to specifically measure and impact with our (product, policy, and educational) efforts.
[Image: the five ROI metrics from the International State of Safety Tech Report]

What are the roadblocks to prioritizing and innovating trust and safety in games and what are the strategies to overcome them?

Participants at the workshop identified several roadblocks and strategies to prioritizing trust and safety innovation.
Surprising no one, a common roadblock was executive sentiment and a lack of integration across teams. There were also more technical factors noted, like players circumventing the tools you do put in place by misusing them or simply making new/multiple accounts. The good news? These roadblocks are relatively universal, so if we can find a strategy to get around, over, and/or through them, we should be able to implement it across many studios and contexts.
So what are those strategies? Perhaps my favorite post-it from the whole session was the one that said “Step 1: Start, Step 2: Iterate”. Really it is about trying something. Investing enough time, money, and energy to try something. And if you are going to try something, perhaps it could be one of these sticky-note suggestions that came from the group:
  • Proactively build your community with T&S in mind (Um, yes. PREACH)
  • Engage with content creators, work with them to channel a message about toxic environments
  • Engage your community management team
  • Focus on authentic member engagement
  • Preventative measures - thinking about T&S from the beginning
  • Setting examples internally
This sticky note, all by its lonesome, also caught my eye: “Reframe what community means, support community managers and other community leaders.”
This one really struck me because it is important to recognize that these big conversations about trust and safety in games are relatively new and largely uncharted territory (at least in the context of games specifically). Games used to be a product that you buy, take off a shelf, and enjoy when your older brother is away at camp or sleeping, because he never let you play Day of the Tentacle otherwise. Wait, maybe that was just me (if it was you too, you may enjoy Chapter 4 of the latest edition of Well Played).
But over time, games became social platforms (not social media, there is a difference). And there is some level of responsibility that we have to take as an industry for the communities we are creating. Part of taking that responsibility includes tooling and policies to keep players safe. But the other part is thinking about how we can invest in our community leaders to create more resilient communities. In the end, this will translate into more fun and engagement for players and, given everything I’ve said thus far, more profitable companies.

What are existing innovations in trust and safety in games (and how can we all implement them)?

Perhaps the best news of this whole post is that, as an industry, we are actually seeing more innovation than ever before, both in refining the tools we already have and in thinking outside the box to create something new.
In fact, in celebration of that, I published a piece in Wired in January of this year talking about the innovation we are seeing across the gaming industry. In it, I highlighted some of this work, such as the Digital Thriving Playbook and Cabined Accounts from Epic Games, along with other sources of innovation I didn’t have the opportunity to mention (#wordlimitproblems).
While all of this innovation is amazing to see and worth celebrating, it is also worth noting that it is still largely happening in silos (the Digital Thriving Playbook being an exception, as that is a multi-stakeholder project). We are losing a lot in the ecosystem. We need to work together to figure out what is working in one space, what is not working in another, and whether it makes sense for industry-wide innovation and application.
For example, the implementation of AI voice moderation in Call of Duty has been incredibly successful for the Call of Duty community and Activision. In 2023, they published a report saying that after the implementation of their voice moderation they saw an 8% reduction in repeat offenders, a 50% reduction in exposure to severe instances of disruption, and enforcement action against more than 2 million accounts for voice-chat violations of their updated code of conduct.
In October 2024, they released an updated report finding not just that exposure to toxicity dropped (giving us safer communities) but also that 80% of players issued a voice chat enforcement warning did not reoffend. On top of that, Activision confirmed that player retention improved, as did the overall experience and levels of engagement in online multiplayer play. This is hard evidence – real numbers – demonstrating that trust and safety innovation is a growth point.

What does this mean and where do we go from here?

Failing to effectively protect players from hate and harassment translates directly into lost revenue.
Companies that fail to address toxicity are hindering their growth in an industry where positive player experiences increasingly influence consumer choice.
(yes, you can quote me on that)
By not prioritizing trust and safety, games and platforms may struggle to attract new audiences, including newer player groups (the under-18s), and may hinder growth among male players (what many consider the core demographic of games). And while innovation and prioritization of trust and safety is also becoming more of a business necessity in light of new legislation, compliance is not enough; over time, a failure to secure safe gaming environments will shrink market share relative to competitors who invest in trust and safety.
Safer communities do not mean no harm. They mean having the systems (policy, tooling, resources, and education) to mitigate harm as best we can, and for players to be able to use those systems to recover quickly when harm does occur. People want to feel safe in their play spaces, and they don’t. And when they don’t, they aren’t spending their time or money there… and in many cases they aren’t even giving you a chance to earn their time and money.
Remember the editorial I mentioned at the start, “Silent but Deadly”? Its author ends his piece the same way I want to end mine today. He says:
“I may not have solutions, but I do know that the world is a better place when a dwarf and an elf can meet in the whirring gears of a multiplayer server and…become best friends. Surely, we can’t be so far away from reclaiming that dream.”
Together, let’s reclaim that dream.
Psychgeist Newsletter is a reader-supported publication. To receive new posts and support my work, consider becoming a free or paid subscriber.

Other places you can find my musings…

  • Personal Website, for more about my research and upcoming events. Which I recently updated! Take a look if you haven’t been there in a while (#OpenToWork)
  • Psychgeist, my YouTube channel dedicated to the science of digital games. Subscribing is FREE and is the easiest way to support my work by increasing engagement and making the algorithm happy.
  • BlueSky, for those who have migrated over