Online Hate to Offline Behavior: Games Edition

Psychgeist Newsletter is a reader-supported publication. To receive new posts and support my work, consider becoming a free or paid subscriber.
Since the Netflix show Adolescence has gotten us all talking about the power of digital culture, it seems like a good time to return to the ongoing conversation about its impact, particularly when it comes to hateful rhetoric. If you’ve been here for a while, you know that I have long spoken about the challenges of hate and harassment in gaming spaces, specifically the potential impact they can have on all spaces (because “when hate normalizes in one space, it normalizes in all spaces”, a phrase I have said on many a stage).
But let’s start at the beginning. Hate and harassment in online games is a problem because:
  • Normalization of hate spreads beyond gaming. The saturation of hate and harassment in gaming communities has contributed to the normalization of this language within and outside of gaming spaces.
  • Cultural norms drive behavior. The norms that have rooted themselves in many gaming communities hold the potential to (negatively) influence the way people see and interact with the world.
This last point is my focus here.

What We Know About Gaming Culture's Impact

Hate, harassment, and other specific forms of extremist rhetoric (racism, misogyny, anti-LGBTQIA+ sentiment) have become normal, commonplace, and in many cases expected within gaming communities. How do we know this? Because we asked players, and a majority of them said this is the case:
  • 3 out of 4 players have witnessed misogyny, racism, and anti-LGBTQIA+ sentiment in games, and most players agree that the expression of these hateful sentiments is normal.
  • Two-thirds of players have witnessed the expression of white nationalism in games, with 40% of players agreeing that the expression of this kind of rhetoric in games is “normal”.
Other studies have since replicated these findings. But how does this translate into behavioral shifts?

The Radicalization Pipeline

We often talk about the radicalization pipeline, which is the general process of radicalization in a digital (or non-digital) space.
Image from Kowert, R. & Kilmer, E. (2023). Extremism in Games: A Primer. A white paper by Take This.
The radicalization pipeline helps illustrate how players may become radicalized; in the context of our discussion here, specifically through the social communities of games and game-adjacent spaces. As seen in the image above, this pipeline shows how players can be funneled from the general player population through cognitive radicalization processes (changing their thoughts) and on to mobilization, often referred to as behavioral radicalization (changing their behaviors).
However, when discussing the process from online to offline harm, we also need to consider the victim-to-perpetrator pipeline.
Image from Kowert, R. & Kilmer, E. (2024). Trust and Safety Innovation in Games: Building resilient players and safer (digital) playgrounds. DICE: Las Vegas.
When players witness or experience toxicity, one of two things can happen: they stay, or they leave. Some players who stay tolerate the toxicity; they block, mute, report, and move on. However, some players who stay begin to emulate the toxic behavior they have seen or experienced. In fact, recent research has found that the players with the highest levels of engagement, who witness the most verbal toxicity, are also the ones most likely to display verbal toxicity themselves. In this sense, the culture is self-perpetuating. They stay, adopt these behaviors, and then push others out (notably, even if they do not internalize these beliefs or behaviors). The transition from victim to perpetrator in games can occur rapidly and be a cyclical, self-perpetuating phenomenon. For example, studies examining platforms like Minecraft have shown how players gradually adopt more aggressive communication styles over time, even when they initially entered these spaces with positive intentions.
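To make that feedback loop a little more concrete, here is a deliberately crude toy simulation of my own; it is a sketch under made-up assumptions, not a model from any of the research cited here. Neutral players who witness toxicity either leave or start emulating it, while toxic players never revert, so even with modest invented rates the toxic share of a shrinking community only grows.

```python
import random

# Purely illustrative toy model of the victim-to-perpetrator feedback loop.
# All numbers are hypothetical, chosen only to show the dynamic; they are
# not drawn from any of the studies cited above.

random.seed(42)

POPULATION = 1000      # players in a hypothetical community
INITIAL_TOXIC = 0.05   # share who start out behaving toxically
P_LEAVE = 0.15         # chance a bystander who witnesses toxicity leaves
P_EMULATE = 0.10       # chance a bystander who witnesses toxicity emulates it
ROUNDS = 10

# Each player is either "neutral" or "toxic"; players who leave are dropped.
players = (["toxic"] * int(POPULATION * INITIAL_TOXIC)
           + ["neutral"] * int(POPULATION * (1 - INITIAL_TOXIC)))

for round_num in range(1, ROUNDS + 1):
    toxic_share = players.count("toxic") / len(players)
    survivors = []
    for state in players:
        # A neutral player's chance of witnessing toxicity this round scales
        # with how much of the community is already behaving toxically.
        if state == "neutral" and random.random() < toxic_share:
            roll = random.random()
            if roll < P_LEAVE:
                continue              # pushed out of the community
            if roll < P_LEAVE + P_EMULATE:
                state = "toxic"       # victim/bystander becomes perpetrator
        survivors.append(state)
    players = survivors
    toxic_pct = players.count("toxic") / len(players)
    print(f"round {round_num:2d}: {len(players):4d} players remain, "
          f"{toxic_pct:.0%} behaving toxically")
```

The point is not the specific numbers (which are invented) but the shape of the dynamic: when emulation and exit are the main responses and nothing pulls behavior back, the norm compounds on itself while the community shrinks.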

The Psychology Behind the Pipeline

What are the psychological mechanisms behind the victim/bystander -> perpetrator pipeline? There are a few things at play here.

Social Learning Theory: Learning from Norms

Social norms in any particular social context are formed over time and are shaped by three key factors:
  1. How a person thinks they should act
  2. How that person thinks others should act
  3. The capacity of the group to enforce the rules/set the norms
These three factors interact over time to create norms across settings and groups, both small and large.
If hate and harassment in games were considered deviant, you would expect these behaviors to be abnormal; that is, the exception to the norm. The community itself should also be clear and consistent on “how they think they and others should act”; that is, it should not be an expected behavior in the space. But due to its normalization in these spaces, the expression of hateful rhetoric is not often actively or consistently discouraged. Case in point: fewer than 10% of the active player base submit reports of toxic behavior.

The Online Disinhibition Effect

When traditional social controls are weakened or absent (as they often are in anonymous online spaces), individuals may adopt behaviors they would never consider in face-to-face interactions. This idea of “you don’t know me” and “you can’t see me” can lead to a disinhibition effect, which helps explain why people may feel more emboldened to behave in “toxic” or hateful ways.

Interpersonal Factors

A range of interpersonal factors have been identified as being more prevalent in those who choose to engage in online harassment, including hostile sexism, ambivalence towards toxicity, Machiavellianism, and national identity.

The Real-World Consequences of Hateful Behavior in Games

Worst Case Scenario: Escalation to Extremism

In the most severe cases, the normalization of hate speech and harassment in gaming spaces can serve as a pathway to more serious forms of extremism and offline violence.
The radicalization funnel (as discussed above) represents a systematic process through which individuals progress from initial vulnerability to potential violent extremism. This model illustrates how extremist groups exploit digital spaces, including gaming spaces, to gradually escalate individuals' ideological alignment and commitment to extreme beliefs.
Image from Kowert, R. & Kilmer, E. (2023). Extremism in Games: A Primer. A white paper published by Take This.
One example of this can be found in the 2022 study from Koehler and colleagues, which discusses specific case studies of extremist actors systematically targeting young users through Roblox and radicalizing them into extreme-right criminal behavior, including the plotting of a terrorist attack. As discussed in the paper, young people are gradually exposed to extreme-right content and potentially mobilized toward violence through coordinated efforts by radical groups who have identified these platforms as fertile ground for the recruitment and radicalization of vulnerable young minds.
Cynthia Miller-Idriss, a prominent scholar and expert on extremism, radicalization, and far-right movements, recently published an article specifically calling out how the normalization of misogyny in gaming can translate into violent extremism. Calling games “misogyny incubators”, her work delves into how the propagation, and consequent internalization, of this one specific ideology may be the most problematic and insidious. To be clear, misogyny is not specific to games (it is increasingly recognized as a problem across social spaces on the internet more generally), but games are contributing, and in unique ways, to this “new normal.”
Also, to all those people out there reading my newsletter who sometimes think “Rachel is being overly dramatic”, I once again assure you I am not. Recently, a researcher who studies digital extremism and gender-based violence analyzed more than 100 manifestos written by people who carried out mass shootings, stabbings, vehicular attacks, and other acts of ideologically, politically, and religiously motivated extremism. The consistent ideology that ties them all together? Misogyny.

Best Case Scenario: A Toxic Environment for Everyone

Even in the "best case" scenario, we're left with what we have now: a persistently hostile environment that drives away players, particularly those from marginalized communities, and normalizes harmful behaviors that players carry with them into their offline lives. The statistics paint a stark picture of this digital toxicity epidemic: 82% of people who play online multiplayer games have experienced some form of harassment, 83% have encountered some sort of extremist content, and 68% of players report experiencing more severe abuse, including physical threats and stalking. Among adolescents, the problem is equally pervasive, with two-thirds of those aged 10 to 17 experiencing harassment from other players in online multiplayer games.
The psychological toll is severe and measurable. While the most common response to witnessing or being a direct target of these behaviors is feeling angry and/or uncomfortable, more severe consequences have also been noted. 1 in 10 game players reported having depressive or suicidal thoughts as a result of harassment in online multiplayer games, while nearly 1 in 10 (8%) reported having to take steps to reduce the threat to their physical safety. This isn't just about hurt feelings or temporary frustration; it's about real psychological harm that extends far beyond the gaming session.
The burden falls disproportionately on marginalized communities. Women, minorities, and individuals within the LGBTQ+ community are targeted with toxicity at higher-than-average rates. 77% of women gamers experience gender-specific discrimination when gaming, including name-calling, receiving inappropriate sexual messages, gatekeeping, and dismissiveness. Among women who experience abuse, 72% said the abuse was misogynistic, while none of the male respondents reported any gender-based discrimination or abuse.
Perhaps most concerning is how this toxicity becomes normalized and accepted as an inevitable part of gaming culture. Over a third of multiplayer online game players in one study admitted to having bullied others verbally, suggesting that toxic behavior has become so commonplace that players themselves perpetuate it. This normalization is contagious, breeding further acceptance of sexism, homophobia, and other extreme viewpoints.
As mentioned above, the normalization of toxicity has created a dangerous feedback loop where harmful behaviors are not only tolerated but actively reproduced, creating what researchers describe as a culture where toxicity becomes the default rather than the exception. The result is a gaming ecosystem that systematically excludes and harms vulnerable populations while teaching participants that harassment, discrimination, and abuse are acceptable forms of social interaction.
There is also the whole business case situation, for those industry folks who have their business hat on while reading this.

What This Means for the Future

The research is clear: what happens in gaming spaces doesn't stay in gaming spaces. The behaviors, language patterns, and social norms that players learn and reinforce in games have the potential to shape how they interact with the world beyond their screens.
This isn't to say that games are inherently harmful or that all players become perpetrators of harassment. Rather, it's a recognition that these digital spaces are powerful cultural forces that need the same attention and care we give to other social influences. I’ve said it before and I’ll say it again: online and offline is a false dichotomy. The question isn't whether gaming culture influences offline behavior; it's what we are going to do about it.

And yes, I have some thoughts about that too:

  • …and to help get buy-in from those not convinced this is both an ethical and financial imperative in games: The ROI on T&S