How Polarization Is Turning Disagreement into Danger

I’ve spent the last 20 years trying to promote constructive disagreement and dialogue between groups in conflict. Early in graduate school, I studied polarization: how it leads people to support violence in pursuit of moralistic goals, and how that violence incites retaliatory radicalization in its targets. And, the cycle continues.
Eventually, with Ravi Iyer and Jon Haidt, I co-founded a non-profit to help community groups intervene before the radicalization went too far and people became too polarized and entrenched with their co-partisans. Somewhere along the way in the mid-2010s, a radio show host asked me whether the political violence we were seeing in other countries could make its way to the US. I said “absolutely,” and hoped I was wrong. Some of the other panelists laughed and dismissed my statement as alarmism; it wasn’t.
At the time, though, the temperature was significantly cooler. Sure, people argued over politics, unfriended each other, spent less time at Thanksgiving dinner in ideologically divided places, and, on rare occasions, threatened each other’s lives. Those occasions are becoming more frequent and pervasive. The number of federal charges for threatening a public official more than tripled between 2013 and 2022. The 2023 and 2024 data are still being analyzed, but they already show record-breaking numbers of such charges.
Image from the Combating Terrorism Center’s Sentinel.
This alarming trend correlates well with the rising number of Americans who condone the use of violence to save their country. From 2021 to 2023, the number of Americans condoning political violence rose 53%. Today, 23% of Americans agree that they may have to resort to violence in order to get the country back on track. Perhaps even more alarmingly, this sentiment is even more common among gun owners: 42% of assault rifle owners, 44% of recent gun purchasers, and 56% of people who “always or nearly always carry loaded guns in public” support using violence for political ends. A related study found that a staggering 58% of Republicans who voted for former President Trump in 2020 and deny the election results support using political violence. To be clear, though, this isn’t limited to gun owners or people on the political right. Twenty-five percent of Democrats also consider violence justified to advance political objectives.
But, this story isn’t about Democrats and Republicans, or gun owners and non-gun owners. It’s about how the temperature is rising across social divides globally, and how the heat affects regular, unwitting people who happen to get targeted by culture warriors. These regular people have no interest in engaging in political fights, but their lives or work may be too close to something that conveniently fits some political narrative. And, every narrative needs a good villain.
Renee DiResta, former research manager at the Stanford Internet Observatory, is one of these regular, unwitting people. She cares about researching a societally important topic: harmful experiences on social media. One harmful experience she studies is misinformation, a topic that triggers some of the more politically extreme and sensitive people who believe that social media sites use that label to censor conservative political speech. Believing this claim requires ignoring that conservative figures tend to get far more views on Facebook and Twitter (now X) than liberal figures. Regardless, those studying misinformation are made into boogeymen (or boogey-people?) and viewed as a threat. The fact that Renee’s summer job in college was at the CIA (doing entry-level, non-sensitive work) only made her a more convenient villain for this story. Over time, the narrative spread from fringe political actors all the way to the US House of Representatives and Jim Jordan’s “Select Subcommittee on the Weaponization of the Federal Government.” Onlookers may have taken this growing attention as legitimizing the (incorrect) claim that DiResta was a threat; some proceeded to send her death threats.
This isn’t an isolated event. Kate Starbird, another disinformation researcher and director of the University of Washington’s Center for an Informed Public, has also been targeted and threatened. People like Renee and Kate are subject matter experts and skilled researchers, but they have no control over what multi-billion- or trillion-dollar companies decide to do with misleading content. I was literally on the Civic Integrity team at Meta, and we didn’t even have the power to demote or remove this content. We could make recommendations to the decision-makers far above our pay grade, but they made up their own minds. Too often, in my opinion, they opted to leave the harmful content online.
For a short while during the COVID-19 pandemic, the company cracked down harder on health-related misinformation. I was one of the researchers on an ad hoc team that would regularly brief the White House Coronavirus Task Force. It was exciting. We studied ways to identify health misinformation, like dangerous and ineffective “treatments” proposed by people with large followings and no medical expertise, and then changed weights in the ranking algorithm to reduce the visibility of that content. This work literally saved lives.
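To make the mechanics concrete, here is a minimal sketch of how that kind of demotion can work: a classifier scores each post, and posts above a confidence threshold have their ranking score scaled down so they surface less often. Everything here, the names, the threshold, and the penalty factor, is an illustrative assumption, not Meta’s actual system.

```python
from dataclasses import dataclass

# Illustrative sketch only: names, thresholds, and penalty values are
# assumptions for this example, not Meta's actual ranking system.

MISINFO_DEMOTION_FACTOR = 0.2  # hypothetical penalty: flagged posts keep 20% of their score
FLAG_THRESHOLD = 0.8           # hypothetical classifier confidence required to demote


@dataclass
class Post:
    post_id: str
    base_score: float     # engagement-based ranking score from upstream models
    misinfo_score: float  # classifier's estimated probability of health misinformation


def ranking_score(post: Post) -> float:
    """Demote (rather than remove) likely misinformation by scaling its score down."""
    if post.misinfo_score >= FLAG_THRESHOLD:
        return post.base_score * MISINFO_DEMOTION_FACTOR
    return post.base_score


posts = [
    Post("miracle-cure-video", base_score=95.0, misinfo_score=0.93),
    Post("cdc-guidance-post", base_score=80.0, misinfo_score=0.02),
]
for post in sorted(posts, key=ranking_score, reverse=True):
    print(post.post_id, round(ranking_score(post), 1))
# cdc-guidance-post 80.0
# miracle-cure-video 19.0
```

The key design choice is demotion rather than removal: the content stays online, but the algorithm stops amplifying it.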
Yet one culture-warrior data center technician was upset when he read about our experiments testing the best ways to implement the model we built. He decided to leak our unredacted reports and our personnel profiles to Project Veritas, and to discuss it all on Alex Jones’s, Sean Hannity’s, and Tucker Carlson’s shows, and probably many other outlets. In the days that followed, anti-vaccine extremists made good use of our personnel profiles and started stalking our team of about six people. One of the engineers on our team owned property (a matter of public record) and had a kid in public school, which made the engineer easy to find. The extremists went to the kid’s school at the end of the day and tried to prevent the engineer from picking up their progeny. After that happened, the company’s global security team decided that we each needed security details. They had a team that patrolled my block, monitoring for suspicious people. If I wanted to go out, I had to inform them so they could follow me to ensure my safety. I’m grateful that they took my security seriously, but the fact that I needed security at all is terrible.
As a lifelong peacemaker committed to trying to understand people who believe and think differently than I do, I gained firsthand experience that made me less interested in peacemaking with them. Everyone has the right to believe whatever they want, but nobody has the right to accost someone trying to pick up their kid from school or to threaten another’s life. I’m still fascinated by how people get this far down the radicalization path; I just don’t have the energy or motivation to build relationships with them.

What’s in the pipeline?

  1. The Washington Post rejected our op-ed on meaningful transparency, so I resubmitted it to the New York Times… though I suspect events over the weekend will warrant a rejection.
  2. As part of my work with the Integrity Institute, I’ve been working on guidance for the UK’s online safety regulator (Ofcom) regarding how to approach children’s access to online services. In the process, I’ve read a great deal on age assurance and age verification systems, and I’ve started sketching out a future post on these topics.
  3. I’ve been invited to join the Knight-Georgetown Institute Expert Working Group on recommendation system optimization. We will be writing a white paper to help lawmakers better understand how ranking and recommendation systems work on social media and in search.
  4. I finished analyzing the Neely Social Media Index data from Poland and drafted a summary report. I’ll share that as soon as the powers that be at the Neely Center sign off on it. I’m also excited to share that we are in the process of collecting data from Kenya, and we are approved to collect data in Somalia. If you’re interested in partnering with the Neely Center to collect data on social media experiences in another country, please do let me know.
  5. As part of the Platform Data handbook I’m building with some colleagues, we are including a databrary containing links to all of the publicly available social media data we can find. It’s a work in progress, but it should give you a taste of what is to come in the coming weeks. If you know of data that we haven’t yet included, please message or email me.

What I’m reading