The Moderation Arcana

Category: Resource
Created time: Jan 1, 2024 03:11 PM
The Moderation Arcana is a set of provocations aiming to provide social media platforms with user-centered and research-informed recommendations to improve the design and effectiveness of their flagging and appeals functionalities. These provocations were developed as part of Dr Carolina Are's postdoctoral project at the Centre for Digital Citizens, Northumbria University.
Research has shown that platforms' flagging and de-platforming functionalities are inadequate at tackling online abuse, and that they give malicious actors opportunities to exploit strict platform governance to de-platform users they disagree with. This disproportionately affects marginalised users such as sex workers, LGBTQIA+ and BIPOC users, nude and body-positive content creators, and pole dancers, as well as journalists and activists.
Content moderation fails to take the human experience into account. Being de-platformed from social media often leaves users unable to access work opportunities as well as their communities and networks. Users are designed out of spaces they need – and research has found this has adverse mental health and wellbeing impacts.
To address this issue, this project sources policy design recommendations through round-table workshops with end-users who have lived experience of governance inequalities, designing them back into platform governance by facilitating empathy through tarot card archetypes and through their own solutions.

Background

The report connected with these provocations originates from three two-hour, Zoom-hosted virtual workshops with 15 participants each. One workshop focused on designing policy for the lived experiences of users posting nudity and sex-related content, one on the experiences of LGBTQIA+ user communities and one on the experiences of journalists and activists.
Following the Chatham House Rule and using a World Café framework based on The World Wide Web Foundation’s Tech Policy Design Lab (TPDL) Playbook, participants were invited to reimagine solutions to improve flagging and de-platforming policies, drawing both on findings from my previous CDC project and on their lived experiences.

What is de-platforming and who does it affect?

De-platforming is a form of content moderation, the practice of deleting and/or censoring online content, which is a crucial aspect of platform governance: without it, social media would be unusable.
Through content moderation, social media and internet platforms make curatorial decisions over the visibility of what is posted on their spaces, enforcing rules established via ‘community guidelines’ or ‘standards’, on which a blend of human moderators and algorithms is trained to make decisions.
Content moderation – and particularly automated moderation – has so far disproportionately targeted marginalised users, over-focusing on nudity and sexuality rather than on violence, particularly after the approval of the 2018 United States law known as FOSTA/SESTA: the Allow States and Victims to Fight Online Sex Trafficking Act (FOSTA) and the Stop Enabling Sex Traffickers Act (SESTA).
FOSTA/SESTA created an exception to Section 230 of the US Telecommunications Act, the provision which ruled that social media companies were not liable for what was posted on them.
“You can tell that these platforms have been made and are run by cisgender white people, because that's what influences community guidelines. And while TikTok is a Chinese platform it's influenced by the tech world, which is predominantly mostly cisgender and usually white men. I just wish that there was more of a level playing field: I want big social media platforms that are made by queer people.”

– Interview participant

How has FOSTA/SESTA changed content moderation?

Although Section 230 kept – and at the time of writing still keeps – online services immune from civil liability for the actions of their users, in 2018 FOSTA/SESTA reversed this immunity for content that may facilitate sex trafficking, de facto making platforms liable for at least a portion of what was posted on them[5]. While fighting online sex trafficking may appear to be the best moderation choice, a closer look at the groups campaigning for the approval of FOSTA/SESTA shows the joint bill was pushed into the US Congress by right-wing pressure groups and religious extremists, who used sex trafficking as a cover to push an anti-porn, anti-sex agenda.
As a result, FOSTA/SESTA led platforms to over-censor posts by sex workers, LGBTQIA+, plus-size and BIPOC users, athletes, lingerie and sexual health brands, sex educators, disabled content creators and activists in order to avoid being accused of facilitating trafficking, applying this US legislation to content posted around the world.
The law changed platform governance through increasingly sex-averse community guidelines focused on nudity, sexual activity, and solicitation. This conservative approach to censorship has been linked to platforms’ wish to protect their own commercial interests by overzealously complying with recent legislation[8], and to platform governance’s tendency to replicate offline inequalities by over-sexualising and pathologising content by queer, plus-size, disabled and BIPOC users due to a largely male, heterosexual, white, able-bodied and cisgender workforce.