How can civil society help gaming companies build safer experiences?

Two weeks ago, the Kids Online Safety & Privacy Act (KOSPA) passed the United States Senate with resounding bipartisan support by a vote of 91-3. The vote was a rare moment of solidarity across both parties and elicited a statement of support from President Joe Biden, who pointed to “undeniable evidence that social media and other online platforms contribute to our youth mental health crisis”. There is still a long way to passage: the bill must make its way through the House - where its prospects are far from certain - become law, and withstand key legal challenges.
I spoke with NPR about the challenges companies face in addressing some of the harms raised in the legislation, such as cyberbullying: the highly coded and secretive nature of the harm, the disparity between genders when it comes to talking about experiences with cyberbullying, the way offline social dynamics against girls from marginalised communities spread into online interactions, and the need to better understand young boys’ experiences with these harms.
As a team focussed on helping companies, civil society, and governments build more responsibly for young people, we recognize that this is a whole-of-society project. In earlier issues, I’ve written about the need to meaningfully centre youth voices, the role of government in securing children’s rights, and how companies can adopt frameworks for age-appropriate design. Yet we’ve also seen a historic tendency to assign the bulk of responsibility to just one or two of these actors; for a long time, that was parents, caregivers, and educators. Last month’s Senate vote, coupled with robust child safety regulation globally, shows that the pendulum has swung significantly in favour of regulatory bodies urging companies to design better online experiences for children.

But first…

If you were unable to attend the annual Trust & Safety Conference, it was a packed three days of industry practitioners, third-party vendors, and governments discussing how we can advance some of the key safety issues of the day. Citizens & Tech Lab published a great liveblog of one of the panels I participated in: Trust & Safety x Responsible AI: Bridging the Gap. Alongside thoughtful panelists like Henriette Cramer (PaperMoon.ai), Alex Rosenblatt (SafetyKit), and Dave Willner (Stanford Cyber Policy Center), we discussed the challenges of ensuring these fields talk more to one another, the need for companies to understand the safety challenges throughout their users’ life cycles, and the possibility of leveraging generative AI to improve overall safety and user experience across platforms.

Case study: How civil society can support young boys in online gaming

Online gaming is a great case study for how we can employ a whole-of-society approach, leveraging existing research, helping civil society groups understand the complexity of designing products for young people, and supporting companies in ingesting and translating these insights into actionable product changes.
We recently supported a leading child safety NGO in understanding the role they can play in protecting children and young people - specifically boys - and in helping them seek advice and support in the face of sexual abuse and exploitation. The project began with two hypotheses derived from available data:
  1. Pervasive and persistent gender norms, in the physical world as well as in online environments, deter young boys (both children and teens) from disclosing instances where they have been abused or experienced harm, or from seeking support from their community, online platforms, or trusted adults in their lives.
  2. Gaming - in particular multiplayer gaming - offers young boys significant opportunities for engagement and community, including opportunities to build support networks, shift norms around how to process harms they experience online, and incentivise boys to seek help when they are experiencing these harms.
Note: This is an evolving field, and the experiences of non-binary and gender-diverse children and teens in gaming continue to be researched. At the time of this project, we had limited information available to understand their experiences, but we hope this will change, and quickly.
Nearly all teens in the US play video games, and other markets like the UK, South Korea, and Norway see similarly high levels of gameplay. Gaming gives young people a valuable way to build relationships, explore new worlds, and enjoy a good mix of competition and collaboration, regardless of gender. As physical third spaces for young people to meet, connect, and experiment rapidly disappear, more children have turned to online experiences to meet these critical needs.
Yet three in four teens and pre-teens in the US also experienced harassment in these games in 2023, up from 67 percent in 2022. These experiences also spill over into popular discussion servers or forum chats dedicated to different games. Gaming platforms that offer multiplayer experiences build their community guidelines around topics like (a) safeguarding children from exploitation, (b) prohibiting harassment and bullying, (c) prohibiting sexual content, (d) prohibiting physical threats of violence or glorification of violent acts, (e) prohibiting hateful conduct and discrimination on the basis of, among other characteristics, gender, and (f) prohibiting financial threats or scams.
However, most policies and product interventions at gaming companies are gender-agnostic, despite players of different genders both perceiving and perpetrating harm in significantly different ways. This is due to a number of factors:
  • Under-resourced trust and safety teams struggle to keep pace with the evolving needs of a diverse and expanding player base
  • Moderators frequently do not receive sufficient training to recognise abuse as it plays out in different populations, with under-training being a well-known problem in the industry
  • While third-party scaled moderation tools such as AI voice moderation technologies are rapidly becoming more available, they also moderate in a gender-agnostic fashion

What can civil society do about it?

We worked with our client to understand how they can directly support product teams in their game design and iteration, strategise about the most effective business and regulatory arguments for safer gaming experiences, and develop their internal roadmap of projects and initiatives to prioritise to advance their mission of supporting young boys in their online gaming experiences.
One of the broad takeaways from this work is that more data is needed on how children of different genders experience harassment, how they seek help, and how game developers can update their policies and product interventions to develop targeted remediations for abuse.
This is where civil society groups can play a more significant role. There is a chasm between research and policy on gender-based best practices and their integration into the actual gaming experience for children. Instead of waiting for companies to share specific types of data that they may not have (for very valid reasons like data minimisation requirements), civil society can complement companies’ ongoing work by providing specific insights into the lived experiences of young people across different genders and different types of gameplay. Civil society groups with expertise in engaging with children are uniquely positioned to convene youth voices and feedback on their experiences in online games, helping companies better understand the trends taking place across their own player base.
Civil society groups also have the opportunity to become much more specific about the product and policy mitigations they think can support positive online gaming experiences. They can develop gender-specific resources to direct young people to when they are undergoing a harmful experience in a game, and partner with companies and policymakers to provide thoughtful guidance in this space.
The public may assume companies and policymakers are awash with data, insights, and solutions to address safety issues in online gaming, but the reality is quite different. There is a pressing need to better understand how gender-based abuse plays out across online gaming services, signifying a rich open space for civil society organisations to complement the work of companies and policymakers through research and recommendations.

What else I’m reading

Since this newsletter started in February, this list has been getting steadily longer, reflecting the focus on youth tech policy issues across so many different markets. I have been debating making this section a biweekly update to make it more digestible - it would be great to get your thoughts on whether this would be helpful or too much information.

Regulation

Ofcom, the UK regulator, fined TikTok £1.875 million (USD 2.4 million) for failing to accurately respond to a formal request for information about its parental controls safety feature. Ofcom stated that TikTok’s data governance processes had “a number of failings” and that the company had “insufficient checks in place leading to an inaccurate data submission to us.” Ofcom drew a direct connection between the data inaccuracies and its own regulatory work, and has made clear that it will continue to take swift enforcement action against companies that fail to share accurate data with it in a timely manner.
The Department of Justice recently filed a lawsuit against TikTok on the FTC’s behalf, alleging that the company continued to host millions of underage users and collect their personal data even after learning of their presence on the platform. The lawsuit alleges that TikTok did not honour requests from parents who wanted to delete their children’s accounts, and chose not to delete accounts even when it knew they belonged to children under the age of 13.
The UK Information Commissioner’s Office (ICO) has conducted a review of 34 social media and video streaming platforms against the Children’s Code (previously known as the Age Appropriate Design Code). It found that nearly a third of the companies sampled had “poor children’s privacy practices” and has now put them on notice. While the names of the companies on notice were not publicly shared, the full list of companies sampled is visible at the link. The ICO’s decision to withhold the names of the notified companies suggests that it wants to continue being seen as a partner to industry in improving product practices.
The EU AI Act came into force, and Article 28 specifically calls out the potential for AI to be misused for manipulative, exploitative, or social control practices. The Act considers these practices to contradict the rights of the child, which means that any AI system with the potential to enable them is immediately classified as “high risk”. High-risk systems are subject to a rigorous set of obligations, including but not limited to risk assessment and mitigation systems, activity logging and traceability, and appropriate human oversight measures to minimise risk.

In the news

The Financial Times reported that Google and Meta struck a secret ads deal to target teenagers on YouTube, in violation of the policy Google has had in place since 2021 prohibiting personalised ads targeted at teenagers. The companies found a loophole by targeting a category of users with “unknown” demographic data - Google had enough data points from users’ activities to determine that many of those in this “unknown” group were under the age of 18. The story is a great illustration of why new policy announcements are not enough; we need to encourage more transparency around how these policies are implemented in practice.
5Rights Foundation launched a legal challenge against Meta after a police report revealed predators using Instagram to share AI-generated child sexual abuse images. This comes just three months after Meta committed to new generative AI principles, promising to guard against the creation and spread of AI-generated child sexual abuse material.
The Guardian published a thoughtful piece on why children and teens seem increasingly addicted to video games, and the newspaper’s video game editors later published an op-ed on the offline reasons why children may be turning to these games. These two pieces are worth reading together: the first frames gaming addiction as a serious medical condition to be treated and outlines the treatment options available, while the second addresses factors like a lack of adequate mental health support, disappearing third spaces where children and teens can interact, and heavily surveilled childhoods.
Prince Harry and Meghan Markle launched The Parents' Network to provide a safe, free peer support network for parents whose children have been negatively impacted by social media. The initiative advocates for social media platforms to prioritise safety by design and holds that parents alone cannot fully protect their children. I expect the network will eventually tell these children’s stories more vocally and advocate for social media companies to adopt more rigorous standards when designing for children and teens.
After a months-long public campaign warning Meta about the widespread criminal activity of the Yahoo Boys, a financial sextortion group operating on its services, Meta announced that it would ban the group and remove related accounts and pages. Scammers primarily target young teenage boys by pretending to be attractive girls, coercing victims into sending sexually explicit images of themselves, and then threatening to share the photos widely unless the victim pays a ransom. The FBI reports that financial sextortion led to at least 20 teenage suicides between October 2021 and March 2023.
Discord rolled out a teen safety charter developed in partnership with teens to represent the expectations teens have of each other and of Discord. While no new product or policy changes were announced as part of the charter, Discord states that it will use these principles to inform future product and policy improvements. The charter comes as concern grows over predatory groups coercing children into self-harm on Discord servers, predatory men coercing young girls into livestreaming sexual acts on Discord calls, and teachers catfishing students on Discord and grooming them to share self-generated child sexual abuse imagery.
After a shooter attempted to assassinate Republican presidential candidate Donald Trump, players in games like Fortnite, Minecraft, and Roblox took to recreating the scene, including clips of a character taking a shot at a Trump look-alike, accompanied by real audio from the event. I was most interested in Roblox’s policy loophole: players can use Roblox Studio to create an experience without ever publishing it to Roblox. Because these players shared videos of the unpublished gameplay on social media rather than on Roblox itself, the sharing counts as off-platform behaviour, and Roblox only intervenes where off-platform behaviour is particularly egregious. You can still find guidance on how to recreate the assassination scene on Roblox in TikTok videos like this one.