Five Cautionary Notes For Successful Implementation of the DSA for Children’s Best Interests

Anne Collier is the founder and executive director of the nonprofit Net Safety Collaborative, and Ioanna Noula is the founder of the Children’s Online Redress (COR) Sandbox.
With the European Union’s Digital Services Act (DSA) now in force, the focus has shifted to meaningful implementation. This phase creates an opportunity to turn stakeholders’ intentions into transformative outcomes. Those of us whose work focuses on the well-being of the Internet's youngest users are noticing potential obstacles in the path toward effective enforcement of the DSA's provisions for children's rights to safety, privacy, and security. We argue that these obstacles can be addressed with practicable guidelines that maximize agility and consistency in regulatory practices throughout the EU.

1. A fundamental disconnect

There is a glaring disconnect between the DSA’s imaginary of childhood and the view of childhood expressed in the Charter of Fundamental Rights of the European Union, which the law invokes. The disconnect is evident in the first sentence of Article 28, and assuming that this article by itself captures the DSA’s child redress requirements would significantly hamper rights-based implementation. By referring only to children’s rights to safety, privacy, and security, Article 28 focuses on their vulnerability and protection rights, omitting the strengths-focused rights of expression and participation spelled out in the Charter.
Prioritizing protection over participation rights risks regression in upholding children’s rights – at a time when many stakeholders have been calling for innovation in regulatory practice around Article 12 of the United Nations Convention on the Rights of the Child (on children’s right to have a say in matters that affect them). Implementing the law so that the full range of minors’ rights is upheld remedies the disconnect by ensuring that Internet users under 18 participate in developing the policy decisions that concern them.
Article 28 should also not be the sole focus of child-focused compliance, because the rights and best interests of the child are referenced elsewhere in the DSA as well, including in relation to risk assessments, statements of reasons for moderation decisions, service design appropriate to children’s ages, and requirements for parental controls.

2. Trusted by children, trusted flaggers are key

The DSA’s provision for trusted flaggers (Article 22) is vital to child protection and redress and deserves more attention in implementation discussions. To date, the EU’s list of accredited trusted flaggers comprises 10 organizations, only three of which have expertise in child protection. A lack of funding and clear direction is the primary deterrent for organizations that could take on this role. With regard to the protection of minors online, we propose that entities accredited as trusted flaggers fill a gap in online redress by connecting the other redress parties – e.g., platforms, ombudspersons, and regulators – to the details and offline context of the individual complainant’s case. They must be the entities that, as their title indicates, are entrusted by the EU to competently report harmful content.
Just as important, they are also trusted by the young people, parents, and care providers who contact them. Trust between youth and those to whom they report is essential for conveying the offline context of the content being reported – context that, by the nature of the Internet, platforms and regulators simply do not have and yet need in order to take effective action. Among the most suitable candidates for the role of trusted flaggers where children are concerned are the helplines operating throughout the EU. To fulfill this role, trusted flaggers need resources not only to align with DSA requirements but also to receive training in platform Terms of Service and in how content moderation systems work at scale.
For example, they can help children and their caregivers make reports of harm actionable for platforms, which treat most such reports as “false positives” or as non-actionable under their terms of service. Any lack of clarity around credentialing, training, and resourcing trusted flaggers undermines the help they can provide to children (complainants), platforms, ombudspersons, and other helpers. In an ideal world, reporting systems would be standardized across all platforms, but until that happens, both children and platforms need entities they can trust for the system to work – platforms for reliable offline context on reports of harm, and children for trained offline support. In fact, that need will not go away even after standardization happens.

3. Diversity of DSCs’ expertise

Across EU countries, different types of entities with different forms of expertise have been assigned the role of Digital Services Coordinator (DSC), based on each government’s priorities and how it views the Internet’s impact on social and political rights, media, telecommunications, consumer protection, and competition. With the exception of Ireland – which established an Online Safety Regulator and passed the Online Safety and Media Regulation Act (OSMR), complementing the DSA’s requirements and adapting them to national priorities – most EU states have added the DSC portfolio to pre-existing roles that reflect minimal understanding of the complexity of children’s online care and redress (e.g., telecommunications regulators or consumer protection authorities). That variation promises inconsistency and delay in regulatory practice across the EU and potentially places an undue burden on the regulators who are versed in children’s online protection, practices, and rights.
The law’s implementation would benefit if the European Board for Digital Services (composed of member states’ DSCs) attended to the challenges this diversity of backgrounds and knowledge presents and educated its members accordingly, with youth participation included. Research and coordination are needed to ascertain what DSCs need to know and where training is required to implement the child-focused aspects of the DSA effectively.

4. Digital literacy and parental tools are not a panacea

Parental tools and digital literacy education for children have been the primary focus of child online safety discussions and legislation at local, state/provincial, and national levels the world over – not only in the DSA, of course – and they are essential but not sufficient. To improve well-being and reduce harm, and to develop effective curricula and parental tools, all parties to the discussion need to be educated about the actual harms youth face, how young people use technology, and the degree to which technology can enhance their safety and privacy. This education, invaluable for all, happens through research insight, direct exposure to youth perspectives, and multi-stakeholder discussion.
Both education and tech tools are necessary components of a larger collection of essential resources for care and redress, including parental tools, content moderation, trusted flaggers, regulation, research, law enforcement, and ombudspersons and other offline mediation providers. It is helpful to consider how each fits into the prevention/intervention equation to grasp the full spectrum of care and redress. Parental control tools and education in the form of media literacy, digital literacy, and social-emotional literacy typically fall on the prevention side of the equation.

5. Accountability on the part of all stakeholders

To ensure effective implementation of the Act, transparency and independent monitoring and evaluation mechanisms need to be put in place for the practices of all parties to child online care and redress, not only those in the private sector. Assessing the efficacy of guidelines and procedures, as well as the performance of platforms, trusted flaggers, regulators, and redress practitioners, will increase the accountability and value of each party and of the entire system of care and redress. Introducing agility into regulatory design through collaborative multi-stakeholder discussion and innovation that includes youth – via a regulatory sandbox, we propose – will help all parties learn from one another, keep up with developing technology, help the Trust and Safety community keep pace with emerging harms, and empower children and their care providers.
* * *
These cautionary notes are offered in the spirit of maximizing the legislation’s potential for good where children and young people’s interests are concerned. We hope they help bring regulation closer to responses and redress that are timely, readily accessible, equitable, and worthy of young users’ trust – now that social media is 20 years old and the global discourse on child online safety is even older. Do others feel the urgency we feel about this?