Considering New York's Stop Addictive Feeds Exploitation for Kids Act

Tim Bernard / Jun 26, 2024
June 21, 2024 - Albany, New York: Gov. Kathy Hochul signs S.7694A/A.8148A establishing the Stop Addictive Feeds Exploitation (SAFE) For Kids Act to require social media companies to restrict addictive feeds on their platforms for users under 18. Source
Last Thursday, Gov. Kathy Hochul signed two bills into New York State law, both aimed at the protection of children from online harms. These bills are amongst hundreds considered—and several passed—by US states over the last few years, including age-appropriate design codes, laws requiring parental approval for use of social media, and legislation governing access to pornographic websites.
The focus of this article is the Stop Addictive Feeds Exploitation (SAFE) for Kids Act (S7694), relating to social media; the New York Child Data Protection Act (S7695B) is a law with broader scope, instituting limitations on the collection, use, and sharing of the data of online service (or connected device) users who are minors between 13 and 18 (for under-13s, the law stipulates that the rules on user data are as defined by the federal COPPA law). For these users, a short list of “strictly necessary” purposes is defined, and services must gain “informed consent” (as defined in the bill) for collection, use, or sharing of data outside of these purposes. Notably, the standard for establishing the age of a user is actual knowledge, relying on user declarations or signals from the user’s device or browser (it should be noted that the latter is not yet a common standardized technology, though some commentators have proposed it as a preferable model for age assurance). These strictures do, however, also apply to services primarily directed at minors, regardless of actual knowledge of any individual user’s age.

Inside the SAFE for Kids Act

The SAFE for Kids Act appears to take the approach of a few previous bills: rather than instituting a whole code imposing a hodgepodge of risk assessments, specific safety features, and generalized responsibilities, it zeroes in on one aspect of social media that the legislators considered harmful: the potential for addiction, instantiated here in algorithmic feeds and nighttime notifications. While there have long been critiques of such feeds because of the harmful content they serve to minors, the focus here (perhaps in light of the First Amendment) is solely on their potential to be “addictive.”

Who is a minor?

As with the New York Child Data Protection Act, the standard for establishing who is a minor, which for this law means under 18, is actual knowledge. However, platforms are required to implement the restrictions unless the “operator has used commercially reasonable and technically feasible methods to determine that the covered user is not a covered minor.” The definition of “commercially reasonable and technically feasible methods” is one of the key issues that will require clarification through rulemaking by the NY attorney general before the law’s impact can be evaluated.

Restrictions

  1. Content feeds that rely on the user’s data (such as their prior engagement with content, and seemingly their demographics or known interests too) for their ranking are prohibited for minors. This will force social media companies to offer these users only feeds that are chronological or ranked on non-personalized criteria, along with simple searches. (While some platforms, most famously TikTok, rely heavily on personalized algorithmic curation, major services already make non-personalized feeds available, at least in Europe, in compliance with the EU’s Digital Services Act, Article 38. An exception is made for any algorithmic ranking actions taken to reduce the visibility of problematic content, though such actions are unlikely to be tied to the specific user in any case.)
  2. The second restriction for minor users is a prohibition on sending them notifications regarding feed activity between midnight and 6 am Eastern Time (see the sketch following this list).
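
To make the two restrictions concrete, here is a minimal Python sketch of how a platform might gate them. It is an illustration only, not anything the Act prescribes: the `User` and `Post` types and flags like `is_covered_minor` and `has_parental_consent` are hypothetical stand-ins for whatever age-assurance and consent machinery a platform actually operates.

```python
from dataclasses import dataclass
from datetime import datetime, time
from zoneinfo import ZoneInfo

EASTERN = ZoneInfo("America/New_York")
CURFEW_START, CURFEW_END = time(0, 0), time(6, 0)  # midnight to 6 am ET

@dataclass
class Post:
    created_at: datetime        # assumed timezone-aware
    engagement_score: float     # stand-in for a personalized ranking signal

@dataclass
class User:
    is_covered_minor: bool              # per the Act's actual-knowledge standard
    has_parental_consent: bool = False  # "verifiable parental consent"

def select_feed(user: User, posts: list[Post]) -> list[Post]:
    """Covered minors without verifiable parental consent get a
    reverse-chronological feed; other users may get personalized ranking."""
    if user.is_covered_minor and not user.has_parental_consent:
        return sorted(posts, key=lambda p: p.created_at, reverse=True)
    return sorted(posts, key=lambda p: p.engagement_score, reverse=True)

def may_send_feed_notification(user: User, now: datetime | None = None) -> bool:
    """Suppress feed-activity notifications to covered minors between
    midnight and 6 am Eastern Time, absent parental consent."""
    if not user.is_covered_minor or user.has_parental_consent:
        return True
    local = (now or datetime.now(EASTERN)).astimezone(EASTERN).time()
    return not (CURFEW_START <= local < CURFEW_END)
```

Note that under the Act parental consent can lift either restriction independently; the sketch simplifies by treating a single consent flag as covering both.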

Parental consent

Either or both of the above restrictions can be lifted with “verifiable parental consent.” As with age verification, the acceptable operationalization of this is again left to the NY attorney general’s office to define.

Enforcement

The attorney general can bring actions under the law on behalf of the people of New York State to obtain damages and restitution, to require the disgorgement of any profits and the destruction of illegally acquired data and of algorithms trained on such data, and to seek penalties of up to $5,000 per violation. Individuals will be able to make complaints about platforms to the attorney general via a website.

The Act and Freedom of Expression

Despite the aforementioned focus on addiction rather than content, the SAFE for Kids Act may run afoul of the First Amendment by interfering with platforms’ expressive rights to arrange their content as they see fit (though this is not indisputable) and with minor users’ rights to consume that arrangement of content. While the impact of the law appears less dramatic than that of recently enacted parental consent laws in Florida and Ohio, several of the key points in the district court’s decision enjoining the Ohio law could apply here as well, such as the findings that the law was intrinsically relevant to expression and content-based due to its applicability solely to social media platforms, and that the right of minors to access content is constitutionally protected. New York State may have some advantages over those laws in this regard by tailoring the law somewhat more narrowly and making it less vague. The question of whether the state can show a compelling interest is discussed more below.

Operational Challenges

Defining “commercially reasonable and technically feasible methods” of age assurance and “verifiable parental consent” is left to the attorney general’s office, following a number of other laws that hand off to regulatory bodies the difficult trade-offs between accuracy, privacy, equity, convenience, and cost connected to age assurance. Regarding age assurance, the Act itself stipulates a number of considerations for the attorney general to take into account, and requires that several acceptable methods be approved. As the law is to come into effect 180 days after the attorney general issues its rules, this process may cause significant delay, given the complexity of the task.

What’s the Harm?

No doubt, social media platforms use notifications and algorithmic feeds to keep users (of all ages) engaged. After all, that is their basic business model. And it may be the case that some people (including minors) use social media “too much” by their own standards or by those of their friends and family. But it is far from clear that social media in general, much less these particular features, causes “addiction” according to a robust, let alone clinical, definition, or that the courts would deem preventing it a “compelling interest” justifying limits on freedom of expression.
We do have evidence from an academic study and leaked internal research that simple reverse-chronological feeds on Facebook and Instagram lead to reductions in usage of those platforms, and perhaps that is good enough for the supporters of the New York law. The question is: where is the line between addictive and merely appealing? All makers of consumer goods and services try to make them as appealing as possible, but consumer protection rules that place limits on this are typically reserved for cases of demonstrable and direct physical harm, which is arguably absent here.
* * *
While this law may well be subject to legal challenges, it could also present an opportunity to creatively consider how news and recommendation feeds can be ranked if not according to the demographics or engagement history of the user. Some platforms already incorporate quality scores: how might these be further developed or enhanced? There may be ways to select from broadly popular content to move beyond what one researcher described as “the lowest common denominator of human culture.” Perhaps new algorithms might actually improve society, as entrants into this current competition hope. Bluesky’s open approach to feed algorithms could also be a promising model, substituting user choice for platform-determined personalization. One such direction is sketched below.
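
As a thought experiment, here is a minimal sketch of a feed ranker that consults no user data at all: posts are ordered by a platform-wide quality score decayed by age. The `quality_score` field is an assumed content-level signal (for example, from integrity or editorial systems), not any platform’s real API.

```python
import math
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class Post:
    created_at: datetime     # assumed UTC-aware
    quality_score: float     # hypothetical 0-1 content-quality signal

def non_personalized_rank(posts: list[Post],
                          half_life_hours: float = 12.0,
                          now: datetime | None = None) -> list[Post]:
    """Rank by quality decayed by age; the ordering is identical for every
    viewer, so no demographics or engagement history are consulted."""
    now = now or datetime.now(timezone.utc)

    def score(p: Post) -> float:
        age_hours = (now - p.created_at).total_seconds() / 3600
        # Exponential decay: a post loses half its score every half-life.
        return p.quality_score * math.exp(-math.log(2) * age_hours / half_life_hours)

    return sorted(posts, key=score, reverse=True)
```

A feed like this would presumably be permissible for minors under the Act, since its ranking depends only on the content itself and not on the user’s data.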
Many of us find ourselves (doom)scrolling on social media without truly enjoying it, despite being “engaged” according to standard platform metrics. Who knows, maybe this law could even spur platforms to develop ways to make their feeds more enduringly appealing than engagement-based personalization does today.