Instagram makes teen accounts private by default

The new experience for teenagers on Instagram. (Meta)
Programming note: The first edition of the week is landing on Tuesday morning to accommodate some news. A second edition will go out to paid members this afternoon covering the Snap Partner Summit in Los Angeles, which I am attending.
I.
Instagram announced a sweeping overhaul of its privacy and safety features for teenagers on Tuesday, making accounts for tens of millions of users under 16 private by default and requiring that they get parental permission to change many settings. Younger users will also be opted into the most restrictive settings for messaging and content recommendations, and will start receiving a prompt encouraging them to close the app after using Instagram for an hour each day.
The full set of changes, which comes amid mounting global concern that social media is detrimental to the mental health and safety of young people, represents the most significant voluntary step an American social media company has taken to limit teens’ access to an app. But Meta stopped short of some of the changes that lawmakers around the United States have called for, including requiring parental permission to create accounts and placing more limitations around recommendation algorithms.
For that reason, it’s unclear to what degree Instagram’s new experience for teenagers will satisfy regulators — or begin to calm the broader societal backlash against social products used by young people. But Meta’s move will likely create significant new pressure on TikTok, Snapchat, and other social apps popular among young people to adopt similar measures.
“Our north star is really listening to parents,” said Naomi Gleit, head of product at Meta, in an interview with Platformer. “They're concerned about three areas: One is who can contact their kids. Two is what content their kids are seeing. And three is how much time they're spending online. And so teen accounts are designed to address those three concerns.”
Teenagers who create new accounts starting today will be opted in to the new experience. It will come to teens with existing accounts in the United States, Canada, the United Kingdom, and Australia in 60 days, and is expected to roll out globally beginning in January.
The changes include:
  • Placing all users who are under 16 into private accounts. Teens will have to accept new followers in order to interact with them or make their content visible. (Users who are 16 or older will not have their accounts made private automatically by Meta.)
  • Creating new parental supervision tools that let parents and guardians manage their children’s Instagram settings and see who their teens are messaging. (Parents cannot view the contents of those messages.)
  • Making it so that by default, teens can only be messaged by people that they follow or have already messaged. They also will not be able to be tagged or mentioned by accounts that they do not follow.
  • Placing teens into more restrictive content settings, limiting the types of recommendations they get from accounts they do not follow in Reels or Instagram’s explore page. Meta will also filter offensive words and phrases from their comments and direct messages.
The new Explore page for teens. (Meta)
  • Showing teens a new “daily limit” prompt after they have used Instagram for 60 minutes.
  • Introducing “sleep mode,” which will automatically turn off Instagram notifications for teens between 10PM and 7AM.
  • Asking teens to choose from more than 30 safe topics they are interested in so that Instagram can show them more of those on Reels and in in-feed recommendations. (“They can basically go to Explore and see a whole page of cats, if they want to,” Gleit said.)
Meta also said it is developing new age verification requirements and building new automated systems to find underage accounts and place them in more restrictive settings.
To change those settings, teens will have to designate a parent or guardian’s Instagram account and allow an adult to supervise their settings. In practice, Meta will have no idea whether the adult account linked to a teen’s is actually their parent or guardian. And it fully expects some teens to try to circumvent the requirement.
“Teens are very creative,” Gleit said. “So they're obviously going to find workarounds.”
Still, she said, the company is prepared for the more obvious hacks, and plans to take various steps to prevent evasion.
II.
The new teen experience comes just under a year after 41 states and the District of Columbia sued Meta, alleging that Instagram is addictive and harmful — itself the product of a two-year investigation. The investigation began after revelations from documents leaked by Meta whistleblower Frances Haugen, particularly some internal research showing that a minority of teenage girls reported a negative effect on their mental health after using Instagram.
The question now is whether the changes announced today will stave off more drastic action from regulators around the world. In the United States, Congress has failed to pass a single law that would place new child safety requirements on platforms. (The Kids Online Safety Act, which passed the Senate in July, has stalled in the House of Representatives.) A growing number of states have passed digital child safety laws of their own, but for the most part they have been blocked from being implemented on First Amendment grounds.
When Meta was sued last year, the company publicly pointed to 30 features it had introduced in an effort to make its apps safer for young people. Privately, executives complained that their company had been caught up in a moral panic. Meta CEO Mark Zuckerberg seemed to hint at this last week during a taping of the “Acquired” podcast when he referred to “allegations about the impact of the tech industry or our company which are just not founded in any fact” — and said that the company would do less apologizing going forward.
The changes announced today have been under development for many months. But it’s hard not to read them as a tacit admission that on many key dimensions, Meta’s critics have been right. Letting teenagers create and manage their own accounts has left them vulnerable to bullies and predators, and filling up their feeds with whatever an automated system predicts they will engage with has in many cases been detrimental to their health.
These concerns have been raised internally for years, as Haugen’s leaked documents revealed. But Meta dismissed some of that internal research by saying that the sample size was small, and that it had been taken out of context.
“It is simply not accurate that this research demonstrates Instagram is ‘toxic’ for teen girls,” Pratiti Raychoudhury, then Meta’s head of research, wrote in 2021. “The research actually demonstrated that many teens we heard from feel that using Instagram helps them when they are struggling with the kinds of hard moments and issues teenagers have always faced.”
Of course, with more than 1 billion users on Instagram, both things are true. “Many teens” use Instagram to find community and for positive self-expression — and many other teens get bullied, or fall victim to grooming or sextortion, or experience other harms. The question has always been to what extent Meta itself would address those harms of its own accord, and to what extent it would have to be dragged into doing so by regulators.
Gleit played down the idea that the changes seemed to validate long-standing complaints about child safety on Instagram.
“I would say that in areas like this, there's always more that we can do to give teens and parents more of what they want and less of what they don't want,” she said. “It really is just sort of an evolution of the protections that we've had.”
At the same time, Gleit told me the company believes that the changes will meaningfully reduce usage of the app among teenage users.
“I think that teens might use Instagram less in the short term,” she said. “I think they might use Instagram differently in the short term. But I think that this will be best for teens and parents over the long term. And that's what we're focused on.”
Correction: This post originally said Meta is also developing new technologies to identify and remove users who are 13 and under. Today's changes only apply to 13-17-year-olds.

Governing

Industry

Those good posts

For more good posts every day, follow Casey’s Instagram stories.

Talk to us

Send us tips, comments, questions, and teen safety tips: casey@platformer.news.