F.T.C. Study Finds ‘Vast Surveillance’ of Social Media Users

The Federal Trade Commission building in Washington. The F.T.C. said it started the study nearly four years ago to look into the business practices of some of the biggest online platforms.
The Federal Trade Commission said on Thursday it found that several social media and streaming services engaged in a “vast surveillance” of consumers, including minors, collecting and sharing more personal information than most users realized.
The findings come from a study of how nine companies — including Meta, YouTube and TikTok — collected and used consumer data. The sites, which mostly offer free services, profited from the data by feeding it into advertising that targeted specific users by demographic, according to the report. The companies also failed to adequately protect users, especially children and teens.
The F.T.C. said it began its study nearly four years ago to offer the first holistic look into the opaque business practices of some of the biggest online platforms that have created multibillion-dollar ad businesses using consumer data. The agency said the report showed the need for federal privacy legislation and restrictions on how companies collect and use data.
“Surveillance practices can endanger people’s privacy, threaten their freedoms, and expose them to a host of harms, from identity theft to stalking,” said Lina Khan, the F.T.C.’s chair, in a statement.
Tech giants are under intense scrutiny for privacy abuses and have in recent years been blamed in part for a mental health crisis among children and young people, which some social scientists and the surgeon general have linked to the rampant use of social media and smartphones. But despite multiple proposals in Congress for stricter privacy and children’s online safety protections, nearly all legislative attempts to regulate Big Tech have failed.
Efforts by the companies to police themselves also haven’t worked, the F.T.C. concluded in its report. “Self-regulation has been a failure,” it added.
Google, which owns YouTube, “has the strictest privacy policy in our industry — we never sell people’s personal information and we don’t use sensitive information to serve ads,” said José Castañeda, a spokesman for Google. He added, “We prohibit ad personalization for users under 18 and we don’t personalize ads to anyone watching ‘made for kid content’ on YouTube.”
Discord’s head of U.S. and Canadian public policy, Kate Sheerin, said in a statement that the F.T.C.’s report “lumps very different models into one bucket and paints a broad brush.” She added that Discord does not run a formal digital advertising service.
TikTok and Meta, which owns WhatsApp, Messenger and Facebook, did not immediately respond to requests for comment.
In December 2020, the agency opened its inquiry into the nine companies that operate 13 platforms. The F.T.C. requested data from each company for operations between 2019 and 2020, and then studied how the companies had collected, used and retained that data.
Included in the study were the streaming platform Twitch, which is owned by Amazon, the messaging service Discord, the photo- and video-sharing app Snapchat, and the message board Reddit. Twitter, now renamed X, also provided data.
The study did not disclose company-by-company findings. Twitch, Snap, Reddit and X did not immediately respond to requests for comment.
Companies have argued that they have tightened their data collection policies since the study was conducted. Earlier this week, Meta announced that the accounts of Instagram users younger than 18 would be made private by default in the coming weeks, meaning that only followers approved by an account holder may see their posts.
The F.T.C. found that the companies voraciously consumed data about users, and often bought information about people who weren’t users through data brokers. They also gathered information from accounts linked to other services.
Most of the companies collected users’ age, gender and spoken language. Many platforms also obtained information on education, income and marital status. The companies didn’t give users easy ways to opt out of data collection and often retained sensitive information much longer than consumers would expect, the agency said.
The companies used data to create profiles on users — often merging the information they gathered with information on habits collected on other sites — to serve up ads.
The agency also found that many of the sites claimed to restrict access for users under the age of 13, yet many children remained on the platforms. Teens were also treated like adults on many of the apps, subject to the same data collection practices.
Many of the companies couldn’t tell the F.T.C. how much data they were collecting, according to the study.
The F.T.C. last year proposed changes to strengthen child privacy regulations, and lawmakers are seeking to extend child privacy protections to users under 18. In 2022, Ms. Khan opened a regulatory effort to create rules for companies that show advertising based on users’ browsing or search history.
The agency has previously filed complaints against several tech companies for privacy violations, and in late 2022 reached a $520 million settlement with Epic Games for allegedly violating a child privacy law and deceiving consumers into unwanted charges. That same year, the F.T.C. fined Twitter $150 million for using data collected for security purposes to target advertising.