Analysis | How TikTokers think about misinformation

Happy Patriot Day. Gen Z is too young to remember Sept. 11 — so they’ve turned it into a meme. Send news tips and questionable toddler gymnastics mementos to: will.oremus@washpost.com.
Below: A House Republican questions a Google lawyer’s role advising Harris. First:
A new report examines how TikTok users decide what to believe.
TikTok CEO Shou Zi Chew leaves a House Energy and Commerce Committee hearing in Washington in March 2023. (Jabin Botsford/The Washington Post)
As Americans wake up to the aftermath of Tuesday night’s presidential debate, millions will learn about and process the key campaign event at least partly via TikTok.
The legal battle over the video app’s future hasn’t curtailed its reach and influence as a source of news and political information, particularly for young people. The Pew Research Center reported in June that TikTok is an increasingly popular source of news for American adults, particularly those ages 18 to 29. And last week, it reported that support for banning the app continues to fall, with just 32 percent of Americans now wanting it gone.
With a pivotal election approaching, TikTok’s still-growing influence might alarm those who worry about its reliability as an information source. In 2022, researchers at NewsGuard found that one in five videos shown in TikTok search results for certain hot-button topics contained what they deemed misinformation.
But now some news literacy experts say the problems of misinformation and echo chambers on TikTok are more nuanced than they’re often made out to be.
It’s true that TikTok users regularly encounter false, misleading or biased information, said Claire Wardle, a communication professor at Cornell University. But many of them take for granted that some of what they see is likely to be false — and they’ve developed their own ways of navigating that.
Wardle oversaw the research for a new report from the Weber Shandwick Collective, a communications advisory firm, on how TikTok users process news and political information. The report, shared with the Tech Brief ahead of its publication Wednesday, is based on diaries and screen recordings kept over a five-day period by 30 American TikTok users from different backgrounds, as well as interviews with a dozen TikTok creators and a broader, opt-in survey.
While the report’s findings aren’t statistically representative, Wardle said the combination of methods provided a fascinating window into how at least some TikTok users think about the credibility of an app they cite as a primary source of news and opinion. Here are a few of the takeaways:
  • TikTok users don’t believe everything they see. Most of those interviewed reported that they have come to accept that some of what they see on the app will be misleading, fake or otherwise unreliable, and to view it through a skeptical eye. Whether well-founded or not, many expressed confidence in their ability to spot misinformation on the app and suss out the truth.
  • To check facts, some stay on TikTok. When they aren’t sure about a video they just watched, some users said they’ll search Google for more reliable sources. But quite a few reported turning regularly to the video’s comment section for context or debunkings, or searching within TikTok for other videos on the same topic.
  • Many trust TikTok creators’ personal experiences over mainstream media or experts. Wardle recounted a time when she recommended a New York Times article about the causes of the Maui wildfires to one of her classes — and got pushback from students who said they had heard the real story via TikTok from Indigenous Hawaiians on the ground in Maui.
  • They tend to view TikTok as an echo chamber — and they like it that way. While several respondents said they believe TikTok exposes them to a diversity of voices, many reported that they appreciate how its famously personalized “For You Page” insulates them from content they really can’t relate to. “What came through from our respondents is how much they enjoyed getting content that reaffirmed their world view,” Wardle said. Some said they use the app as a source of “talking points” for opinions they were already inclined to hold.
Those insights sound both plausible and, in some cases, worrying, said Hannah Covington, senior director of education content at the nonprofit News Literacy Project, which played no role in the research.
“I’m glad that some of the people surveyed indicated they expect to encounter mis- and disinformation on TikTok, and even expressed confidence in spotting it,” Covington said. “But people are often overconfident in their own ability to fact-check.”
That could be even more of a problem on TikTok than elsewhere, Covington added, because people are especially susceptible to believing falsehoods that reinforce their preconceptions.
Ahead of the November election, both former president Donald Trump and Vice President Kamala Harris have built sizable TikTok followings. But Wardle said authenticity and humor often play better on the platform than polished messaging, meaning that content created by the candidates’ followers is likely to prove more influential than what the candidates post themselves.
That will almost certainly include users’ remixes and commentary on clips from Tuesday night’s debate, she said. “I would not be surprised if, in the debate prep on both sides, they will be thinking about the kinds of lines that might end up being meme-able.”
In response to a request for comment, a spokesperson for TikTok pointed to the company’s recent announcement that it is expanding its efforts to protect the integrity of the platform’s discourse around the 2024 election. Among those efforts is a series of videos aimed at boosting users’ media literacy.

From our notebooks

House Republican questions Google lawyer’s role advising Harris
Republicans are dialing up pressure over Harris’s ties to Silicon Valley, with one top lawmaker calling into question whether a Google lawyer’s role preparing the Democratic nominee for Tuesday’s debate creates an ethical conflict, your co-host Cristiano Lima-Strong reports for the Tech Brief.
House Judiciary Chairman Jim Jordan (R-Ohio) on Tuesday called on the Justice Department to brief the committee on how it is “working to combat potential conflicts of interest and political bias” amid reporting that a lawyer representing Google against the Biden administration in court has also emerged as a prominent adviser for Harris’s presidential campaign.
Lawyer Karen Dunn kicked off Google’s defense in federal court Monday against a DOJ lawsuit alleging that the giant has monopolized digital advertising — just a day before Harris took the stage to debate Trump, a session Dunn helped her prepare for. The dynamic has sparked concern among both Democrats and Republicans critical of the tech giants.
“This apparent conflict of interest raises serious concerns about whether Dunn’s relationship with key figures in the Biden-Harris Administration creates a conflict of interest that could inappropriately bias the Department’s approach,” Jordan wrote in a letter Tuesday.
The Justice Department confirmed receipt of the letter but declined to comment.

Daybook

  • The Center for Democracy and Technology holds an event, “The Future of Speech Online 2024: AI, Elections, & Speech,” next Monday and Tuesday.

Before you log off

If you've ever wondered why most business software sucks, it's for the same reason as this cartoon.
The person responsible for buying the software isn't using it in the way the end users are. pic.twitter.com/TkIw7svALl
That’s all for today — thank you so much for joining us! Make sure to tell others to subscribe to Tech Brief. Get in touch with Cristiano (via email or social media) and Will (via email or social media) for tips, feedback or greetings.