Meta's data dance with its Oversight Board

March 1, 2024
(Hakan Nural / Getty Images)
In July 2022, a Facebook user posted a cartoon of Iran’s Supreme Leader, Ayatollah Khamenei, in a group dedicated to Iranian freedom. The caption read “marg bar” (meaning “death to,” or “down with”) Iran’s “filthy leader Khamenei.” Meta took down the post under its incitement to violence policy.
Five months later, the independent, company-funded Oversight Board overruled Meta’s decision and issued a recommendation: “marg bar Khamenei” should be allowed in the context of Iranian protests, and the company should restore any posts containing the phrase that had been wrongfully taken down.
Policy recommendations like this one are the most important tool the three-and-a-half-year-old board has to influence the company that created it. The board hears only a small number of cases each year, and whatever impact might have come from an individual post being removed or restored is blunted by the fact that members take several months to make their rulings. To have any real impact, the board has to use rulings about individual posts to get Meta to change its policies.
But there’s no guarantee Meta will listen. While the board’s rulings on individual cases are binding, Meta has no obligation to implement its policy recommendations.
In the Iran case, Meta told the board that it had implemented the recommendation. But the company “did not provide any evidence to support their claim,” according to a research paper by four board employees published today in the Journal of Online Trust and Safety. This put the board in an awkward but not unusual position: trying to determine whether Meta had in fact implemented its recommendations and, if so, what effect the policy change had. The paper shares lessons the board has learned from years of working with the company, emphasizing the importance of obtaining and properly analyzing data to determine whether Meta is listening.
The board turned to CrowdTangle, a public analytics tool that Meta acquired in 2016 (and that has exasperated the company ever since). Using CrowdTangle, the board found a 30 percent increase in the number of Instagram posts containing the “marg bar” phrase after its decision. (There was no statistically significant change in the number of posts on Facebook.) The board inferred that the increase was due to Meta restoring posts that had been taken down, as the recommendation required.
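The kind of pre/post comparison the board describes can be sketched in a few lines. The daily counts below are invented for illustration; the board’s actual analysis used CrowdTangle exports, not this code.

```python
# Hypothetical sketch: compare daily counts of posts containing a phrase
# before and after a policy decision. All numbers are made up.
from statistics import mean, stdev
from math import sqrt

before = [100, 95, 110, 105, 98, 102, 99]    # daily post counts pre-decision (invented)
after = [130, 128, 135, 125, 132, 129, 131]  # daily post counts post-decision (invented)

# Percent change in the mean daily count
pct_change = (mean(after) - mean(before)) / mean(before) * 100

# Welch's t statistic for the difference in daily means,
# a standard check that the change is not just noise
se = sqrt(stdev(before) ** 2 / len(before) + stdev(after) ** 2 / len(after))
t = (mean(after) - mean(before)) / se

print(f"change in mean daily posts: {pct_change:.1f}%")
print(f"Welch t-statistic: {t:.1f}")
```

With real CrowdTangle data, the same comparison would also need to account for trends and seasonality in posting volume, which is part of why a significant result on Instagram but not Facebook still requires careful interpretation.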
The paper highlights an ongoing tension between Meta and the board. To demonstrate its effectiveness, the board regularly asks the company for data to demonstrate that its recommendations have been implemented. But Meta is not always forthcoming with internal data, and the board sometimes has to turn to CrowdTangle, the Meta Content Library API, or other methods.
“When they set up the board, no one thought we were going to be doing this level of detailed tracking,” said Naomi Shiffman, head of data and implementation at the Oversight Board, in an interview with Platformer.
The board hopes that sharing the lessons it has learned about working with Meta will prove relevant to governments around the world that are trying to regulate Big Tech.
“We’ve done the hard part of getting it wrong or not asking for the right things at first, but eventually refining our approach in a way that's really valuable for other people to look at in terms of the best practices,” said Dan Chaison, an Oversight Board spokesperson.
After Meta has had a chance to review its policy recommendations, Shiffman’s team rates the company’s responses as “comprehensive,” “somewhat comprehensive,” or “not comprehensive.” But getting the data to determine whether Meta is listening to the board can be difficult.
When it can get the relevant data, though, the board argues that its recommendations are working.
In 2021, protesters supporting Russian opposition leader Alexei Navalny were posting on Facebook when one user called another a “cowardly bot.” Meta took down the post under its anti-bullying rules. The Oversight Board eventually overturned the decision, noting that while the takedown was in line with Meta’s bullying and harassment policy, the policy was disproportionately restricting free speech.
The board recommended that when Meta removed a post because of a single word or phrase in a larger comment, the company should tell users what rule they had broken and give them an opportunity to repost the message without the offending word. The company implemented the recommendation. Later, the board followed up to ask for impact data. Meta responded and said that in a 12-week period in 2023, the company notified users regarding more than 100 million violating pieces of content, with over 17 million notifications related to the company’s bullying and harassment policy. These notifications prompted users to delete their posts more than 20 percent of the time.
“In this case study, we see users actually changing their behavior when they're told, ‘Hey, you're violating the bullying and harassment standard here. Do you want to make a different choice?’” Shiffman says. “And they are, they're changing their behavior.”
I asked Shiffman if easier access to data would speed up the Oversight Board’s famously slow processes. But the board primarily uses this kind of data to analyze the impact of its recommendations rather than to aid in its rulings, she said.
“Most of the data in question is used to know whether a recommendation was implemented after a decision or to measure its impact,” Shiffman explained. “While tools like CrowdTangle and the Meta Content Library are used for case development, most of the data described in this paper is retroactive and would be difficult to apply to future decisions. So faster data access would not affect the pace of the board’s case decisions.”
Given that Meta sometimes takes months to supply relevant data to the board, do employees there ever feel like Meta is intentionally dragging its feet?
“I'm sure there are parts of Meta that are always slow-rolling everybody, because it’s easier,” Shiffman said. “But our counterparts at Meta are fighting for us hard, and are not trying to slow roll us. They're navigating the bureaucracy, just like everybody is.”
In response to a request for comment on the paper, a Meta spokesperson referred us to the company’s Quarterly Update, which states: "[T]he Oversight board’s recommendations continue to serve as critical guidance for teams across the company developing and implementing policies, working to mitigate systemic risks and providing greater transparency and accountability."
While the board’s paper identifies cases in which its recommendations have led to better policies on Facebook and Instagram, its suggestion that its approach offers a roadmap for regulators can feel grandiose. Unlike the board, regulators can subpoena the data they’re looking for — and can write rules that carry the force of law. The board may have learned some valuable lessons since 2020, but the burden of proof remains on its members to demonstrate that those lessons have changed Meta in more than marginal ways.

X in court

A critical hearing in X’s case against the Center for Countering Digital Hate kicked off today in San Francisco. To recap, X is suing the nonprofit research organization for publishing its analysis of a spike in hate speech on X following Elon Musk’s acquisition of the company. X accused CCDH of “actively working to assert false and misleading claims about X and actively working to prevent public dialogue.”
CCDH is trying to get the case thrown out under California’s anti-SLAPP law. So far, Senior District Judge Charles R. Breyer has not seemed moved by X’s arguments.
"I can't think of anything more antithetical to the First Amendment than this process of silencing people from publicly disseminating information once it's been published,” he said this morning. He later called X’s argument “one of the most vapid extensions of law that I’ve ever heard,” according to NPR’s Bobby Allyn.
Zoë Schiffer

Cash management 101

Crafting your startup’s cash management strategy? First, divide your funds into three categories: liquid, short-term, and strategic. From there, it’s about optimizing for risk and yield according to those time horizons. But how much should you put in each category?
Learn more about how to create the best cash management strategy for your startup.
  • Mercury is a financial technology company, not a bank. Banking services provided by Choice Financial Group and Evolve Bank & Trust®; Members FDIC.
Platformer has been a Mercury customer since 2020.
On the podcast this week: Kara Swisher stops by to burn Kevin and me in honor of the publication of her buzzy new memoir, Burn Book. Plus, internet policy expert Daphne Keller joins us to talk through this week's oral arguments in the Supreme Court content moderation cases.



  • X added live video to Spaces. Live video on a network that doesn’t believe in content moderation. What could go wrong? (Amrita Khalid / The Verge)

Those good posts

For more good posts every day, follow Casey’s Instagram stories.

Talk to us

Send us tips, comments, questions, and data requested by the Oversight Board.