Oversight Board Overturns Meta’s Original Decision in “Post in Polish Targeting Trans People” Case

The Oversight Board has overturned Meta’s original decision to leave up a Facebook post in which a user targeted transgender people with violent speech advocating that members of this group commit suicide. The Board finds the post violated both the Hate Speech and Suicide and Self-Injury Community Standards. However, the fundamental issue in this case is not with the policies but with their enforcement. Meta’s repeated failure to take the correct enforcement action, despite multiple signals about the post’s harmful content, leads the Board to conclude the company is not living up to the ideals it has articulated on LGBTQIA+ safety. The Board urges Meta to close enforcement gaps, including by improving internal guidance to reviewers.
About the Case
In April 2023, a Facebook user in Poland posted an image of a striped curtain in the blue, pink and white colors of the transgender flag, with text in Polish stating, “New technology … Curtains that hang themselves,” and above that, “spring cleaning <3.” The user’s biography includes the description, “I am a transphobe.” The post received fewer than 50 reactions.
Between April and May 2023, 11 different users reported the post a total of 12 times. Only two of the 12 reports were prioritized for human review by Meta’s automated systems; the remainder were closed automatically. The two reports sent for human review, for potentially violating Facebook’s Suicide and Self-Injury Community Standard, were assessed as non-violating. None of the reports based on Hate Speech were sent for human review.
Three users then appealed Meta’s decision to leave up the Facebook post, with one appeal resulting in a human reviewer upholding the original decision under the Suicide and Self-Injury Community Standard. As before, the other appeals, made under the Hate Speech Community Standard, were not sent for human review. Finally, one of the users who originally reported the content appealed to the Board. As a result of the Board selecting this case, Meta determined the post did violate both its Hate Speech and Suicide and Self-Injury policies and removed it from Facebook. Additionally, the company disabled the account of the user who posted the content for several previous violations.
Key Findings
The Board finds the content violated Meta’s Hate Speech policy because it includes “violent speech” in the form of a call for a protected-characteristic group’s death by suicide. The post, which advocates for suicide among transgender people, created an atmosphere of intimidation and exclusion, and could have contributed to physical harm. Considering the nature of the text and image, the post also exacerbated the mental-health crisis being experienced by the transgender community. A recent report by the Gay and Lesbian Alliance Against Defamation (GLAAD) notes “the sheer traumatic psychological impact of being relentlessly exposed to slurs and hateful conduct” online. The Board finds additional support for its conclusion in the broader context of online and offline harms the LGBTQIA+ community is facing in Poland, including attacks and political rhetoric by influential government and public figures.
The Board is concerned that Meta’s human reviewers did not pick up on contextual clues. The post’s reference to the elevated risk of suicide (“curtains that hang themselves”) and its support for the group’s death (“spring cleaning”) were clear violations of the Hate Speech Community Standard, while the content creator’s self-identification as a transphobe would, on its own, amount to another violation. The Board urges Meta to improve the accuracy of hate speech enforcement towards LGBTQIA+ people, especially when posts include images and text that require context to interpret. In this case, the somewhat-coded references to suicide, in conjunction with the visual depiction of a protected group (the transgender flag), took the form of “malign creativity”: bad actors developing novel means of targeting the LGBTQIA+ community through posts and memes that they defend as “humorous or satirical” but that actually constitute hate or harassment.
Additionally, the Board is troubled by Meta’s statement that the human reviewers’ failure to remove the content aligned with a strict application of its internal guidelines. This indicates that Meta’s internal guidance inadequately captures how text and image can interact to represent a group defined by the gender identity of its members.
While the post also clearly violated Facebook’s Suicide and Self-Injury Community Standard, the Board finds this policy should more clearly prohibit content promoting suicide aimed at an identifiable group of people, rather than only at an individual member of that group.
In this case, Meta’s automated review prioritization systems significantly affected enforcement, including how the company deals with multiple reports on the same piece of content. Meta monitors and deduplicates (removes duplicate) reports “to ensure consistency in reviewer decisions and enforcement actions.” Other reasons given for the automatic closing of reports included the content’s low severity and low virality scores (virality reflecting the number of views the content has accumulated), which meant the post was not prioritized for human review. The Board believes the user’s biography could have been considered as one relevant signal when determining severity scores.
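
To make the mechanics above concrete, the following is a minimal sketch, in Python, of how a report-prioritization pipeline of the kind described could deduplicate reports and decide between human review and automatic closure. Every name, field, weight, and threshold here (Report, ContentSignals, priority_score, REVIEW_THRESHOLD) is a hypothetical assumption for illustration; the decision does not disclose Meta’s actual implementation.

```python
# Hypothetical sketch of a report-prioritization pipeline like the one the
# decision describes. All names, fields, weights, and thresholds are
# illustrative assumptions, not Meta's actual implementation.
from dataclasses import dataclass


@dataclass
class Report:
    content_id: str
    policy: str  # e.g., "hate_speech" or "suicide_self_injury"


@dataclass
class ContentSignals:
    severity: float        # modeled harm severity, normalized to 0.0-1.0
    virality: float        # accumulated views, normalized to 0.0-1.0
    author_bio_flag: bool  # e.g., a hostile self-description in the biography


REVIEW_THRESHOLD = 0.5  # illustrative cutoff for routing to human review


def deduplicate(reports: list[Report]) -> list[Report]:
    """Collapse multiple reports on the same content/policy pair so a
    single reviewer decision governs all of them."""
    seen: set[tuple[str, str]] = set()
    unique = []
    for r in reports:
        key = (r.content_id, r.policy)
        if key not in seen:
            seen.add(key)
            unique.append(r)
    return unique


def priority_score(signals: ContentSignals) -> float:
    """Combine severity and virality; per the Board's suggestion, a
    hostile author biography could raise the severity signal."""
    severity = min(1.0, signals.severity + (0.2 if signals.author_bio_flag else 0.0))
    return 0.7 * severity + 0.3 * signals.virality


def route(reports: list[Report], signals: ContentSignals) -> dict[str, list[Report]]:
    """Send high-priority deduplicated reports to human review and
    auto-close the rest (the failure mode at issue in this case)."""
    queues: dict[str, list[Report]] = {"human_review": [], "auto_closed": []}
    for report in deduplicate(reports):
        if priority_score(signals) >= REVIEW_THRESHOLD:
            queues["human_review"].append(report)
        else:
            queues["auto_closed"].append(report)
    return queues
```

On this sketch’s assumptions, the Board’s point would translate to a signal like the author’s self-described transphobia raising the severity input, pushing reports over the review threshold instead of letting low-virality content be auto-closed.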
The Board believes that Meta should invest more in the development of classifiers that identify potentially violating content impacting the LGBTQIA+ community and enhance training for human reviewers on gender identity-related harms.
The Oversight Board’s Decision
The Oversight Board overturns Meta’s original decision to leave up the content.
The Board recommends that Meta:
  • Clarify on its Suicide and Self-Injury page that the policy forbids content promoting or encouraging suicide aimed at an identifiable group of people.
  • Modify the internal guidance it gives to at-scale reviewers to ensure flag-based visual depictions of gender identity that do not contain a human figure are understood as representations of a group defined by the gender identity of its members.
For Further Information
To read the full decision, click here.
To read a synopsis of public comments for this case, please click the attachment below.
