Beyond engagement-based ranking: The case of Pinterest

A key criticism of social media companies is that they rank and recommend content based on ‘clicks’, even when it’s harmful, divisive or poor quality.
But how ELSE could companies rank content? And if the ranking changed, would the companies lose the attention of users who are the key to their business model?
That’s where Pinterest comes in. In a recent blog post, the company explains how it combines other signals with engagement-based ranking with the aim of offering users a safer feed. “It is widely known that optimizing purely for user engagement can surface content that is low-quality (e.g., “clickbait”), or even harmful,” wrote Leif Sigerson, Pinterest’s Senior Scientist, and Wendy Matheny, its Senior Lead Public Policy Manager. “Pinterest is proud to have partnered with experts from UC Berkeley and the Integrity Institute to organize the Field Guide.”
The ‘Field Guide’ is titled ‘What We Know About Using Non-Engagement Signals in Content Ranking.’ It’s the product of a day-long workshop in which a group of technologists and academics came up with 19 alternatives to purely engagement-based ranking and then pressure-tested their applicability across platforms. The paper acknowledges that engagement-based ranking is effective at increasing user retention, but argues that platforms can retain users more safely: by incorporating other signals, asking users what they want to see, and assessing content for quality.
“We’re passionate about Non-Engagement Signals,” write Sigerson and Matheny. “We rely on them to benefit our users and our business, and we think the internet would be a better place if all content ranking platforms could more easily apply them.” They also point to how helpful it was to explore a range of options with an interdisciplinary group drawn from inside and outside tech companies.
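To make this concrete, here is a minimal sketch of what a blended “value model” could look like. It is an illustration of the general approach the Field Guide describes, not Pinterest’s actual system; the signal names, weights and penalty are hypothetical assumptions.

```python
# Illustrative sketch only: a hypothetical "value model" that blends
# engagement predictions with non-engagement signals (surveys, quality,
# safety). Signal names and weights are assumptions, not Pinterest's system.

from dataclasses import dataclass

@dataclass
class CandidateSignals:
    p_click: float            # predicted probability the user clicks (engagement)
    p_save: float             # predicted probability the user saves the item (engagement)
    p_survey_positive: float  # predicted "this was worth my time" survey response
    quality_score: float      # output of a content-quality classifier, 0..1
    p_harmful: float          # output of a safety classifier, 0..1

# Hypothetical weights; in practice these would be tuned against long-term
# retention, survey outcomes and safety targets.
WEIGHTS = {
    "p_click": 0.3,
    "p_save": 0.3,
    "p_survey_positive": 0.25,
    "quality_score": 0.15,
}
HARM_PENALTY = 2.0  # strong downweight for likely-harmful content

def ranking_value(s: CandidateSignals) -> float:
    """Combine engagement and non-engagement signals into one ranking score."""
    value = (
        WEIGHTS["p_click"] * s.p_click
        + WEIGHTS["p_save"] * s.p_save
        + WEIGHTS["p_survey_positive"] * s.p_survey_positive
        + WEIGHTS["quality_score"] * s.quality_score
    )
    return value - HARM_PENALTY * s.p_harmful

# Example: a clickbait-y candidate vs. a lower-click but higher-quality one.
clickbait = CandidateSignals(p_click=0.9, p_save=0.1, p_survey_positive=0.2,
                             quality_score=0.3, p_harmful=0.4)
quality_pin = CandidateSignals(p_click=0.5, p_save=0.4, p_survey_positive=0.7,
                               quality_score=0.8, p_harmful=0.02)
assert ranking_value(quality_pin) > ranking_value(clickbait)
```

A ranker that looked only at predicted clicks would put the clickbait candidate first; adding the survey, quality and safety terms flips the ordering, which is the kind of trade-off between short-term engagement and user wellbeing the Field Guide is weighing.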

Tuning the AI towards positivity

While less popular internationally than Meta or TikTok, Pinterest has been among the tech companies accused of contributing to serious harm, including suicide, by prioritizing profits over safety. In June 2022 Pinterest publicly apologized for allowing harmful content related to self-harm, depression and suicide to flood the feed of 14-year-old Londoner Molly Russell before she took her own life.
In an interview earlier this year, CEO Bill Ready acknowledged how AI-driven content algorithms exploit our brain stem’s attraction to fear, anger, envy and greed. He says that to counter this, Pinterest is “tuning the AI specifically for positivity — not just for maximising view time, but also for maximising emotional wellbeing, for making people feel better, not worse.” He says the platform also captures and seeks out diverse signals that offer more insight into user preferences beyond what they click on, including in-app surveys and other user behaviors, such as what users save.
Ready appears to understand that many of the harms people experience on social media are the result not of ‘bad people’ but of bad design.
“Positivity for social media doesn’t have to just be charitable, it can be a good business model,” says Pinterest CEO Bill Ready.
In the same interview, Ready notes that earlier this year the platform moved to private-by-default accounts for users under 18 and private-only accounts for users under 16. While he admits the change reduced engagement and revenue in the short term, he says, “We were willing to make that trade because we felt that, in the long term, not only was it morally the right thing to do [but we can] build the business on safety — it can be a real differentiator for the business.”
Pinterest is a founding signatory to the Inspired Internet Pledge, aimed particularly at making the internet healthier for young people. The pledge, whose signatories include TikTok while Meta is noticeably absent, sets out three principles:
  1. Tune for emotional wellbeing in the way products, services, policies and advertising are built and designed, for healthier experiences on and offline.
  2. Listen to and act on insights from people who have experienced harm online to inform the business.
  3. Share lessons collaboratively, including best practices, key research findings, and creative solutions.
This hits many of the points advocates of safety-by-design are asking for, but what do Ready’s investors think? “For our broader investor community, we are showing that positivity for social media doesn’t have to just be charitable, it can be a good business model. And that can actually change an industry. I would love that to be the outcome of our work,” he says.
Perhaps only when there is enough transparency to access and analyze user experiences across multiple platforms, including the influence of recommender algorithms and other design choices, will we know for sure whether that bet pays off.
Lena Slachmuijlder is Executive Director of Digital Peacebuilding at Search for Common Ground and Co-Chair of the Council on Tech and Social Cohesion.