Instagram also hides search terms—but only if the term or phrase itself is promoting or encouraging self-harm, says Tara Hopkins, head of EMEA public policy at Instagram. “For other search terms related to suicide/self-harm that aren’t inherently violating, we show a message of support before showing any results.” The company declined to share how many search terms were blocked.
Meta, which owns Instagram, says it is juggling concerns about child safety with young people’s free expression. The company admitted that two posts seen by Molly and shown to the court would have violated Instagram’s policies at the time. But Elizabeth Lagone, head of health and well-being policy at Meta, told last week’s inquest that it is “important to give people that voice” if they are struggling with suicidal thoughts. When the Russell family lawyer, Oliver Sanders, asked Lagone whether she agreed that the content viewed by Molly and seen by the court was “not safe,” Lagone responded: “I think it is safe for people to be able to express themselves.”
These comments embody what researchers say are major differences between the two platforms. “Pinterest is much more concerned with being decisive, being clear, and de-platforming content that does not meet their standards,” says Samuel Woolley, program director for the propaganda research lab at the University of Texas, Austin. “Instagram and Facebook … tend to be much more concerned with running up against free speech.”
Pinterest has not always operated like this. Hoffman told the inquest that Pinterest’s guidance used to be “when in doubt, lean toward … lighter content moderation.” But Molly’s death in 2017 coincided with the fallout from the 2016 US presidential election, when Pinterest was implicated in spreading Russian propaganda. Around that time, Pinterest started to ban entire topics that didn’t fit with the platform’s mission, such as vaccines or conspiracy theories.
That stands in sharp contrast to Instagram. “Meta platforms, including Instagram, are guided by the dictum of wanting to exist as infrastructural information tools [like] the telephone or [telecoms company] AT&T, rather than as social media companies,” says Woolley. Facebook should not be “the arbiter of truth,” Meta founder and CEO Mark Zuckerberg argued in 2020.
The inquest also highlighted differences between how transparent the two platforms were willing to be. “Pinterest helpfully provided material about Molly’s activities on Pinterest in one go, including not just pins that Molly had saved but also pins that she [clicked] on and scrolled over,” says Varney. Meta never gave the court that level of detail, and much of the information the company did share was redacted, she adds. For example, the company disclosed that in the six months before her death, Molly was recommended 30 accounts with names that referred to sad or depressing themes. Yet the actual names of those accounts were redacted, with the platform citing the privacy of its users.
Varney agrees that both platforms have made improvements since 2017. Pinterest results for self-harm search terms do not contain the same level of graphic material as they did five years ago, she says. But Instagram’s changes have been too little, too late, she claims, adding that Meta did not prohibit graphic images of self-harm and suicide until 2019.