
Meta’s Thrive Mental Health Initiative to Fight Self-Harm Content

Social media platforms Snapchat, Meta, and TikTok have launched the Thrive Mental Health Initiative. According to TechRadar, the initiative was designed in partnership with the Mental Health Coalition.

Thrive aims to flag harmful content depicting self-harm or suicide and keep it from spreading across social media platforms.

Sharing Signals

Through the Mental Health Coalition partnership, Meta, TikTok, and Snapchat will share signals about violating self-harm content with one another. This will enable each platform to investigate and take action when the same or similar content is posted on its own applications.

In a blog post on its website, Meta acknowledged that self-harm content isn’t limited to its own platforms. A Meta spokesperson described the Thrive program as a database accessible to all participating companies. Meta said that any self-harm content found on its platforms will be removed and flagged in the database so that other companies can take action.

“We’re prioritizing this content because of its propensity to spread across different platforms quickly. These initial signals represent content only, and will not include identifiable information about any accounts or individuals,” wrote Antigone Davis, Meta’s Global Head of Safety, in the post.
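
Meta has not published Thrive’s technical internals, but its description of content-only signals suggests something like a shared fingerprint database: the platform that first removes a violating post contributes an opaque signal, such as a hash of the content, that the other platforms can check new uploads against. The Python sketch below is purely illustrative; it assumes the signals are cryptographic hashes, and every name in it (SharedSignalDatabase, flag, is_flagged) is hypothetical rather than Meta’s actual API.

```python
import hashlib

# Illustrative sketch of cross-platform signal sharing, based on the
# article's description of Thrive: signals "represent content only"
# and carry no account or user information. All names here are
# hypothetical; this is not Meta's actual implementation.

class SharedSignalDatabase:
    """Shared store of opaque fingerprints for violating content."""

    def __init__(self) -> None:
        self._signals: set[str] = set()

    @staticmethod
    def fingerprint(content: bytes) -> str:
        # Hashing means only an opaque signal is shared, never the
        # content itself or anything identifying the poster.
        return hashlib.sha256(content).hexdigest()

    def flag(self, content: bytes) -> None:
        """Called by the platform that first removes the content."""
        self._signals.add(self.fingerprint(content))

    def is_flagged(self, content: bytes) -> bool:
        """Other platforms check new uploads against shared signals."""
        return self.fingerprint(content) in self._signals


# Platform A removes a violating post and flags it; platform B can
# then detect the identical content if it is re-uploaded there.
db = SharedSignalDatabase()
db.flag(b"<removed violating post bytes>")
print(db.is_flagged(b"<removed violating post bytes>"))  # True
```

Note that exact hashes like these only catch identical copies; matching the “similar” content the article mentions would require perceptual hashing or other fuzzy matching, which this sketch omits.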

The social media giant said it will use its own technology alongside the Lantern program to share signals about harmful content securely on the Thrive platform. Lantern is a Tech Coalition program designed to make technology safer for children; its participants include tech giants such as Google, Apple, OpenAI, and Discord.

Legal Actions

Snapchat, Meta, TikTok, and other social media platforms have long been criticized for not doing enough to moderate the content that teens consume online, including images and videos of self-harm.

Parents and communities have previously taken legal action against the three platforms now partnering under Thrive, alleging that they allowed content that led to suicides. In 2021, internal research leaked to the public revealed that Meta knew its Instagram platform could have harmful effects on teenage girls.

Studies have suggested that teens who self-harm are particularly active on social media, and that increased social media use among minors is associated with higher rates of suicidal ideation and depression.

A Welcome Move

Meta’s mental health initiative has been welcomed by tech safety advocates. While acknowledging Thrive’s good intentions, technology watchdogs note that the program looks like the kind of safety measure companies adopt when pressured by advocates and lawmakers.

“We are glad to see these companies working together to remove the types of content associated with self-harm and suicide. Without proper regulatory guardrails, the jury is out on whether this will have a significant impact on the harms that kids and teens are facing online,” said Daniel Weiss, Chief Advocacy Officer at Common Sense Media.

Earlier this year, Meta announced that it would be removing and limiting sensitive and inappropriate content from teenagers’ feeds. The social media giant also said it was planning to hide terms and search results relating to eating disorders, suicide, and self-harm from all users.

On Thursday, September 12, Meta reported that it pulled down 12 million pieces of content featuring self-harm and suicide from Instagram and Facebook between April and June 2024. The company hopes that Thrive will help keep graphic content off the three social platforms participating in the program.

Diane Hicks