
Meta’s Thrive Mental Health Initiative to Fight Self-Harm Content

Three social media platforms, Snapchat, Meta, and TikTok, have launched the Thrive Mental Health Initiative. According to TechRadar, the initiative was designed in partnership with the Mental Health Coalition.

Thrive is designed to flag content depicting self-harm or suicide and keep it from spreading across social media platforms.

Sharing Signals

Through the Mental Health Coalition partnership, Meta, TikTok, and Snapchat will share signals about violating self-harm content with one another. This will enable each platform to investigate and take action when the same or similar content is posted on its own apps.

In a blog post on its website, Meta acknowledged that self-harm content isn't limited to its own platforms. A Meta spokesperson described the Thrive program as a database accessible to all participating companies: when self-harm content is found and removed on one platform, it is flagged in the database so that the other companies can take action as well.

“We’re prioritizing this content because of its propensity to spread across different platforms quickly. These initial signals represent content only, and will not include identifiable information about any accounts or individuals,” Antigone Davis, Meta’s Global Head of Safety, wrote in the post.

The social media giant said it will use its technology, alongside the Lantern program, to share signals about harmful content securely on the Thrive platform. Lantern is a Tech Coalition platform designed to make technology safer for children; its members include tech giants such as Google, Apple, OpenAI, and Discord.

Legal Actions

Snapchat, Meta, TikTok, and other social media platforms have long been criticized for not doing enough to moderate the content teens consume online, including images and videos of self-harm.

Previously, parents and communities have taken legal action against the three platforms now partnering under Thrive, alleging that they allowed content that led to suicides. In 2021, leaked internal research revealed that Meta knew its Instagram platform could have harmful effects on teenage girls.

Studies have suggested that teens who self-harm are highly active on social media, and that heavier social media use among minors is associated with increased suicidal ideation and depression in those groups.

A Welcome Move

Meta’s mental health initiative has been welcomed by tech safety advocates. While acknowledging Thrive’s good intentions, technology watchdogs note that the program looks like the kind of safety measure companies adopt when pressured by advocates and lawmakers.

“We are glad to see these companies working together to remove the types of content associated with self-harm and suicide. Without proper regulatory guardrails, the jury is out on whether this will have a significant impact on the harms that kids and teens are facing online,” said Daniel Weiss, Chief Advocacy Officer at Common Sense Media.

Earlier this year, Meta announced that it would be removing and limiting sensitive and inappropriate content from teenagers’ feeds. The social media giant also said it was planning to hide terms and search results relating to eating disorders, suicide, and self-harm from all users.

On Thursday, September 12, Meta reported that it had removed 12 million pieces of content featuring self-harm and suicide from Instagram and Facebook between April and June 2024. The company hopes that Thrive will help keep such graphic content off the three participating platforms.

Diane Hicks