Instagram Launches Parental Alerts on Teen Self-Harm Searches
In Focus
- Parents will receive notifications through email, WhatsApp, or text
- Only parents enrolled in Instagram’s parental supervision program will receive the alerts
- Self-harm search alerts will roll out next week in the U.S., U.K., Australia, and Canada
Instagram will begin notifying parents in the coming weeks when their teens repeatedly search for terms related to self-harm or suicide. According to TechCrunch, the Meta platform already blocks users from searching for such content; the new Instagram parental suicide alerts are meant to ensure parents know about their teens’ repeated searches so they can offer support.
Self-Harm Searches Will Trigger an Alert
Searches that include phrases such as ‘self-harm’ and ‘suicide’ will likely trigger an alert. Teen self-harm search notifications will go to parents currently enrolled in Instagram’s parental supervision program. According to the platform, the alerts will be sent through email, WhatsApp, or text.
“Tapping on the notification will open a full-screen message explaining that their teen has repeatedly tried to search Instagram for terms associated with suicide or self-harm within a short period of time,” Instagram noted.
Additionally, Instagram will send an in-app notification with resources to help parents talk with their teens. The platform, which introduced Instagram Live restrictions last year, said search notifications will only be sent if self-harm searches meet a specific threshold.
“Parents will also have the option to view expert resources designed to help them approach potentially sensitive conversations with their teen. We chose a threshold that requires a few searches within a short period of time, while still erring on the side of caution,” the company added.
Meta’s Struggle With Lawsuits
Instagram’s suicide content monitoring alerts arrive as Meta and other tech companies face lawsuits over social media addiction. Meta and Apple are currently fighting court battles over their child safety policies, particularly the use of default end-to-end encryption.
Last week, Instagram head Adam Mosseri faced questioning in the U.S. District Court for the Northern District of California, where Meta is fighting a lawsuit over teen protection lapses. Mosseri was pressed on the delayed rollout of safety features on Instagram, including the nudity filter for private messages to teens.
Meta is facing another lawsuit in Los Angeles County Superior Court, where court documents showed that an internal Meta study found parental controls and supervision had minimal impact on children’s addictive use of social media.
The study also showed that underage users who experienced stressful events were more likely to struggle with regulating their social media use. Meta additionally faces an underage exploitation trial in New Mexico, where it is accused of connecting users to exploitative content.
Alerts Will Extend to AI Conversations
Instagram plans to expand the teen suicide protection feature to AI conversations in the future. In that case, parents will receive alerts when teens engage the app’s AI in discussions related to suicide or self-harm. Meta’s teen suicide prevention tools will roll out next week in the U.S., U.K., Australia, and Canada, with a rollout to other regions to follow later this year.
