
Meta Introduces “Limited Content” Setting on Instagram, Takes Content Restrictions Global

In Focus

  • The “Limited Content” setting introduces age-appropriate filters on Instagram
  • Meta’s content restrictions are modeled on a 13+ movie rating
  • The social media platform faces global scrutiny over its teen protection practices

Instagram is expanding content restrictions for teen accounts globally. The Meta-owned social media platform revealed plans to restrict content for teen accounts in the U.S., the U.K., Australia, and Canada last year. A report by TechCrunch shows that Instagram teen content restrictions are based on a 13+ movie rating.

Restrictions Keep Teens from Viewing Harmful Content

By introducing the content restrictions globally, Instagram is looking to reduce teen access to harmful content, including posts that show graphic drug use, extreme violence, and sexual nudity. Instagram will also hide or avoid recommending posts that feature risky stunts, strong language, or marijuana.

“Just like you might see some suggestive content or hear some strong language in a movie rated for ages 13+, teens may occasionally see something like that on Instagram, but we’re going to keep doing all we can to keep those instances as rare as possible. We recognise no system is perfect, and we’re committed to improving over time,” the company posted on its website.

The social media platform has also introduced a new setting called “Limited Content”. The Instagram Limited Content setting applies strict filters that block teens from viewing, commenting on, or receiving comments on posts that carry harmful content.

Meta is introducing Instagram teen account content limits globally after facing trial in New Mexico over allegations that it linked underage users to exploitative content. The court recently held the social media company accountable for harming teens.

Meta’s Efforts to Address Teen Mental Health

In recent years, Meta has faced scrutiny for prioritizing product growth over teen mental health. In response, the company has introduced measures to reduce potential harm, including parental alerts on teen self-harm searches on Instagram.

Other measures include new parental controls for its AI experiences and a temporary pause on teen access to AI characters. When the company first introduced teen content restrictions last year, it promoted them as PG-13-inspired limits, a reference to the movie rating system.

The social media giant then received a cease-and-desist letter from the Motion Picture Association (MPA), which insisted that the movie rating system should not be applied to social media content. Meta has since adjusted the way it brands Instagram’s movie-inspired content restrictions.

In its latest blog post, the social media giant acknowledged that “there are differences between movies and social media” and described the ratings as settings that are closer to an “Instagram equivalent” of a teen-appropriate movie.

Impact of the New Mexico Ruling

Following the verdicts in the New Mexico and Los Angeles lawsuits, Meta now faces growing scrutiny over its teen protection practices. Its effort to expand content restrictions for teens globally may therefore be a preemptive move.

In the New Mexico lawsuit, court documents showed that Meta deliberately delayed introducing a feature that automatically blurs explicit images in social media messages, despite being aware of the issue and its impact on teen health.

Michael Hill