Instagram revealed plans last October to limit content for teen accounts based on 13+ movie ratings in countries including Australia, Canada, the UK, and the US. On Thursday, the company announced that these guidelines now apply worldwide. The move follows court rulings last month in New Mexico and Los Angeles holding Meta accountable for harming teens.
The enforcement aims to reduce content on Instagram featuring extreme violence, sexual nudity, and drug use. The platform will also stop recommending posts with strong language, risky stunts, and marijuana paraphernalia.
Instagram is also introducing a new setting called “Limited Content” that applies stricter filters and prevents teens from seeing or interacting with certain comments.
“Teens might occasionally encounter suggestive content on Instagram similar to a 13+ movie, but we strive to minimize these instances. We understand no system is faultless and are committed to improving,” the company mentioned in a blog post.
Meta previously marketed these restrictions as inspired by PG-13, but the Motion Picture Association demanded that it stop using the term, stating that the movie rating system isn’t comparable to social media content.
Meta appears to have shifted away from this branding. In its recent blog post, it acknowledged the difference between movies and social media, suggesting the settings are akin to an “Instagram equivalent” of a teen-appropriate movie rating.
Meta has faced scrutiny for prioritizing growth over teen mental health and responded with new controls to reduce potential harm. Recent efforts include notifying parents if teens search for self-harm content, enhancing parental controls, and pausing teen access to AI characters pending updates.
Court documents reveal that Meta delayed implementing safety features, such as blurring explicit images, despite knowing about the problem. Expanding content restrictions for teens worldwide may be a preemptive move, as Meta could face further scrutiny following the New Mexico and Los Angeles cases.
