In a move to address growing concerns about teen safety, Instagram’s parent company, Meta, is launching a new content control system analogous to the PG-13 movie rating. This update will automatically apply stricter content filters to the accounts of all users under the age of 18, creating a more controlled digital environment for its younger demographic.
The core of this change is a default “13+” setting for all teen accounts. This setting is the standard experience, and teens will need a parent’s permission to switch to anything less restrictive. Meta explained that its goal is to make the teen experience on Instagram feel closer to watching a PG-13 film, a standard widely recognized by parents.
This new “PG-13” version will tighten the platform’s content restrictions significantly. Beyond the existing filters for adult or graphic material, the update will now hide or refuse to recommend posts that feature strong language or potentially dangerous stunts. It will also target content that might promote harmful activities and block searches for sensitive keywords, a measure aimed at preventing teens from actively seeking out inappropriate material.
The timing of this announcement is notable, following a critical report co-led by former Meta engineer Arturo Béjar. The research claimed that two-thirds of Instagram’s safety features were failing to protect young users. Organizations such as the Molly Rose Foundation have echoed these concerns, casting doubt on whether Meta’s new promises will translate into meaningful change without transparent, independent oversight.
Initially launching in several English-speaking countries, including the US and UK, the feature is slated for a worldwide release early next year. While Meta presents this as a robust tool for parents, safety advocates insist that the burden of proof lies with the company to demonstrate that the new filters genuinely safeguard children.
