
Instagram to introduce PG-13-style controls to protect teen users, says Meta

Instagram is introducing a PG-13-style content rating system to give parents greater control over what teenagers see on the platform, Meta has announced.

The change marks one of the company’s most sweeping efforts yet to align social-media content moderation with the kind of age guidance familiar from the cinema. All users under 18 will automatically be placed into a “13+” setting modelled on the US parental guidance film rating, and teens will be able to opt out only with explicit parental consent.

The PG-13 system, created in the United States more than four decades ago, has become shorthand for content considered broadly suitable for teenagers but containing material that may be inappropriate for younger children. Meta said its new approach would mirror that framework online.

“While there are obvious differences between movies and social media, we made these changes so teens’ experience in the 13+ setting feels closer to the Instagram equivalent of watching a PG-13 movie,” Meta said. “We wanted to align our policies with an independent standard parents are already familiar with.”

Instagram’s teen accounts already restrict sexually suggestive or graphic content, as well as the promotion of adult products such as tobacco and alcohol. The new settings go further, tightening filters around strong language, risky stunts, and imagery linked to harmful behaviours, including posts featuring marijuana or drug paraphernalia.

Search results will also be restricted more aggressively. Keywords such as “alcohol” or “gore” — and even common misspellings — will be blocked under the new moderation system.

For UK audiences, the approach is broadly comparable to the 12A cinema classification. Just as films such as Titanic or The Fast and the Furious may feature fleeting nudity or moderate violence but remain accessible to teenagers, the new Instagram rules will not prohibit all instances of partial nudity or stylised aggression.

Meta said the system would launch first in the US, UK, Australia and Canada, before being expanded to Europe and other regions early next year.

The move comes amid growing scrutiny of Meta’s child-safety record and the effectiveness of its moderation tools.

A recent independent review led by Arturo Béjar, a former senior Meta engineer turned whistleblower, concluded that 64% of new safety tools on Instagram were ineffective. Conducted alongside academics from New York University, Northeastern University and the UK’s Molly Rose Foundation, the study found persistent exposure to harmful content among teenage users.

Béjar said: “Kids are not safe on Instagram.”

Meta rejected the findings, insisting that parents already have “robust tools” to manage teenagers’ accounts and monitor activity.

The UK communications regulator Ofcom has also warned that social media companies must adopt a “safety-first approach” under the Online Safety Act, saying platforms that fail to protect children will face enforcement action and potential fines.

Child-safety campaigners welcomed the intent behind the PG-13 system but questioned whether it would deliver meaningful change.

“Time and again Meta’s PR announcements do not result in meaningful safety updates for teens,” said Rowan Ferguson, policy manager at the Molly Rose Foundation. “As our recent report revealed, they still have work to do to protect young people from the most harmful content. These further updates must be judged on their effectiveness — and that requires transparency and independent testing.”

Critics argue that parental controls can be effective only if they are easy to use and clearly communicated to families, while some digital-rights advocates warn that over-blocking could limit teenagers’ access to legitimate health or educational resources.

The rollout of a PG-13-style content standard reflects Meta’s wider strategy to bring its platforms closer to traditional media norms amid rising pressure from governments and watchdogs.

By borrowing a familiar system from the film industry, Instagram hopes to reassure parents that it is taking responsibility for the wellbeing of its youngest users — and to set a benchmark other social platforms may now feel compelled to follow.
