Meta has begun enforcing stricter Instagram protections for young users in the United Arab Emirates and Saudi Arabia. The update automatically places all users under 18 into protected accounts with additional safety settings that are difficult to change without parental permission.
The update makes a 13+ content rating the default standard. Teenagers now see age-appropriate content by default, and loosening those restrictions requires parental approval. “We hope this update reassures parents that we're working to show teens safe, age-appropriate content on Instagram by default,” the company said. “We're committed to continuously reviewing and improving these guidelines over time.”
Instagram has also expanded its definition of inappropriate content. The platform now blocks sexually explicit and adult material, and restricts access to strong language, risky stunts, and other depictions of dangerous acts. The change reflects a broader industry shift: moderation now extends beyond legal requirements as companies weigh their own reputational and regulatory risks.
A new ‘Limited Content’ mode gives parents even tighter control, including restricting comments entirely. While this may reassure families, it also shifts the platform closer to a supervised digital environment, blurring the line between social media and curated spaces.
The platform is going further by restricting interactions. Teens won’t be able to follow or engage with accounts flagged as inappropriate, and those accounts can’t reach teens either. Search has also been cleaned up, with sensitive terms blocked even when misspelled, alongside tighter AI responses.
With this update, Meta makes these protections mandatory rather than opt-in safety features. The result is stronger security for young users, but it also exposes a fundamental tension between two goals: protecting teenagers online and allowing them the freedom to explore digital spaces.