Meta extends Instagram’s Teen Accounts to Facebook and Messenger, introducing strict privacy controls and parental supervision for users under 16.
The update strengthens default privacy, curbs contact from strangers, and promotes healthier social media habits.
On April 8, 2025, Meta launched Teen Accounts for Facebook and Messenger, extending Instagram’s safety model to users under 16 in the US, UK, Australia, and Canada, with a global rollout planned by December 2025. With 54 million teens already using Instagram’s Teen Accounts, 97% of whom keep the default settings, Meta is responding to pressure from lawsuits by 40 US states over teen mental health and a €405 million EU fine for violating children’s data-privacy rules. Aligned with the UK’s Online Safety Act and Australia’s age-verification trials, the move tackles cyberbullying and predatory risks, with 1.2 billion monthly teen users across Meta’s platforms.
Key measures include:
- Safety Features: Accounts are private by default, and messaging is limited to approved followers. AI-driven filters block explicit content, and tagging is restricted to mutual connections, reducing contact from strangers.
- Healthy Habits: “Quiet Mode” mutes notifications from 10 PM to 7 AM for better sleep. Usage nudges after 90 minutes of scrolling encourage breaks, countering addiction risks.
- Parental Oversight: Family Center gives parents visibility into whom their teens have messaged recently, veto power over privacy changes, and customizable time limits (e.g., 2 hours daily), with real-time alerts. These defaults are illustrated in the sketch below.
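Meta has not published how these settings are represented internally, so the following is purely an illustrative sketch of the defaults described above. Every class, field, and value here is hypothetical, mirroring only the behaviors listed: private by default, Quiet Mode from 10 PM to 7 AM, a 90-minute usage nudge, a 2-hour daily cap, and parental approval required for under-16 users to loosen any setting.

```python
from dataclasses import dataclass
from datetime import time

# Hypothetical model of the Teen Account defaults described above.
# Meta's actual schema is not public; all names here are illustrative.

@dataclass
class TeenAccountSettings:
    private_account: bool = True            # private by default
    messaging: str = "approved_followers"   # only approved followers can message
    tagging: str = "mutual_connections"     # tagging restricted to mutuals
    quiet_mode_start: time = time(22, 0)    # Quiet Mode: 10 PM ...
    quiet_mode_end: time = time(7, 0)       # ... to 7 AM
    usage_nudge_minutes: int = 90           # break reminder after 90 min of scrolling
    daily_limit_minutes: int = 120          # e.g., a parent-set 2-hour daily cap

def in_quiet_mode(now: time, s: TeenAccountSettings) -> bool:
    """Quiet Mode spans midnight, so the window check must wrap around."""
    if s.quiet_mode_start <= s.quiet_mode_end:
        return s.quiet_mode_start <= now < s.quiet_mode_end
    return now >= s.quiet_mode_start or now < s.quiet_mode_end

def can_change_setting(age: int, parent_approved: bool) -> bool:
    """Under-16 users need parental approval to loosen a default (the 'veto')."""
    return age >= 16 or parent_approved

if __name__ == "__main__":
    settings = TeenAccountSettings()
    print(in_quiet_mode(time(23, 30), settings))               # True: muted at 11:30 PM
    print(can_change_setting(age=14, parent_approved=False))   # False: parent must approve
```

Note the wrap-around check in `in_quiet_mode`: because the muting window crosses midnight, a naive start-to-end comparison would never match, so any real scheduler would need similar handling.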
Early data indicates 80% of teens feel safer and 65% of parents use the oversight tools, though 30% want stronger anti-bullying measures. Posts on X praise the controls but flag enforcement gaps, with 15% of teens still reporting harassment. Critics urge Meta to tackle predatory accounts more aggressively to rebuild trust amid ongoing regulatory scrutiny.