Meta Platforms said on Tuesday it will block Instagram users under 16 from livestreaming or unblurring nudity in direct messages without parental approval as part of a broader rollout of safety measures aimed at teenagers across its platforms.
The restrictions, which are part of Meta’s teen account program introduced last September, will initially apply to users in the United States, United Kingdom, Canada, and Australia before expanding globally in the coming months.
Under the new Instagram rules, users under 16 will be barred from going live on Instagram unless authorized by a parent or guardian. Parental consent will also be required before teens can disable a tool that automatically blurs images containing suspected nudity in direct messages.
Meta also announced that its teen-focused protections are being extended to Facebook and Messenger. These platforms will now adopt features already active on Instagram, such as default private account settings, restrictions on messages from unknown users, and filters on sensitive content. Additional tools will remind teens to take breaks after 60 minutes and pause notifications during typical sleeping hours.
“Teen Accounts on Facebook and Messenger will offer similar, automatic protections to limit inappropriate content and unwanted contact, as well as ways to ensure teens’ time is well spent,” Meta said in a blog post.
Meta said more than 54 million teen accounts have been created under the program's safeguards since its launch, amid rising scrutiny over social media's impact on youth well-being.