
Editor: Celine Low
Meta has announced a series of new safeguards for Instagram to protect accounts run by adults that primarily feature children, along with additional safety features for teenage users.
This comes amid growing global concerns about the well-being of young people on social media platforms.
Stricter Safeguards for Accounts Featuring Children
Meta is implementing automatic, stricter controls for Instagram accounts managed by adults but primarily showcasing children. This includes accounts run by parents who regularly share photos and videos of their kids, as well as those managed by talent managers representing young individuals.
For these specific accounts, Instagram will now:
- Automatically apply the strictest message settings to prevent unwanted direct messages (DMs) from reaching these accounts.
- Enable the "Hidden Words" feature to filter out offensive or inappropriate comments.

Accounts affected by these stricter settings will receive a notification at the top of their Instagram Feed, informing them of the updated safety measures and prompting them to review their privacy settings.
Meta also addressed the abuse of accounts featuring children, sharing that it has already taken action against offenders: it has removed almost 135,000 Instagram accounts that were sexualising accounts primarily featuring children, along with 500,000 associated Facebook and Instagram accounts.
"While these accounts are overwhelmingly used in benign ways, unfortunately, there are people who may try to abuse them, leaving sexualised comments under their posts or asking for sexual images in DMs, in clear violation of our rules," the company said.

Meta is implementing measures to prevent potentially suspicious adults from discovering these young profiles, including those adults whose accounts have been blocked by teenagers.
While these steps aim to make it harder for such individuals to find and interact with accounts featuring children, by not recommending them and limiting their visibility in Instagram Search, the measure seems somewhat half-hearted: if an adult's account has been repeatedly blocked by minors, wouldn't a more decisive action, such as account termination, be more effective?
New Safety Features for Teen Accounts
Teen Accounts are Instagram's dedicated experiences for teenagers, with built-in protections that are applied automatically.
Teenagers will now see:
- New safety tips in DMs: reminders to carefully check profiles and be mindful of what they share online.
- Visibility of account join date: the month and year an account joined Instagram will now be displayed at the top of new chats, giving teens more context about who they are messaging.
- Combined block and report option: a new feature that allows teens to both block and report an account in a single action.

Meta reports positive early results, noting that in June alone, teens blocked accounts 1 million times and reported another 1 million after seeing safety notices.
Finally, Meta also provided an update on its nudity protection filter within DMs, confirming that 99% of users, including teenagers, have chosen to keep it enabled. Last month, over 40% of potentially sensitive images received in DMs remained blurred, indicating the feature's effectiveness in protecting users from unwanted content.

Similarly, Apple has introduced measures to protect children's online safety on its devices: children are now required to get parental approval to text or call new numbers. Apple is also simplifying Child Account setup with default age-appropriate settings and making it easier for parents to link accounts to Family groups for fuller control over privacy settings.
Stay updated with ProductNation here, as well as on Instagram & TikTok.