Instagram just rolled out new safety tools for teens. Inside direct messages, you’ll see a fresh “Safety Tips” prompt. Tap it, and you’ll get quick advice on spotting scams and staying safe when chatting with people you don’t know.
Right below the prompt, a one-tap button lets you block an account instantly. You’ll also spot a note showing when that account was created, so you can judge if it seems fishy.
If you’d rather block and report in a single tap, there’s now a combined option for that, too. Meta says the change should help surface problem accounts faster.
For parents and guardians using Instagram’s “adult-managed teen” setup, Meta added extra safeguards. No teen account—adult-managed or not—will show up in feeds recommended to strangers.
This follows a crackdown earlier this year when teams removed nearly 135,000 Instagram profiles tied to adults making inappropriate comments or requests to children under 13. They also took down another half-million linked Facebook and Instagram accounts.
Other protections, like hiding location data, blurring nudity in DMs, and blocking unknown adults from messaging, stay in place. Since the nudity blur rolled out worldwide, 99 percent of users have kept it on, and in June alone more than 40 percent of blurred images were never viewed.
The new location notice popped up a million times last month, and one in ten teens tapped it to learn more.
Beyond tech fixes, Meta supports an EU push to raise the minimum social media age. Several member states are pressing to set the cutoff at 15, with the option of raising it to 16 later.
Meta’s on board, likely for both safety and business reasons, but either way, it could offer teens extra protection.
These updates don’t fix every risk. But they give younger users fresh reminders and simpler controls.
If you’re 13–17 and on Instagram, take a moment to explore the new options. They could save you from an unwanted DM or worse.