Instagram is introducing new policies that limit interactions between teens and adults in order to make its platform safer for young users. The company has banned adults from direct messaging teenagers who don’t follow them and is introducing “safety prompts” that will be shown to teens when they DM adults who have been “exhibiting potentially suspicious behavior.”
The safety prompts will give teenage users the option to report or block adults who are messaging them. The prompts will also remind young users not to feel pressured to respond to messages and to “be careful sharing photos, videos, or information with someone you don’t know.”
Notices will appear when Instagram’s moderation systems spot suspicious behavior from adult users. The company is not sharing details on how these systems operate but says such suspicious behavior could include sending “a large amount of friend or message requests to people under 18.” Instagram says this feature will be available in some countries this month and available globally “soon.”
Instagram also says it’s developing new “AI and machine learning technology” to detect someone’s age when they sign up for an account. Officially, the app requires users to be at least 13 years old, but it’s easy to lie about one’s age. The company said it wants to do “more to stop this from happening” but did not detail how new machine learning systems might help with this problem.
New teenage users who sign up for Instagram will also now be encouraged to make their profile private. If they choose to create a public account anyway, Instagram will send them a notification later “highlighting the benefits of a private account and reminding them to check their settings.”