Instagram is introducing a new policy that limits interaction between teenagers and adults in an effort to make the platform safer for younger people. Adults will no longer be able to send direct messages to teens who do not follow them.
“When an adult tries to message a teen who doesn’t follow them, they receive a notification that DM’ing them isn’t an option,” Instagram said on its blog.
The app is also introducing “safety prompts” that will be shown to teens when they DM adults who have been “exhibiting potentially suspicious behavior.” The prompts will give teenagers the option to report or block adults in their DMs, remind them “to be careful sharing photos, videos, or information with someone you don’t know,” and tell them not to feel pressured to respond to messages.
Instagram’s moderation system will also be on the lookout for suspicious activities by adult users. The platform will issue notices for activities such as sending out “a large amount of friend or message requests to people under 18.”
“If an adult is sending a large amount of friend or message requests to people under 18, we’ll use this tool to alert the recipients within their DMs and give them an option to end the conversation, or block, report, or restrict the adult,” the blog post said.
Instagram is also turning to machine learning to help with moderation. The company says it is developing new artificial intelligence to detect a user’s age at sign-up. Currently, the app only allows users older than 13, but it has no reliable way to tell when someone is lying about their age.
New teenage accounts on the app will be encouraged to keep their profiles private. If teens opt for public accounts instead, they will get reminders outlining the benefits of a private account.
The features have been rolled out to a handful of countries, including India, Brazil, and Japan, with plans for a global rollout soon.