Instagram has announced new policies to limit interactions between teenagers and adults to protect young people on the platform.
Instagram will now prevent adults from sending messages to people under 18 who don’t follow them:
“For example, when an adult tries to message a teen who doesn’t follow them, they receive a notification that DM’ing them isn’t an option. This feature relies on our work to predict people’s ages using machine learning technology, and the age people give us when they sign up.”
“As we move to end-to-end encryption, we’re investing in features that protect privacy and keep people safe without accessing the content of DMs,” Instagram said in a blog post.
Speaking about the initiative, Lucy Thomas, Co-Founder and Co-CEO of PROJECT ROCKIT, said:
“Around the world it’s widely understood that most social media platforms require a 13-year minimum age requirement, but the complexity of age verification remains a long-standing, industry-wide challenge. That’s why it’s positive to see Instagram investing in innovative technologies that can and will create a safer online environment for younger users. By using machine learning to flag potentially inappropriate interactions, improving teen privacy features and DM-ing younger users with realtime safety info, Instagram is equipping young people with tools to be the architects of their own online experience.”
The platform is also going to start using prompts — or safety notices — to encourage teens to be cautious in conversations with adults they’re already connected to. Safety notices in DMs will notify young people when an adult who has been exhibiting potentially suspicious behavior is interacting with them in DMs.
“For example, if an adult is sending a large amount of friend or message requests to people under 18, we’ll use this tool to alert the recipients within their DMs and give them an option to end the conversation, or block, report, or restrict the adult,” Instagram explained. “People will start seeing these in some countries this month, and we hope to have them available everywhere soon.”
Moreover, in the coming weeks, the platform will start exploring ways to make it more difficult for adults who have been exhibiting potentially suspicious behavior to interact with teens. This may include things like restricting these adults from seeing teen accounts in ‘Suggested Users’, preventing them from discovering teen content in Reels or Explore, and automatically hiding their comments on public posts by teens.