The first feature prevents adults from messaging people under 18 who don’t follow them; the adult instead receives a notification saying they can’t DM the account. Instagram has provided little detail on how the system works, but parent company Facebook said in a blog post that it uses AI to infer users’ ages.

The second feature sends teenage users prompts encouraging them to be cautious when interacting with adults to whom they’re already connected. The system first detects potentially suspicious behavior, such as an adult sending a large number of friend or message requests to children. It then inserts a safety notice within the recipient’s DMs and gives them the option to immediately end the conversation, or to block, report, or restrict the adult. Instagram said the system will launch in some countries this month and roll out globally “soon.”

The company will also encourage teens with public profiles to make their accounts private by sending notifications “highlighting the benefits of a private account and reminding them to check their settings.”

The new features aim to make young people safer on Instagram, which research suggests was the most used platform for child grooming crimes during the first lockdown in England and Wales. Instagram said it’s now assessing further safety measures, including additional privacy settings, and will provide more detail on them in the coming months.