Instagram is working on a new tool to protect users from nude photos sent in private messages, representatives of Meta, which owns Instagram, told The Verge.
According to a tweet by insider Alessandro Paluzzi, the nudity protection technology covers photos in chats that may contain nudity and lets users choose whether to view them.
According to Meta, the new technology is intended to shield people from unsolicited nude images and other unwanted content.
As an additional safeguard, the company says it cannot view the images or share them with third parties. The new feature will work similarly to the Hidden Words tool launched last year, which lets users filter offensive messages in DM requests based on keywords. If a request contains any of the filter words you choose, it is automatically placed in a hidden folder that you never have to open.
Unsolicited nude photos are a widespread problem, so the new feature is likely to be welcomed by many users.
According to a 2020 study by University College London, 75.8% of the 150 young people surveyed, aged 12 to 18, had received unsolicited nude photos.