Instagram working on a feature to tackle unsolicited nudes in your DMs

People sliding into Instagram DMs with unsolicited nudes and sexually explicit images have long been a problem for users.

It looks like Instagram is finally working on protecting its users from this unpleasant experience.

App researcher Alessandro Paluzzi published an early image of the tool on Twitter this week.

Meta confirmed the feature to The Verge, saying the optional user controls are still in the early stages of development.

Based on Paluzzi’s image, the feature will let people shield themselves from nude photos: messages containing nudity will stay covered unless the recipient chooses to view them.

The tool will reportedly work much like Instagram’s existing ‘Hidden Words’ feature, which lets users automatically filter direct message requests containing offensive content.

The technology will not allow Meta to view the actual messages or share them with third parties.
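Meta has not published technical details, but a privacy claim like this usually implies the nudity check runs on the recipient’s device rather than on Meta’s servers. The sketch below is purely illustrative of that idea; the names, threshold and placeholder classifier are hypothetical and do not describe Instagram’s actual implementation.

```python
# Illustrative sketch only: NudityClassifier-style logic, THRESHOLD and all
# names here are hypothetical, not Instagram's real code.

from dataclasses import dataclass

THRESHOLD = 0.8  # hypothetical confidence cutoff for "likely contains nudity"


@dataclass
class IncomingImage:
    image_bytes: bytes
    covered: bool = False  # True -> shown blurred until the recipient taps to reveal


def classify_nudity(image_bytes: bytes) -> float:
    """Stand-in for an on-device ML classifier returning a score in [0, 1].

    A real implementation would run model inference here, entirely on the
    phone, so the image never needs to be sent to a server for inspection.
    """
    return 0.0  # placeholder score so the example runs


def handle_incoming_image(image_bytes: bytes) -> IncomingImage:
    # Decided locally: the message content stays on the device.
    score = classify_nudity(image_bytes)
    return IncomingImage(image_bytes=image_bytes, covered=score >= THRESHOLD)


if __name__ == "__main__":
    msg = handle_incoming_image(b"raw image bytes")
    if msg.covered:
        print("Image hidden behind a sensitive-content cover; tap to view.")
    else:
        print("Image shown normally.")
```

Because the scoring happens client-side in this sketch, only the covered/uncovered decision ever affects what the app displays, which is consistent with Meta’s statement that it cannot view the messages.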

According to a report published earlier this year by the Center for Countering Digital Hate, a British nonprofit organization, Instagram’s tools failed to act upon 90 per cent of image-based direct messages sent to high-profile women.

Many of the women were sent sexual images by men, and not even the ‘Hidden Words’ feature could completely filter out swear words like ‘b*tch’.

In June, Instagram introduced new parental supervision tools for the accounts of teenagers in the UK and Ireland.

Metro.co.uk has reached out to Meta for comment.
