Instagram wants to stop unsolicited nudes – it's going about it the wrong way

Instagram is working on a new filter.

Not one that alters your face to the proportions of a Kardashian, or turns you into a dog, but a filter that will automatically detect when a user sends an unsolicited nude photo in a direct message.

It sounds like a good plan – a much-needed response to the ‘epidemic of misogynistic abuse’ directed at female users on the social media platform.

But it also sounds like the onus is once again being put on women to protect themselves against dick pics, rather than cutting the sexual deviancy off at its source.

That’s because the proposed measures will involve a feature called ‘nudity protection’ that uses technology on your device to cover up sexually explicit photos – unless you choose to view them. 

As these messages are sent privately, Instagram can’t access the photos themselves; instead, technology on your own device will detect nudity in the content sent to it and cover it up.
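As a rough sketch of what this kind of on-device screening might look like – every name below is hypothetical, since Instagram has not published its implementation – the idea is that a classifier runs locally on the phone and the app blurs a flagged image until the recipient chooses to reveal it:

```python
# Hypothetical sketch of on-device 'nudity protection'.
# All names are invented; Instagram has not published how its filter works.

from dataclasses import dataclass

@dataclass
class IncomingPhoto:
    pixels: bytes         # raw image data, which never leaves the device
    hidden: bool = False  # whether the app should blur it by default

def nudity_score(photo: IncomingPhoto) -> float:
    """Stand-in for a small on-device ML classifier returning a score in [0, 1]."""
    return 0.0  # placeholder; a real model would inspect photo.pixels

def screen_incoming(photo: IncomingPhoto, threshold: float = 0.8) -> IncomingPhoto:
    # Flagged images arrive blurred; the recipient can still tap to view them.
    if nudity_score(photo) >= threshold:
        photo.hidden = True
    return photo
```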

As someone who has been sent my fair share of explicit, threatening and unsolicited sexual images in my DMs, it’s a tool I wish already existed. Not necessarily for myself, but for younger users on the platform, who should not be exposed to sexual imagery of any nature.

I’m an adult; I know what a penis looks like. It’s not the image of a penis itself that offends me, but the fact that a man feels like he has the power to expose himself without my consent. Therein lies the abuse.

Giving the recipient the option not to view the content is a good thing, but it does little to protect them from the psychological harm of the threat that lies beneath unsolicited dick pics.

I will never fully understand the psychology of a man who sends unsolicited photos of his penis, but I have a good idea what it’s about: sexual dominance. Perhaps he’s looking for attention, looking to shock, seeking sexual gratification. Maybe he has a warped belief that this act will in some way lead to a romantic relationship. But when women receive these images, it shouldn’t be on us to figure out why. 

When adult women are ‘flashed’ in real life, it’s not necessarily the image of a penis itself that is the most troubling, but the fact that a man feels able to do it and get away with it.

When I was flashed by a man in public earlier this year, I didn’t find myself wishing that his penis had been pixelated. A ‘nudity protection’ filter wouldn’t have changed the psychological impact of the experience much.

Sure, I wouldn’t have the image imprinted in my memory, but I would still have known what was hidden behind the pixels and been left with questions about why he had done it to me.

I appreciate Instagram’s efforts to stop recipients from having to view dick pics, but what is it doing to stop men from actually sending them in the first place?

Instead of developing technology that gives the recipient nudity protection on their device, why not use the same technology to block men from actually sending sexually explicit photos?

Why not use technology to detect a nude image uploaded to a chat, and create a feature that allows the intended recipient to exercise their consent?

People use Meta apps to send sexually explicit photos privately between two consenting parties – it’s a part of life in the modern world of connecting, flirting and dating people online. If both people consent – good for them.

Why not identify nude images on the sender’s device and warn the sender before the message goes out?

Instead of giving the recipient the option to cover their eyes, why not give the sender the responsibility to cover their genitalia?

How about a message that reads something like: ‘It looks like you want to send a photo that may contain nudity. Before proceeding, you must ask the recipient if they consent to receiving a photo of this nature – do you want to continue?’

If they choose to continue, a message could be sent to the intended recipient giving them the power to veto the image or accept it.
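To make the proposed flow concrete, here is a minimal sketch – assuming a hypothetical messaging back end, since no such Instagram feature or API exists – in which the sender is warned, the image is held back, and delivery only happens if the recipient opts in:

```python
# Hypothetical sketch of a consent-gated send flow; no such Instagram API exists.

from enum import Enum, auto
from typing import Optional

class Outcome(Enum):
    CANCELLED = auto()  # sender backed out after the nudity warning
    PENDING = auto()    # image held back until the recipient decides
    DELIVERED = auto()  # recipient consented, so the image goes through
    VETOED = auto()     # recipient refused, so the image never arrives

def send_flagged_photo(sender_confirms: bool,
                       recipient_accepts: Optional[bool]) -> Outcome:
    """The sender is warned first; the recipient then has the final say."""
    if not sender_confirms:
        return Outcome.CANCELLED
    if recipient_accepts is None:
        return Outcome.PENDING
    return Outcome.DELIVERED if recipient_accepts else Outcome.VETOED
```

In this sketch the photo is never displayed unless the flow ends in DELIVERED – the veto sits with the recipient, which is exactly the power shift being argued for here.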

Why not take away the power from the sender and give it to the recipient instead?

This would allow for consenting adults to continue as normal, for non-consenting recipients to cock-block a sexual deviant, and for perverts to have their power stripped from them.

In April, the Center for Countering Digital Hate published a report that found Instagram failed to act on 90% of image-based abusive DMs sent to female public figures included in the study.

I experienced this myself when a man repeatedly sent me sexually graphic images, yet Instagram failed to act when I reported the misconduct. Instead, it removed a public post of mine condemning its failure to take my report seriously.

Instagram responded to the Center for Countering Digital Hate’s report by stating that accounts sending messages that break its rules are given a strike and blocked from sending DMs for a period of time, with stronger punishments handed out if the behaviour continues.

Once again, the onus is on women to make these reports, rather than on men to stop doing it in the first place. Instagram must focus on stopping offenders in their tracks, rather than relying on victims to take action after an offence.

If tech companies really want to protect women from abuse and misogyny, they can’t just ‘empower’ women to take action; they must also remove power from the men who use technology to offend.
