Horny Technology

What is the difference between AirDropping nudes and flashing someone on the street? What is the difference between creating deepfake videos and spying on someone through their window? The line between fiction and reality is blurring, so are we prepared for apps that let their users envision others naked?


Last year, an app called DeepNude was released; it is misogynistic and has no purpose other than to be misused. Yet again, we see a signal that technology has taken a misogynistic turn.


I am sure you are all familiar with AirDrop by now; if not, it is a file-sharing service launched by Apple in 2011 that lets Apple users share pictures, videos and files over Wi-Fi or Bluetooth. You can make yourself available to everyone, restrict it to your contacts only, or switch it off entirely. The problem is that many of us forget to turn it off, or never realise it was on to begin with, which makes you detectable to anyone else who has their AirDrop on. Instead of someone opening their coat to flash us, we now live in a world where we can be cyber-flashed: an individual can AirDrop explicit images to your phone without your consent. The preview function shows you what is being shared, so if your AirDrop is open to everyone, a delightful picture of someone's genitals can simply appear on your screen. It is this preview that constitutes the flashing element, because the image appears before you can even confirm downloading it onto your phone.

[Image: Pexels]

On a university trip to Paris last year, we were all AirDropping funny and random pictures to classmates' phones, which is pretty normal among students. However, there is an increasing number of reports of anonymous and unsolicited nudes sent to strangers through AirDrop, a function that perpetrators can easily exploit. The content does not even have to be pictures or videos of the sender's own genitals; it simply has to be pornographic or offensive, and it can be distributed to anyone within roughly a thirty-metre radius whose AirDrop is open to everyone. You are likely to find this kind of behaviour in schools or public places. Sending explicit and pornographic material through AirDrop is a new take on an age-old way of bullying, humiliating and intimidating people.


Is there anything that can be done to prevent it? Apple's guidelines suggest that users change AirDrop's setting from public to private, so that only your contacts can detect your phone and share files with you. For the perpetrator, avoiding detection is relatively easy, as they can simply change the name of their device on the AirDrop network. There is a fundamental lack of education about these emerging forms of sexual harassment through technology.

It is also a difficult area to police: the images shared could be of anything, including nude photographs of teenagers, which constitutes the distribution of child pornography under the law. The laws meant to protect those who are underage can just as easily be used to punish them. Even if the pictures are ones you took and sent yourself, if you are under eighteen you could face criminal charges and a criminal record for creating and sharing explicit images of a child, because you are technically underage. The content is therefore illegal for both the perpetrator sending it and the victim receiving it; even if you screenshot the incident or save the footage as evidence, when you report it you are still in possession of child pornography. This is why cyber safety is such an important topic. At least on social media you can usually work out who sent the footage; the anonymity of AirDrop is what makes this horny technology terrifying, and you may never find a resolution or any form of justice.


It seems that as soon as technology finds another way to make life easier, there are twisted people who need to turn it into something about their genitals. Take deepfakes, which use artificial intelligence to alter images or videos and create scenarios that never occurred. Head over to PornHub, for example, and you can find the celebrities you know and love in porn they never made. These videos can be nearly indistinguishable from real footage, and spotting deepfakes has become increasingly difficult. They can even be used on ordinary people: apps like DeepNude are popping up everywhere, allowing users to upload a photo of any person and "remove" their clothes with a few taps. These apps are rather clumsy at best, but they can still be turned on colleagues and schoolmates. DeepNude was pulled a week after its launch, yet the software is still available, with many copying it or sharing it for free.


It is disturbing to know that a hidden person lurking nearby can so casually send a picture of their genitals, or someone else's, watching you from afar, and that you may never know who sent the footage. People like this get their kicks not from you seeing their genitals, but from your reaction and your shock at the surprise. I guess it just goes to show that our computers, our tablets and our phones are still treated as a lawless arena in which aggressive texts and pictures can be sent to anyone. It is time to realise that our technology does not exist in an alternative universe; it is very much present in our reality, and we should not be using it to help us out with our horniness anymore.
