
Dreadful DeepNude app undresses images of women at a click

Unveiling new avenues for sextortion. Fabulous.

I can’t remember one day this month when the Internet didn’t deposit a new kind of horror on my feed. Today, that honor belongs to DeepNude, a horrifying piece of software that lets anyone undress the image of a woman with a few simple clicks.

And I really mean anyone. It’s as easy as hopping on Google, searching for any random image, and then feeding it into the desktop software or app.

It’s built on deepfakes, an AI-powered technique for swapping faces with varying degrees of realism. The technique caused a shitstorm in 2017 when r/deepfakes (since banned and removed) took off, with Redditors putting the machine-learning algorithm to ill use, largely to create fake celebrity pornography. Results ranged from highly distorted swaps to ones realistic enough to pass at first glance.

And while the debate around deepfakes long centered on their potential as a disinformation tool, there’s no doubt that their biggest victims were women. They were either made targets of revenge porn or had their images used, without consent, to satisfy the fantasies of their “admirers”.

DeepNude, then, is simply an evolution of the deepfake: faster, easier, and more realistic in its ability to subject any female body to the male gaze.

Essentially, it takes the photo of a clothed person (the more skin showing, the clearer the result) and renders a new, naked body (breasts and vulva) on top. The output is fake, but the better lit and better angled the original image, the more believable the result.

The resulting image is watermarked with a FAKE tag in both the free and premium versions. Nothing that can’t be Photoshopped away, though.

Vice’s Motherboard, which broke the story, experimented with the software and uploaded an image of a man. The result simply swapped his pants for a vulva.

They spoke with the creator, Alberto (a pseudonym), who told Motherboard that the software is based on pix2pix, an open-source algorithm built on a generative adversarial network: one model generates images while a second learns to flag fakes, and the two train against each other until the fakes become convincing. To train DeepNude, Alberto fed the algorithm more than 10,000 nude images of women.
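
For the curious, here’s roughly what that adversarial setup looks like. DeepNude’s own code was never published, so this is only a minimal, generic sketch of the pix2pix idea in PyTorch, run on random placeholder tensors; the network sizes, loss weights, and learning rates are illustrative assumptions, not anything from DeepNude.

```python
import torch
import torch.nn as nn

# Generator: translates an input image into an output image (pix2pix style).
# Real pix2pix uses a U-Net; a tiny conv stack keeps this sketch readable.
generator = nn.Sequential(
    nn.Conv2d(3, 64, 4, stride=2, padding=1), nn.ReLU(),
    nn.ConvTranspose2d(64, 3, 4, stride=2, padding=1), nn.Tanh(),
)

# Discriminator: scores whether an (input, output) image pair looks real.
discriminator = nn.Sequential(
    nn.Conv2d(6, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
    nn.Conv2d(64, 1, 4, stride=2, padding=1),
)

bce = nn.BCEWithLogitsLoss()
l1 = nn.L1Loss()
opt_g = torch.optim.Adam(generator.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=2e-4)

# Placeholder batch: real training would use paired images from two domains
# (the canonical pix2pix examples are sketches-to-photos or maps-to-aerials).
source = torch.randn(8, 3, 64, 64)
target = torch.randn(8, 3, 64, 64)

# --- Discriminator step: learn to tell real pairs from generated ones.
fake = generator(source).detach()
d_real = discriminator(torch.cat([source, target], dim=1))
d_fake = discriminator(torch.cat([source, fake], dim=1))
loss_d = bce(d_real, torch.ones_like(d_real)) + bce(d_fake, torch.zeros_like(d_fake))
opt_d.zero_grad()
loss_d.backward()
opt_d.step()

# --- Generator step: fool the discriminator while staying close to the target.
fake = generator(source)
d_fake = discriminator(torch.cat([source, fake], dim=1))
loss_g = bce(d_fake, torch.ones_like(d_fake)) + 100.0 * l1(fake, target)
opt_g.zero_grad()
loss_g.backward()
opt_g.step()
```

The tug-of-war between the two losses is the whole trick: as the discriminator gets better at spotting fakes, the generator is forced to produce more convincing ones. That dynamic, plus 10,000 training images, is what makes the output so disturbingly plausible.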

His reason for targeting women is that their nude images are readily available online. However, Alberto is hoping to create a male version too. Great aspirations, indeed.

DeepNude came into being out of Alberto’s experiments with generative adversarial networks (the technique underlying pix2pix) and his fascination with the fake X-ray glasses touted in the 1960s and 1970s.

[Image description: A screenshot of the opening page of the desktop version of DeepNude. The featured image on-page shows a clothed woman with a portion of her highlighted and nude-ified.] Via deepnude.com
“I’m not a voyeur, I’m a technology enthusiast,” he said. “I think that what you can do with DeepNude, you can do it very well with Photoshop (after a few hours of a tutorial).”

“I also said to myself: the technology is ready (within everyone’s reach). So if someone has bad intentions, having DeepNude doesn’t change much… if I don’t do it, someone else will do it in a year,” added Alberto, while also acknowledging that he was financially motivated.

It pains me to note that none of this is clearly illegal; it operates in a legal gray area. The nudity shown is synthesized from publicly available images and isn’t actually the body of the person depicted. On top of that, most platforms that host such content are shielded from liability by federal law (in the U.S., Section 230 of the Communications Decency Act).

Platform moderation and legislation have failed to keep up with the breakneck speed at which deepfake technology is evolving. And because of this, women keep becoming victims of a misogynistic system.

Not only is this a gross invasion of sexual privacy, it also opens new avenues for sextortion. It’s yet another means of controlling a woman through her body, and any woman at that.

Think of all the terrible ways it can be used. Think of all the fragile egos out there, all the pettiness, all the hate. It can be anyone, and all they need is an image of you.

This is a new kind of terrorism, a new way of pushing women out of public spaces online through fear of being “stripped” at anyone’s whim, for anything from sick jokes to blackmail and extortion.

In fact, read this Huffington Post piece featuring haunting accounts from women targeted through deepfake porn. They were driven off of social media, shamed for acts that weren’t even their own, and despite everything they suffered, “their” videos are still circulating online.

I’ll be honest. I’m tired of fighting, but fight we must. At the end of the day, it’s up to us to stop the spread of such gross practices, because in the age of deepfakes, lawmakers are more interested in stopping political misinformation than in the uncalled-for, unjust, and disgusting act of invading a person’s sexual privacy.

July 3, 2019, Update: The software has since been taken down. However, as is the way of the Internet, nothing deleted is ever truly gone. There are already reports of copies popping up on platforms like Discord and Reddit as other coders reverse-engineer the DeepNude software.

Ironic, really, for Alberto’s own software to be stripped bare and redressed according to someone else’s needs. One might even say there’s a moral attached to this story. All in all, it goes to show that despite the dangers of deepfake software, the tech world still falls short of mounting a line of defense, even against itself.