The era of deepfake content is coming faster than society can possibly prepare for it.

Just recently we covered the deepfake Zuckerberg video that challenged Facebook to reconsider its lenient policies on fake content. A company called CannyAI created a video of Zuckerberg talking about controlling “billions of people's stolen data, all their secrets, their lives, their futures,” which rippled across the internet like the toll of a warning bell.

Then, recently, an app called DeepNude launched that allows users to strip women in photographs of their clothing. It's a digitized x-ray machine that makes it possible for anyone to look at any woman bare-ass naked for a mere $50.

Or, I should say, it used to allow people to do that. As of June 27th, the creator of the app, a strange man (with an x-ray glasses obsession) who calls himself “Alberto,” removed it from the internet.

Not because it wasn't popular: DeepNude attracted so much traffic after Motherboard did a story on it that its server actually crashed. People loved the idea of being able to look at their favorite celebrities, friends, family, and acquaintances naked so much that the website couldn't dish up downloads fast enough for everyone to get one.

The app used some pretty sophisticated and creative engineering, estimating breast size, body shape, skin tone, and texture to give people the most accurate nudes technologically possible. Naturally, the app worked better on pictures of women who were already mostly undressed (in bikinis, workout clothes, or underwear), according to Vice, which apparently bought the app and tested it for “research purposes.” Wink wink.

And yes, the algorithm only worked on photos of women. If you tried to use it on pictures of men, it wouldn't quite get things right… namely, there would be a vagina where there was supposed to be junk, and very prominent nipples. That's because the app used nude photos already on the internet to fill in the gaps where clothing was, and naked images of women are far more abundant online than naked images of men.

But the creator expressed a genuine interest in making it work for men too.

“About two years ago I discovered the potential of AI and started studying the basics,” Alberto said. “When I found out that GAN networks were able to transform a daytime photo into a nighttime one, I realized that it would be possible to transform a dressed photo into a nude one. Eureka. I realized that x-ray glasses are possible! Driven by fun and enthusiasm for that discovery, I did my first tests, obtaining interesting results.”

Interesting results, indeed.
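If you're wondering what “GAN networks” actually means: a generative adversarial network pits two models against each other, a generator that produces images and a discriminator that tries to tell them apart from real ones. Purely as an illustration of the day-to-night trick Alberto describes (this is not his code; every name, size, and loss weight below is an assumption), here's a minimal pix2pix-style sketch in PyTorch:

```python
# Minimal sketch of paired image-to-image translation with a GAN
# (pix2pix-style). Illustrative only -- NOT the DeepNude source;
# all architecture choices and hyperparameters here are assumptions.
import torch
import torch.nn as nn

class Generator(nn.Module):
    """Tiny encoder-decoder that maps one image domain to another."""
    def __init__(self):
        super().__init__()
        self.encode = nn.Sequential(
            nn.Conv2d(3, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(),
        )
        self.decode = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(64, 3, 4, stride=2, padding=1), nn.Tanh(),
        )

    def forward(self, x):
        return self.decode(self.encode(x))

class Discriminator(nn.Module):
    """Scores whether a (source, output) image pair looks real."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(6, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(64, 1, 4, stride=2, padding=1),  # patch of real/fake logits
        )

    def forward(self, src, tgt):
        return self.net(torch.cat([src, tgt], dim=1))

G, D = Generator(), Discriminator()
bce = nn.BCEWithLogitsLoss()
l1 = nn.L1Loss()
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)

# One training step on a dummy paired batch (day photo -> night photo).
day = torch.randn(1, 3, 64, 64)    # source domain
night = torch.randn(1, 3, 64, 64)  # matching target domain

# Train D: real pairs should score 1, generated pairs 0.
fake = G(day).detach()
pred_real, pred_fake = D(day, night), D(day, fake)
d_loss = bce(pred_real, torch.ones_like(pred_real)) + \
         bce(pred_fake, torch.zeros_like(pred_fake))
opt_d.zero_grad(); d_loss.backward(); opt_d.step()

# Train G: fool D, while staying close to the true target (L1 term).
fake = G(day)
pred_fake = D(day, fake)
g_loss = bce(pred_fake, torch.ones_like(pred_fake)) + 100 * l1(fake, night)
opt_g.zero_grad(); g_loss.backward(); opt_g.step()
```

Swap in a different pair of image domains and, conceptually at least, that same recipe is what this whole class of image deepfakes is built on. The results are only as good (and as biased) as the paired training data, which is exactly why DeepNude fell apart on photos of men.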

Well, so much for all that. After Vice's story on the app dropped, there was such a violent public outcry (paired, at the same time, with an insane rush to download and use this newfangled technology) that Alberto was forced to kill DeepNude.

“Yes, it isn't your actual vagina, but… others think that they are seeing you naked,” said Danielle Citron, a professor of law at the University of Maryland Carey School of Law. “As a deepfake victim said to me — it felt like thousands saw her naked, she felt her body wasn't her own anymore.”

“We are going to have to get better at detecting deepfakes, and academics and researchers are going to have to think more critically about how to better safeguard their technological advances so that they do not get weaponized and used in unintended and harmful ways,” said Hany Farid, a computer science professor at UC Berkeley.

On June 27th, DeepNude was removed from the web.

“We created this project for users' entertainment months ago,” Alberto wrote in a statement. “We thought we were selling a few sales every month in a controlled manner… We never thought it would become viral and we would not be able to control traffic.”

“The world is not ready for DeepNude,” the statement concluded.

Which may not matter when it comes to deepfake technology like this. This stuff is coming whether the world is ready for it or not. Even Alberto admitted that if it hadn't been him who created DeepNude, it surely would have been someone else. Deepfake videos are popping up with increasing frequency, and the threat there is obvious: when people can undress anyone, put any words (or noises) in their mouths, and make it look and sound like they actually said it, what will become of our sense of truth? How will anyone be able to trust what they're seeing or hearing on the internet or in the news?

There will be deepfaked sex videos, deepfaked political mishaps, deepfaked revenge porn, deepfaked whatever. Reality itself might unravel around us, on our screens and in our minds.

And what then? Will people be able to combat that? Can technology fix this problem, or is it one that must be fixed among humans? How can we maintain our addiction to digital and social media, while also being honest and truthful to one another?

It doesn’t seem like the kind of situation where we can have our cake and eat it, too.