Controversial deepfake app DeepNude shuts down hours after being exposed

Less than a day after receiving widespread attention, DeepNude, the deepfake app that used AI to create fake nude photos of women, is shutting down. In a tweet, the team behind DeepNude said they “greatly underestimated” interest in the project and that “the probability that people will misuse it is too high.”

DeepNude will no longer be offered for sale, and further versions won’t be released. The team also warned against sharing the software online, saying it would violate the app’s terms of service. They acknowledged, though, that “surely some copies” will get out.

Motherboard first drew attention to DeepNude yesterday afternoon. The app, available for Windows and Linux, used AI to alter photos to make a person appear nude, and it was designed to work only on women. A free version of the app placed a large watermark across the images noting that they were fake, while a paid version placed a smaller watermark in a corner, which Motherboard said could easily be removed or cropped out. The app had been on sale for a few months, and the DeepNude team says that “honestly, the app is not that great” at what it does.

But it still worked well enough to draw widespread concern about its use. While people have long been able to digitally manipulate photos, DeepNude made that ability instantaneous and available to anyone. Those photos could then be used to harass women: deepfake software has already been used to edit women into porn videos without their consent, leaving them with little recourse as those videos spread.

The creator of the app, who goes by only the name Alberto, told The Verge earlier today that he believed someone else would soon make an app like DeepNude if he didn’t do it first. “The technology is ready (within everyone’s reach),” he said. Alberto said that the DeepNude team “will quit it for sure” if they see the app being misused.

DeepNude’s team ends their message announcing the shutdown by saying “the world is not yet ready for DeepNude,” as though there will be some time in the future when the software can be used appropriately. But deepfakes will only become easier to make and harder to detect, and ultimately, knowing whether something is real or fake isn’t the problem. These apps allow people’s images to be quickly and easily misused, and for the time being, there are few if any protections in place to prevent that from happening.
