July 15, 2019

Last week, Singapore's The New Paper reported the story of Rose, a 27-year-old woman who found a picture she had taken months earlier, fully clothed, posted on a sex forum. In the new image, Rose is naked.

She is just one of dozens of women, in Singapore and around the world, who may find themselves in the same situation. The cause is an app that lets users replace the clothes in a photo with fabricated naked flesh.

And despite the app's shutdown, its code is still available on a popular code-hosting site.

According to Vice, the DeepNude app was released in June as artificial intelligence software capable of swapping clothes for flesh, generating fake photos of naked women. The app doesn't work on men: if you test it with a photo of a man, you still get a body with female anatomy.

When DeepNude hit the market, offering a free version and a premium one, tech websites in many countries and languages began criticizing the anonymous developers for what they considered a malicious use of artificial intelligence. The women in the photos never consented to being "undressed" by the app, which made it offensive to many and, in some people's view, illegal.

DeepNude survived for only about four days. The developers posted an announcement on social media saying they hadn't expected the app to become so popular and had underestimated the demand. On June 27, the DeepNude Twitter account announced the app was dead: "The world is not ready for DeepNude," the developers wrote.

https://twitter.com/deepnudeapp/status/1144307316231200768

But by that time, the code behind the app had spread all over the internet, as some Twitter users warned: “DeepNude is now open source (anyone can download the source code and build it). Girls should generally stop posting pictures of themselves online at this point. Arab moms were right,” commented one user from Egypt.

GitHub announced it would remove all open-source versions of the DeepNude app from its platform. According to Vice, both the code and the images generated with it were considered a violation of GitHub's Community Guidelines.

But since it is nearly impossible to unpublish something from the internet, the code remains easy to find. The question left behind is: what happens if you are the next Rose?

Cristina Tardáguila is the International Fact-Checking Network's Associate Director.
