App that removed clothes from women without their consent finally taken down


      Last Sunday (June 23), an anonymous programmer created a piece of software that erases women’s clothing from a photo and superimposes a realistic vulva and breasts onto their bodies.

      The application, called DeepNude, used pix2pix, an open-source algorithm developed at the University of California, Berkeley, which trains computers to examine and edit images. The developer fed the algorithm a huge dataset of more than 10,000 nude photos of women to teach it to edit a person’s clothes out of a picture, refining its output over successive training rounds to make the doctored nude images as lifelike as possible. Because the algorithm was trained only on images of women, only women could be, and were, targeted.
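      DeepNude’s own code was never published, so the following is only a minimal, hypothetical sketch of how a pix2pix-style setup works: a conditional generative adversarial network pairs a generator, which edits the input photo, against a discriminator, which learns to tell real before-and-after pairs from generated ones. Every class name, layer size, and the random training batch below is an illustrative assumption, not DeepNude’s method.

      import torch
      import torch.nn as nn

      class Generator(nn.Module):
          """Encoder-decoder that maps a 3-channel input image to a 3-channel output."""
          def __init__(self):
              super().__init__()
              self.net = nn.Sequential(
                  nn.Conv2d(3, 64, 4, stride=2, padding=1), nn.ReLU(),
                  nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(),
                  nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),
                  nn.ConvTranspose2d(64, 3, 4, stride=2, padding=1), nn.Tanh(),
              )
          def forward(self, x):
              return self.net(x)

      class Discriminator(nn.Module):
          """PatchGAN-style critic that scores (input, output) image pairs as real or fake."""
          def __init__(self):
              super().__init__()
              self.net = nn.Sequential(
                  nn.Conv2d(6, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
                  nn.Conv2d(64, 1, 4, stride=2, padding=1),  # one logit per image patch
              )
          def forward(self, src, tgt):
              return self.net(torch.cat([src, tgt], dim=1))

      G, D = Generator(), Discriminator()
      opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
      opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
      adv_loss, l1_loss = nn.BCEWithLogitsLoss(), nn.L1Loss()

      # Placeholder batch of paired "before" and "after" images (random tensors here).
      src = torch.randn(4, 3, 64, 64)
      tgt = torch.randn(4, 3, 64, 64)

      # Discriminator step: learn to separate real pairs from generated pairs.
      fake = G(src).detach()
      d_real, d_fake = D(src, tgt), D(src, fake)
      loss_d = adv_loss(d_real, torch.ones_like(d_real)) + adv_loss(d_fake, torch.zeros_like(d_fake))
      opt_d.zero_grad(); loss_d.backward(); opt_d.step()

      # Generator step: fool the discriminator while staying close to the target image.
      fake = G(src)
      d_out = D(src, fake)
      loss_g = adv_loss(d_out, torch.ones_like(d_out)) + 100.0 * l1_loss(fake, tgt)
      opt_g.zero_grad(); loss_g.backward(); opt_g.step()

      Repeating those two steps over thousands of aligned image pairs is the refinement loop described above: the adversarial loss is what pushes the generated images toward photorealism.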

      A free trial version of the software let users remove the clothes of any woman without her consent, but printed watermarks across the resulting image. In the paid version, which cost US$50, the watermark was replaced with a small stamp in the top-left corner that could easily be cropped or Photoshopped out.

      So many people accessed or bought the software that its servers crashed. No figures have been released to show how many women’s pictures were altered.

      The developer, who went by the pseudonym Alberto, told the publication Vice that he created the software for “fun” and out of “curiosity”.

      "I'm not a voyeur, I'm a technology enthusiast,” he said. “Continuing to improve the algorithm. Recently, also due to previous failures (other startups) and economic problems, I asked myself if I could have an economic return from this algorithm. That's why I created DeepNude."

      The developer suggested that his technology mirrored techniques already available in Photoshop, and reasoned that if he didn’t create the software, someone else would.

      DeepNude was removed from the Internet this afternoon, with the app’s Twitter account announcing that it would not be restored. In a statement, the developer said that no other versions would be released and that no one else would be granted permission to use the app.

      The statement implied that the developer had some awareness of the impact the app might have on women’s lives, but it stopped short of addressing the psychological trauma of having convincingly faked nude pictures circulated among friends or coworkers, or the way the software strips women of their bodily autonomy and consent.

      No legislation currently exists in Canada or the U.S. to regulate the creation of falsified images and videos like deepfakes or the output of apps such as DeepNude.

      Kate Wilson is the Technology Editor at the Georgia Straight. Follow her on Twitter @KateWilsonSays
