Exposed deepfake database reveals horrific ways users manipulated celebrity images
Thousands of AI-generated nude deepfakes were left exposed in an unsecured database, including those that reportedly portrayed celebrities as young children.
The database, discovered by cybersecurity researcher Jeremiah Fowler, contained 93,485 images produced by a "nudify" app from the South Korea-based AI company GenNomis.
Fowler, writing in an article for vpnMentor, noted the presence of text files containing the prompts used to generate each image. Among the files he viewed, Fowler said, "nearly all of the images were explicit and depicted adult content."
The database also contained what Fowler said appeared to be images of real people, likely uploaded by users seeking to place their faces on AI-generated nude bodies. It is unknown whether any of the individuals pictured consented to having their likenesses used in nude content.
"There are numerous AI image generators offering to create pornographic images from text prompts, and there is no shortage of explicit images online for the AI models to pull from," Fowler wrote. "Any service that provides the ability to face-swap images or bodies using AI without an individual’s knowledge and consent poses serious privacy, ethical, and legal risks."
Alongside the explicit files, Fowler added, were AI-generated images depicting "celebrities portrayed as children, including Ariana Grande, the Kardashians, Beyoncé, Michelle Obama, Kristen Stewart, and others."
While the images were made using GenNomis, Fowler was unable to verify whether the database was run by the company or a third party. After he contacted GenNomis with his findings, Fowler said, both the database and websites run by the company were taken offline. WIRED reported that the site disappeared almost immediately after it requested comment for an article on the database.
Research has shown that an estimated 96% of all deepfakes online are pornographic, with nearly all of those depicting women who did not consent to the use of their likenesses.
Similar research from last year found that nearly 4,000 celebrities have been targeted in deepfakes. The technology has also been used repeatedly to extort and harass everyday people.
The post Exposed deepfake database reveals horrific ways users manipulated celebrity images appeared first on The Daily Dot.