In a startling display of moral erosion within the digital marketplace, Etsy, the platform known for its handmade and vintage goods, is hosting sellers of “deepfake” pornographic images, including AI-generated celebrity nudes. The explicit listings persist despite the company’s efforts to clean up the site, and they highlight growing concern over the unchecked spread of such content.
The emergence of sexually explicit AI-generated images on Etsy, including those of celebrities, points to a grave oversight in regulating online content. Siwei Lyu, a computer scientist and expert on machine learning and the detection of deepfakes, expressed disbelief at the proliferation of such content on a platform like Etsy, traditionally known for innocuous products.
Despite Etsy’s attempts to curtail the problem, a simple search on the platform yields more than a thousand results for “ai nude” or “deepfake porn,” while a search for “porn” itself returns zero. The explicit AI-generated content is not only readily accessible but is also surfaced by Etsy’s recommendation algorithm in unrelated searches.
Among the disturbing findings are AI-generated nude images of entirely fabricated women and an e-book guide on creating X-rated AI content. Notably, a shop was found selling 95 photos of actress Margot Robbie, including AI-generated explicit images, under the guise of “celebrity nude art.” Representatives for Robbie did not immediately respond to requests for comment.
After Fox News Digital reached out to Etsy, the listings were removed for violating the site’s policies, but by then the damage had been done: some of the images had already been purchased and downloaded.
Lyu points out that generative AI has become powerful enough to create images realistic enough that many viewers fail to recognize them as fakes. The problem is compounded by the widespread personal use of such software, which has outpaced the development of laws to curb misuse such as celebrity deepfakes and AI-generated pornography.
Blake Klinkner, an assistant law professor specializing in cybersecurity law, underscores the difficulty in legally addressing AI-generated images. The First Amendment protects a wide range of creative expressions, and federal laws have yet to effectively address the criminal aspects of AI-generated explicit content.
Sellers on platforms like Etsy exploit the accessibility of AI software to generate obscene, pornographic images of celebrities. Meanwhile, judges are often unfamiliar with deepfakes and hesitant to apply outdated laws to the modern issue, creating a legal gray area.
Tracking down creators of such images is challenging, as many use aliases and fake profiles. Klinkner describes this as a “Whack-A-Mole situation” – when one account is shut down, another emerges.
Etsy’s response, while a step in the right direction, appears insufficient, as thousands of pornographic images remain accessible on the platform. Etsy’s guidelines prohibit pornography of any sort, but the site does not clearly define what counts as pornography, leaving enforcement ambiguous.
Recent incidents, such as deepfake images of singer Taylor Swift circulating on social media, have prompted legislative action. The No AI FRAUD Act, introduced by Rep. Maria Salazar, R-Fla., aims to penalize the creation of harmful generative AI images.
State laws on nonconsensual deepfake porn vary, with only a few offering legal recourse. Celebrities, in particular, face challenges in protecting their public image against realistic AI-generated depictions, which can infringe on their rights to privacy or publicity.
The ongoing controversy over AI-generated explicit content on platforms like Etsy underscores the urgent need for robust legal frameworks to protect individuals, particularly public figures, from the harms of deepfake technology. As the technology advances, the line between the real and the generated blurs, with serious consequences for public perception and individual rights.