AI image generators that claim to “undress” celebrities and ordinary women have been a disturbing presence on the internet for some time. Recently, these apps made their way into paid ads on Instagram. According to a report by 404 Media, Meta, the parent company of Facebook and Instagram, hosted several paid advertisements promoting apps that use AI to create deepfake nude images from clothed photos. One ad featured an image of Kim Kardashian alongside the text “Undress any girl for free” and “Try it,” while another showed two AI-generated photos of a young girl, one fully clothed and the other apparently topless, with the words “Any clothing delete” covering her breasts.
These apps have drawn increased scrutiny over the past six months following disturbing incidents in which they were used to produce fake nude images of underage girls at schools across the United States and Europe. Those incidents have prompted investigations and legislative proposals aimed at protecting children from the harms of this AI technology. In a report by Vice late last year, students in Washington said they had found the “Undress” app through TikTok advertisements and used it to create fake nude images of their classmates.
404’s investigation found that many of the ads promoting these apps had already been removed from the Meta Ad Library by the time they were flagged. A Meta spokesperson emphasized that the company does not allow ads containing adult content and acts quickly to remove violating advertisements. Even so, some of the ads were still live when 404 published its findings, suggesting that Meta’s enforcement is reactive rather than proactive, akin to a game of whack-a-mole.
The problem extends beyond Instagram. Previous reports have documented similar apps surfacing on other platforms, including Google, which directed users to deepfake porn featuring not only celebrities but also politicians and other public figures, none of whom consented. Notably, 404 found that at least one of the advertised apps charged users a subscription fee for access to its NSFW features, only to fail at generating the promised nude images.
The proliferation of these apps on Instagram is particularly alarming given how many teenagers still use the platform regularly. The presence of these ads underscores the urgent need for stricter regulation and enforcement to protect people, especially minors, from the use of AI to create nonconsensual and harmful imagery. As the technology advances, platforms and lawmakers will need to remain vigilant and proactive in confronting the emerging threats that deepfakes pose.