How Instagram’s Sneaky Apps Are Stripping Away Privacy

AI image generators that claim to “undress” celebrities and unsuspecting women are nothing new. What is alarming is their recent appearance in paid ads on Instagram. According to a report by 404 Media, Meta, the parent company of Facebook and Instagram, ran several paid posts in its Ad Library promoting apps like “Nudify,” which purportedly use artificial intelligence to create deepfake nude images from clothed photos. One particularly disturbing ad featured a photo of Kim Kardashian alongside the words “Undress any girl for free” and “Try it.” Another showed two AI-generated images of a young girl, one fully clothed and the other apparently topless, with the caption “Any clothing delete.”

In the past six months, these types of apps have garnered significant attention after being used to generate fake nude images of underage girls in schools across the United States and Europe. This has sparked investigations and legislative proposals aimed at safeguarding minors from the harmful implications of such AI manipulation. As reported by Vice towards the end of last year, students in Washington admitted to using the “Undress” app to create fake nude images of their classmates after encountering advertisements on TikTok.

404’s investigation revealed that many of the concerning ads had already been removed from Meta’s Ad Library by the time they were flagged, while others were taken down only after being brought to the company’s attention. A Meta spokesperson emphasized that the company does not permit ads containing adult content and acts swiftly to eliminate any violations. Despite these efforts, some problematic ads were still live at the time of 404’s report, indicating a persistent challenge in effectively policing such content.

Last summer, Futurism uncovered that Google was prominently surfacing deepfake porn in search results, including manipulated images of celebrities, lawmakers, influencers, and other public figures shared without their consent. Even a cursory search for “deepfake porn” led users to “MrDeepFakes,” a notorious distributor of such content. Separately, 404’s investigation found that one of the apps advertised on Instagram charged a $30 subscription fee for access to its NSFW features, yet failed to deliver on its promise of generating nude images.

The fact that these exploitative apps are being promoted on a platform as widely used as Instagram is deeply troubling, particularly given that half of teenagers report using the Meta-owned app daily, according to a recent Pew survey. With such technology now widespread and ripe for misuse, tech companies must strengthen their content moderation and prioritize the protection of vulnerable users, especially minors.
