Apple and Google app stores actively steer users to “nudify” apps that strip clothing from women’s images using AI, fueling a surge in non-consensual deepfakes amid government inaction on tech accountability.[1]
Story Snapshot
- Tech Transparency Project reveals Apple and Google promote nudify apps through search results and ads, with roughly 40% of top search results enabling deepfake nudes.[1]
- Minors in Spain, New Jersey, and Ohio used these apps to create and share AI nudes of female classmates, leading to arrests and extortion.[2]
- An arXiv study of 20 nudify web apps found that 95% target women and half enable sexual poses, with weak or absent consent checks.
- Italian PM Giorgia Meloni sued over deepfake nudes, highlighting 90-95% of such content victimizes women without recourse.[1]
- Platforms suspend some apps post-exposure, but easy access persists via Telegram and web, evading store controls.[1][2]
App Stores Promote Nudify Tools
Tech Transparency Project researchers searched the Apple App Store and Google Play Store using terms like “nudify,” “undress,” and “deepnude.” Roughly 40% of the top 10 results in both stores were apps that rendered AI-generated photos of women nude or scantily clad.[1] Apple ran ads for these apps as top results in three searches, despite policies against adult-themed promotions. Google featured a carousel of explicit app ads. Autocomplete suggestions further directed users to additional nudify options.[1]
The first “deepfake” search result in Apple’s store linked to an app that swapped clothed images of women for nude versions. Google suspended some identified apps after inquiry, but the report shows both stores actively boost these apps’ visibility through algorithms and paid placements.[1] The promotion persists despite similar search results being exposed in 2024.[1]
Real-World Harm Targets Minors and Women
Spanish police documented male minors using nudify apps to generate non-consensual nudes from schoolgirls’ social media photos, which were then shared in WhatsApp and Telegram groups. Perpetrators extorted at least one victim for real images or money via Instagram.[2] Parent Jose Ramon Paredes Parra called smartphones “weapons” after his 14-year-old daughter was victimized.
In the U.S., a New Jersey 17-year-old was arrested for creating and sharing AI nudes of classmates, charged with harassment and child sexual exploitation.[1] Two Ohio teens at Mason High School were charged with distributing AI images of female peers via Discord, alongside child pornography possession. Forensic analysis of one device revealed violent fantasies.[1]
An arXiv study examined 20 AI nudification web apps: 19 explicitly targeted women for undressing, half allowed sexual positioning, and only half referenced consent at all. Six sites skipped age verification, and most monetized through credits or cryptocurrency. Italian Prime Minister Giorgia Meloni sued the creators of her deepfake nudes for €100,000, noting that 90-95% of deepfakes are non-consensual pornography of women.[1]
Persistent Access and Enforcement Gaps
Developers claim the harms arise from workarounds that combine multiple apps, not any single tool, as in the Spain case.[2] Tech Transparency Project tests used AI-generated images rather than photos of real victims, limiting direct proof of the apps’ behavior on authentic photos.[1] The school incidents occurred off-campus, with no proven tie to app store promotions.[1]
Minnesota became the first state to pass a law banning nudification apps that make it easy to “undress” or sexualize images of real people. Minnesota’s attorney general could impose fines up to $500,000 per fake AI nude flagged.
— Alex Nguyen (@AlexNguyen65) May 3, 2026
Despite the reports, app stores have shown no broad response; nudify tools thrive on Telegram and the open web, sidestepping store controls.[1][2] Victims suffer psychological trauma, yet regulatory action lags: more than 30 U.S. school cases in 20 months remain underaddressed.[1] This echoes a familiar pattern in which platforms delay enforcement on image-based abuse until public pressure mounts, leaving families exposed while tech giants prioritize revenue.[1]
Conservatives and liberals alike share frustration with unaccountable Big Tech elites, mirroring frustration with federal government failures. These apps erode privacy and dignity, departing from founding principles that protect individual liberty from exploitation. Parents across the political spectrum demand action as smartphones are weaponized against children.[2]
Sources:
[1] Web – TTP – Apple and Google Are Steering Users to Nudify Apps
[2] Web – Mobile apps fueling AI-generated nudes of young girls: Spanish police