Apple and Google App Stores Are Actively Promoting Nudify Apps That Generated $122M in Revenue

A new analysis by the Tech Transparency Project (TTP) has found that Apple's App Store and Google Play are not just hosting apps that generate non-consensual nude images — they're actively promoting them through search results and paid advertising. The apps collectively generated over $122 million in revenue, with many rated suitable for minors.
What the Analysis Found
TTP researchers discovered dozens of "nudify" apps — tools that use AI to digitally remove clothing from photos of real people — readily available on both major app stores. More alarmingly, the stores' own search algorithms and ad systems were surfacing these apps prominently, effectively endorsing them to users looking for photo editing tools.
Several apps carried age ratings of 4+ or 12+, meaning children could download them without parental restriction. The $122 million revenue figure suggests these aren't fringe tools but mainstream products with substantial user bases.
How the Apps Circumvent Policies
Both Apple and Google have policies against apps that generate non-consensual intimate imagery (NCII). Yet the apps persist through a combination of deliberately vague descriptions, category mislabeling, and rapid resubmission after takedowns. App stores rely heavily on automated review processes, and bad actors have learned to game them.
TTP's report highlights that the problem isn't merely passive hosting: the algorithmic promotion of these apps suggests the stores' recommendation engines don't treat sexual content or NCII risk as negative signals when ranking search results.
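What a "negative signal" means in ranking terms can be made concrete with a small sketch. The Python below is purely illustrative and rests entirely on assumptions: AppListing, rank_score, the 0.6/0.4 weights, and the ncii_risk field are all invented here, and neither store has published its actual ranking logic. It simply shows how a risk penalty, if one existed, would demote a high-risk app that otherwise scores well.

```python
# Hypothetical illustration only: this is neither Apple's nor Google's
# real ranking system. All fields, weights, and scores are invented.
from dataclasses import dataclass


@dataclass
class AppListing:
    name: str
    relevance: float   # how well the listing matches the search query (0..1)
    engagement: float  # installs/ratings popularity signal (0..1)
    ncii_risk: float   # estimated risk the app produces NCII (0..1)


def rank_score(app: AppListing, risk_weight: float = 0.0) -> float:
    """Combine positive signals, then subtract a risk penalty.

    With risk_weight = 0 (the behavior TTP's findings point to),
    NCII risk has no effect and the app ranks on popularity alone.
    """
    base = 0.6 * app.relevance + 0.4 * app.engagement
    return base - risk_weight * app.ncii_risk


apps = [
    AppListing("OrdinaryPhotoEditor", relevance=0.8, engagement=0.5, ncii_risk=0.0),
    AppListing("NudifyApp", relevance=0.9, engagement=0.7, ncii_risk=0.95),
]

# No risk penalty: the nudify app takes the top slot (0.82 vs 0.68).
print(max(apps, key=rank_score).name)  # NudifyApp

# With a penalty, the same app is demoted (0.35 vs 0.68).
print(max(apps, key=lambda a: rank_score(a, risk_weight=0.5)).name)  # OrdinaryPhotoEditor
```

The toy example makes TTP's point: with the penalty weight at zero, relevance and engagement alone can push a nudify app above a legitimate editor in search results.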
The Human Cost
The consequences are well-documented. Schools across the US, UK, and Australia have reported incidents where students used AI nudify tools to create fake explicit images of classmates. Victims — predominantly teenage girls — face harassment, psychological trauma, and in some cases, career and reputational damage that follows them into adulthood.
Law enforcement has struggled to keep pace. Most jurisdictions lack specific laws criminalizing AI-generated NCII, and cross-border enforcement is nearly impossible when app developers operate from lenient regulatory environments.
Pressure on Apple and Google to Act
The TTP report adds fuel to legislative efforts in the US Congress and the EU to mandate stricter app store gatekeeping. Lawmakers have pointed out that both Apple and Google take a 15-30% cut of app revenue, meaning they benefit financially from tools explicitly designed to harm people. If all of that revenue flowed through the stores' billing systems, that cut of $122 million works out to roughly $18 million to $37 million for the platforms themselves.
Apple and Google have been contacted for comment. Previous reports have led to selective app removals, but critics argue that whack-a-mole takedowns aren't a solution; what's needed are structural changes to how apps are reviewed, categorized, and promoted.
The Bottom Line
The $122 million in revenue from nudify apps isn't just a policy failure; it's a market signal that app stores have created an environment where harm is profitable. Until Apple and Google treat algorithmic promotion of NCII tools as a liability rather than a revenue source, the problem will continue to scale.