Apple and Google generated millions from apps enabling AI-generated non-consensual nude imagery, violating their own content policies and exposing users to privacy risks.

A new investigation by the Tech Transparency Project (TTP) reveals that Apple and Google have profited from apps enabling the creation of non-consensual nude imagery despite explicit prohibitions in their app store policies. Researchers identified 55 such apps on Google Play and 47 on Apple's App Store, collectively downloaded 705 million times and generating $117 million in revenue.
Policy Violations and Enforcement Failures
Both platforms explicitly ban apps that:
- Create nude or sexually suggestive content without consent (per Apple's App Review Guidelines Section 1.1.4)
- Facilitate harassment (under Google Play's Sensitive Events Policy)
Despite these policies, TTP researchers successfully used free app versions to:
- Generate fully nude images from clothed photos of AI-generated models
- Swap faces onto nude bodies using "face-transfer" features
- Access these tools through apps rated as suitable for children as young as 9
Regulatory and Privacy Implications
These violations raise multiple legal concerns:
1. GDPR/CCPA Compliance Failures
- Apps processing biometric data (facial features) without explicit consent violate Articles 9 and 35 of the EU GDPR
- Failure to prevent collection of minors' data breaches California's Age-Appropriate Design Code
2. National Security Risks
Over 60% of the identified "nudify" apps had ties to Chinese developers, creating risks under:
- China's Data Security Law requiring data sharing with authorities
- Potential exposure of fabricated nude imagery of U.S. citizens to foreign governments
3. Platform Liability
Apple and Google face potential legal exposure, including:
- Questions over whether Section 230 of the Communications Decency Act continues to shield platforms that knowingly host and profit from harmful content
- FTC enforcement for deceptive practices regarding app safety claims
Platform Responses and Ongoing Risks
Following TTP's January 21 findings:
- Apple removed 25 apps but had previously profited from sponsored ads for "nudify" searches
- Google suspended "several" apps, but apps accounting for millions of downloads remain available
This follows a December 2023 TTP report that found both platforms hosting more than 70 apps from OFAC-sanctioned entities, suggesting systemic moderation failures.
User Protection Recommendations
Immediate Actions for Victims:
- File removal requests under GDPR Article 17 (right to erasure) or CCPA Section 1798.105
- Report apps to platform abuse teams
Regulatory Solutions:
- Fines proportional to revenue generated from policy-violating apps
- Mandatory age-verification systems under proposed U.S. Kids Online Safety Act
"When platforms profit from apps that violate their own policies and endanger users, they become complicit in the harm," said TTP Director Katie Paul. "If they won't enforce their rules voluntarily, regulators must intervene."
