A new report claims Apple's search suggestions and App Store ads are steering users toward apps that create deepfake nude images, raising serious concerns about content moderation and platform responsibility.
A new investigation by the Tech Transparency Project (TTP) has revealed that Apple's App Store search system and advertising platform may be inadvertently helping users discover and access "nudify" apps—applications designed to create deepfake nude images of women.
The Problem Persists
The report, which follows up on TTP's January investigation identifying dozens of nudify apps on the App Store, found that both Apple's App Store and Google's Play Store continue to surface these applications through search results and sponsored listings. According to the investigation, nearly 40% of the apps appearing in the top 10 results for searches such as "nudify," "undress," and "deepnude" were capable of generating nude or scantily clad images of women.
How the System Works
Perhaps most concerning is how Apple's own search infrastructure appears to facilitate access to these apps. The investigation found that typing "AI NS," a partial query a user might complete as "AI NSFW," prompted the App Store to suggest "image to video ai nsfw," a search term that returned several nudify apps among its top ten results.
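To illustrate the kind of mechanism the report describes, here is a minimal sketch in Python of a popularity-ranked autocomplete. All data, names, and the denylist are invented for the example; this is not Apple's actual search-suggestion code. The point is that a suggester ranked purely on query popularity will surface whatever related searches users run most often, unless a safety filter explicitly suppresses them.

```python
# Minimal sketch of a popularity-ranked prefix autocomplete.
# All data and logic are hypothetical; this is NOT Apple's system.

# Invented query log mapping past searches to popularity scores.
QUERY_LOG = {
    "image to video ai nsfw": 9500,
    "ai nsfw generator": 7200,
    "ai note taker": 6100,
    "ai news reader": 4800,
}

# The kind of safety denylist the report implies is missing or
# incomplete (terms invented for this example).
BLOCKED_TERMS = {"nsfw", "nudify", "undress", "deepnude"}

def suggest(prefix: str, limit: int = 3, filtered: bool = True) -> list[str]:
    """Return the most popular logged queries containing the prefix."""
    prefix = prefix.lower()
    candidates = [(score, q) for q, score in QUERY_LOG.items() if prefix in q]
    if filtered:
        # Drop any suggestion containing a blocked term.
        candidates = [(s, q) for s, q in candidates
                      if not BLOCKED_TERMS & set(q.split())]
    return [q for _, q in sorted(candidates, reverse=True)[:limit]]

# Unfiltered, the partial query "ai ns" completes straight to the
# harmful search; with the denylist applied, it is suppressed.
print(suggest("ai ns", filtered=False))  # ['image to video ai nsfw', 'ai nsfw generator']
print(suggest("ai ns"))                  # []
```

A ranking layer like this has no notion of harm; it only mirrors aggregate user behavior, which is how a missing or stale denylist can quietly turn popular abuse into a recommended search.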
Sponsored results also played a role in surfacing these applications. For instance, a search for "deepfake" returned an ad for FaceSwap Video by DuoFace as the first result. The app allows users to swap faces from still images onto videos. When TTP tested the app by uploading an image of a clothed woman and a video of a topless woman, the app generated a video showing the clothed woman's face on the nude woman's body after displaying a short advertisement.
Another search for "face swap" yielded an ad for AI Face Swap, which offers preset face swap templates and allows users to swap faces on uploaded images. Testing revealed that the app swapped faces without restrictions when provided with images of a clothed woman and a topless woman.
Developer Responses
In an interesting twist, TTP contacted several of the app developers about its findings. At least one developer confirmed they were using Grok for image generation but claimed they "had no idea it was capable of producing such extreme content." The developer pledged to tighten moderation settings for image generation in response.
Apple's Response
Apple declined to comment on TTP's initial request for information, but responded to the published report by removing most of the apps TTP identified. This reactive approach raises questions about Apple's content moderation processes and whether the company's systems adequately screen apps before they become available for download.
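As an illustration of what pre-publication screening can look like, the sketch below applies a keyword check to developer-supplied metadata. The patterns, field names, and routing logic are invented for the example and do not represent Apple's actual App Review tooling; they simply show how a high-risk listing could be flagged for human review before going live.

```python
import re

# Hypothetical pre-publication metadata screen; invented for
# illustration, not Apple's actual App Review pipeline.
FLAGGED_PATTERNS = [
    re.compile(p, re.IGNORECASE)
    for p in (r"\bnudify\b", r"\bundress\b", r"\bdeep\s*nude\b", r"\bnsfw\b")
]

def needs_human_review(metadata: dict[str, str]) -> bool:
    """Flag a submission for manual review if its name, description,
    or keywords match any high-risk pattern."""
    text = " ".join(metadata.get(field, "")
                    for field in ("name", "description", "keywords"))
    return any(p.search(text) for p in FLAGGED_PATTERNS)

# An app marketed with high-risk language would be routed to a human
# reviewer rather than published outright (example data is made up).
submission = {
    "name": "Face Magic",
    "description": "Undress photos with one tap using AI.",
    "keywords": "face swap, ai photo editor",
}
assert needs_human_review(submission)
```

Keyword screens like this are crude and easy to evade, of course; the point is only that some automated gate can run before an app ships, not just after an outside report surfaces.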
The situation highlights the ongoing challenges that major app platforms face in moderating content, particularly as artificial intelligence tools become more sophisticated and accessible to developers. While Apple has long positioned itself as a company that prioritizes user privacy and safety, this investigation suggests that its automated systems may be inadvertently undermining those values by connecting users with harmful applications.
The presence of these apps, some of which carry age ratings suggesting they are suitable for minors according to the report, represents a significant content moderation failure that could have serious real-world consequences for the women whose images are being manipulated without their consent.

For the full details of the Tech Transparency Project's investigation, you can read their complete report here.
