Apple App Store Search Systems Accused of Promoting Deepfake Nude Image Apps
A new investigation conducted by the Tech Transparency Project (TTP) has raised serious concerns about Apple's own search and advertising mechanisms within the App Store. The report alleges that these systems may be actively directing users toward applications that generate deepfake nude images of women, often through promoted results and autocomplete suggestions.
Promoted Listings and Search Suggestions Surface Problematic Apps
According to findings detailed in a report by 9to5Mac, basic searches on the App Store can yield sponsored listings for apps with deepfake capabilities. For instance, a simple query for "deepfake" returned a promoted result for a face-swapping application. Testing revealed that this app could place the face of a clothed woman onto the body of a topless individual after users viewed a brief in-app advertisement.
Search suggestions also appear to facilitate access to such content. Typing "AI NS" prompted the App Store to recommend "image to video ai nsfw," which then surfaced several nudify-style apps among the top results. The pathway to these applications is not concealed; it is built into the standard search flow itself.
Lax Moderation and Developer Responses Highlight Systemic Issues
Another app identified in the report, found through a "face swap" query, allowed users to upload images and swap faces with no visible restrictions or barriers at the point of use, indicating potential gaps in content moderation.
TTP contacted some developers of these apps for comment. In one case, a developer stated they were using Grok for image generation but claimed unawareness that the tool could produce explicit outputs. The developer promised to tighten moderation settings in response.
Widespread Problem Across Apple and Google Platforms
Apple declined to comment on the investigation's findings. However, following the report's publication, most of the apps flagged by TTP were removed from the App Store. Despite this, the broader issue persists, as similar patterns were observed on the Google Play Store.
The report adds that both Apple and Google continue to fail in preventing nudify apps from appearing in their app stores. Alarmingly, some of these applications are listed with age ratings suggesting suitability for minors. TTP found that nearly 40% of the top 10 apps returned for searches like "nudify," "undress," and "deepnude" could "render women nude or scantily clad."
Specific Example Underscores Lack of Safeguards
In a specific test, another App Store search for the term "face swap" yielded an advertisement for an app called AI Face Swap. This application offers preset face swap templates and permits users to swap faces on images they upload themselves. TTP uploaded a photo of a woman in a blue sweater standing in a living room and an image of a topless woman, and the app swapped their faces with no restrictions, as noted in the report.
TTP's findings indicate that despite periodic removals of such apps, app store safeguards are still insufficient, allowing these applications to surface through standard search and discovery features. This highlights an ongoing challenge for major tech platforms in policing harmful content effectively.



