The Tech Transparency Project (TTP) has conducted a follow-up investigation into the availability of “nudify” apps on the App Store, finding that Apple’s and Google’s search and advertising systems may be inadvertently steering users toward these apps. This article summarizes the key findings of TTP’s latest report.
### Nudify Apps Persist on App Stores
The TTP report highlights that the App Store and Google Play Store still surface apps capable of creating deepfake nude images of women, aided by sponsored search results and autocomplete suggestions. Alarmingly, around 40% of the top results for search terms such as “nudify,” “undress,” and “deepnude” were apps that can depict women nude or in revealing clothing.
The report also found that some of these apps are marketed as suitable for minors, raising serious concerns about their availability. Sponsored results for these apps were observed as well, pointing to a troubling trend in app advertising.
### Specific Findings
One notable example from the report was an App Store search for “deepfake,” which displayed an ad for FaceSwap Video by DuoFace, an app that lets users swap faces from still images onto videos. TTP tested the app by uploading a photo of a clothed woman along with a video of a topless woman; the app generated a video placing the clothed woman’s face onto the nude body.
Another search, for “face swap,” returned an ad for an app called AI Face Swap, which likewise let users swap faces without restriction. TTP demonstrated this by uploading photos of a woman in a blue sweater and a topless woman and successfully swapping their faces.
### Developer and App Store Responses
As part of its investigation, TTP contacted several app developers. One admitted to using Grok for image generation but claimed to be unaware that it could produce extreme content, and pledged to tighten its image-generation moderation settings in response to the findings.
TTP also observed that typing “AI NS” into the App Store search box produced autocomplete suggestions for “AI NSFW,” which surfaced several nudify apps among the top results. Apple did not respond to TTP’s inquiries but removed most of the identified apps after the report was published.
### Conclusion
TTP’s findings point to persistent problems with nudify apps on major app stores and the role that search and advertising systems play in promoting them. The report raises important questions about content moderation and tech companies’ responsibility to protect users, especially minors, from harmful apps. The full report is available on TTP’s website.
