Defining Nonconsensual Synthetic Intimate Imagery
Artificial intelligence (AI) has revolutionized the creation of nonconsensual synthetic intimate imagery (NSII) by increasing the scale and speed with which perpetrators can produce explicit images. Using computer vision and generative AI techniques, “nudification” applications can digitally remove clothing from photographs or videos of real people. These tools enable users without technical or even photo-editing skills to generate intimate imagery of an individual without their consent.
NSII encompasses any digitally altered intimate images of a person created without the consent of the image subject.1 NSII falls under the broader category of synthetic media, which refers to any form of media that has been digitally manipulated or created to represent something that does not exist in reality.2 These images or videos are created using technologies from traditional photo editing software to sophisticated AI algorithms. NSII represents a form of image-based sexual abuse involving the nonconsensual creation or distribution of nude or sexual imagery, or the threat to do so, often as a form of control, power, or harassment.3
NSII is produced through several methods: face-swapping technology that superimposes someone’s face onto existing adult content or live sexual videos; digital image manipulation that alters photographs to make clothed individuals appear undressed; or artificial intelligence generation that creates entirely new images depicting people in nude or sexual situations.4 These technologies can be used for creative and positive purposes. For example, educators can use AI image generators to create infographics and other instructional material.5 However, the wide availability of these tools has transformed what was once a technically complex process requiring specialized skills into something virtually anyone with internet access can do, making their use more widespread and the potential for harm significant.
The term “deepfake” is commonly used to describe content digitally altered to make it appear that a person is doing or saying something that they never actually did or said. The history of deepfakes is firmly rooted in the harassment of women: The term itself was coined in 2017 by a Reddit moderator who created a now-removed subreddit for users to exchange nonconsensual, falsified sexual videos of female celebrities.6 Survey data suggested a serious problem as early as 2019: 14.1 percent of 864 respondents in a nonrepresentative survey across the United Kingdom, Australia, and New Zealand reported experiencing the creation, distribution, or threatened distribution of a digitally altered sexualized image of them.7 By 2023, the top 10 explicit deepfake websites attracted over 34 million monthly visitors. An oft-cited 2023 report estimated that 98 percent of deepfake videos online were “pornographic” and that 99 percent of those pornographic deepfake videos targeted women, the vast majority of whom were prominent actresses and musicians.8
An Enabling Ecosystem
Both the creation and distribution of NSII are facilitated by a complex internet infrastructure encompassing payment providers, search engines, app stores, AI model-hosting platforms, and mainstream social media sites. This ecosystem has commercialized the nonconsensual sexualization of women’s bodies, with numerous “nudify” applications operating as profitable businesses that monetize gender-based abuse.9
Historically, combating NSII has been complicated by fragmented approaches across the internet ecosystem, including divergent platform policies.10 While some platforms proactively banned the sharing of sexual deepfakes and digitally altered content, many others were slow to respond, or required victims to self-identify and report the imagery for it to be removed, placing the enforcement burden on those being harmed.11
NSII is, by definition, nonconsensual. While deepfake technology has applications protected by free speech laws in many countries, such as entertainment and parody, the same tools become harmful when used to generate intimate imagery without consent. Because the same underlying technology serves both legitimate and harmful purposes, technology companies that enable deepfake creation face the enforcement challenge of distinguishing between the two.
As governments increasingly enact legislation to address NSII, such as the TAKE IT DOWN Act, enacted in May 2025 in the United States, platforms face mounting legal pressure to remove such content. For example, a provision of the TAKE IT DOWN Act requires removal of nonconsensual intimate imagery (AI-generated or otherwise) within 48 hours of a verified request.12 The distributed nature of the internet creates multiple intervention points where different stakeholders, from AI developers to payment processors, can disrupt the NSII pipeline. Despite this, significant challenges to curbing the creation and spread of NSII remain because of the financial incentives driving this ecosystem.
Creation Infrastructure
AI Developers: AI models power nudification websites. Historically, AI developers have made open-source models that offer the functionality to create deepfake nudes, including popular image generators like Stable Diffusion and Flux. These models require limited technical expertise from users, who can easily deploy them to create AI nudification websites or apps.13 A 2023 Graphika report estimates that the increasing capability and accessibility of open-source AI image diffusion models are the primary driver of growth in NSII services.14
Apps: The proliferation of bad-faith “nudify” applications, which allow users to upload a photo and receive back a “nude” version of the subject, has fueled a market based on exploiting women’s images. These apps typically sell various nudification features with very limited free functionality.15 One popular app has an annual budget of $3.5 million, according to a whistleblower.16 Nudify apps advertise their services as creating fake nonconsensual nude or sexually explicit images of women, in some cases specifically marketing to young men and boys.17
Most of these apps use a machine learning model trained to predict how an image subject would look naked and then alter the image to depict that predicted nude body. Other apps leverage AI face-swapping to morph the subject’s face onto another person’s body. Additional features allow users to place a subject in sexual scenes. One study found that the vast majority of the apps studied (19 out of 20) explicitly specialize in the undressing of women, while only half mention that they expect the user to have the image subject’s consent, and fewer ask for affirmation that consent has been obtained.18
Model and App Hosting Platforms: The proliferation of AI tools capable of creating NSII has created enforcement challenges across two key channels: model-hosting platforms and mobile app stores.
Model-hosting platforms like Civitai, Hugging Face, and GitHub have become primary repositories for AI models designed for nudification and deepfake creation, enabling both at commercial scale.19 One study found a sharp rise in easily accessible deepfake models on model-hosting platforms, particularly on Civitai.20 Over 34,000 deepfake model variants available on popular repositories, many of which indicate an intention or capability to generate NSII, have been downloaded almost 15 million times since 2022.21 Models hosted by these platforms allow users to generate pornographic videos of anyone whose image they possess.22 In some cases, models hosted on these platforms power some of the most prolific NSII creation websites and services.23
Simultaneously, app stores such as Google Play Store and Apple App Store serve as critical hosts for mobile applications that specialize in nonconsensual “undressing” of women. These apps often utilize the underlying models hosted on the platforms, creating an interconnected ecosystem where model repositories provide the technical foundation and app stores provide user-friendly access points. While the exact number of nudify apps remains unclear, research from July 2025 examined 85 nudify websites and found they collectively attracted an average of 18.5 million visitors over six months, with the potential to generate up to $36 million annually.24
Even when models violate terms of service, model-hosting platforms have found it difficult to prevent abuse of these tools.25 After Civitai banned 50,000 models that were being used to generate NSII, users migrated thousands of these models to another popular model-hosting platform as part of a concerted community effort to preserve the models.26 Even when they attempt to remove bad actors, these platforms struggle with enforcement and face diverse technical challenges. For example, for text-to-image generators, platforms can develop safeguards that refuse to generate images based on an inappropriate written prompt—but it is more challenging to build such protections into tools that generate videos based on images.27
Distribution Networks
Social Media Platforms: Online platforms play a pivotal role by hosting bots that allow users to create NSII, directing users to nudify sites via advertisements, and enabling the circulation of NSII. In the context of NSII targeting public officials, perpetrators often leverage mainstream social media platforms to give the explicit content a broader audience. The harms of NSII are magnified when such content spreads widely.28 Though most mainstream social media platforms actively prohibit NSII, enforcement is often insufficient, as demonstrated by a recent case involving Taylor Swift, in which such content was viewed 27 million times in 19 hours.29
The role of social media platforms as a marketing tool for NSII services is growing. NSII providers leverage mainstream platforms to advertise their capabilities or direct users to their own websites via referral link spam. A 2023 report estimated the volume of referral link spam for nudify services increased by more than 2,000 percent on platforms, including Reddit and X, from January to December 2023.30 In June 2025, Meta sued one such company that had advertised on Facebook and Instagram.31
Deepfake Platforms: Following their widespread deplatforming from mainstream social media, portions of the deepfake community migrated to dedicated platforms to continue discussing deepfake technology and sharing their creations. These platforms host forums explicitly devoted to technical assistance, dataset sharing, and the deepfake market.32 A 2024 analysis estimated that 94 percent of NSII material is hosted on sites dedicated to the practice.33 One of the most prominent of these platforms received 17 million visitors a month before it was shut down by its internet service provider.
Bots: Some online communities center on sharing and trading nonconsensual intimate images. One Telegram channel with over 45,000 unique members hosted bots that allow users to submit a photo and receive a nude image back within minutes.34 A 2024 investigation found 50 nudify bots on Telegram that had reached over 4 million monthly users combined.35 Even after the bots are removed, the software that powers them can be found on open-source repositories and torrenting websites.36 These communities also share tips on alternative methods for generating the same material without the bots.37
Supporting Infrastructure
Search Engines and App Stores: Services that support the discovery of deepfake platforms or apps through search play an important role in the visibility of NSII. In 2023, Google was the single largest driver of traffic to deepfake porn websites.38 In recent years, intimate deepfake videos of women could be found at the top of Google search results.39
Internet Service Providers: Internet service providers (ISPs) provide the infrastructure on which nudify apps and deepfake platforms rely. In some cases, ISPs do act to remove websites that violate their terms of service. One service provider withdrew its support for one of the most prominent and mainstream marketplaces for intimate deepfakes after facing mounting scrutiny.40 As an internet governance mechanism, however, ISP removals are a particularly blunt instrument that can be abused to wipe entire websites alleged to host obscene content off the internet.41
Online Payment Providers: Payment providers play a pivotal role in the NSII ecosystem, as most nudification applications or platforms operate as commercial enterprises. Many of these applications rely on third-party payment processors or cryptocurrency transactions.42 One study found cryptocurrency to be the most popular avenue for payment for nudify apps. Nevertheless, many platforms still attempt to use mainstream payment providers like PayPal, despite policies that typically ban processing payments for NSII services.43 The potential power of payment processors to influence platform behavior became evident when they threatened to stop processing payments from Civitai unless the platform updated its rules to prevent hosting models that could be abused to create NSII.44
Citations
1. Rebecca Umbach et al., “Non-Consensual Synthetic Intimate Imagery: Prevalence, Attitudes, and Knowledge in 10 Countries,” CHI ’24: Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems (May 11, 2024), source.
2. Suzie Dunn, “Legal Definitions of Intimate Images in the Age of Sexual Deepfakes and Generative AI,” McGill Law Journal 69, no. 4 (October 2024), source.
3. Nicola Henry et al., Image-Based Sexual Abuse: A Study on the Causes and Consequences of Nonconsensual Nude or Sexual Imagery (Routledge, 2020), 4–5.
4. Dunn, “Legal Definitions of Intimate Images in the Age of Sexual Deepfakes and Generative AI,” source.
5. Jillian Rubman, “Supporting Learning with AI-Generated Images: A Research-Backed Guide,” MIT Sloan Teaching & Learning Technologies, March 6, 2024, source.
6. Samantha Cole, “AI-Assisted Fake Porn Is Here and We’re All Fucked,” Vice, December 11, 2017, source.
7. Asher Flynn et al., “Deepfakes and Digitally Altered Imagery Abuse: A Cross-Country Exploration of an Emerging Form of Image-Based Sexual Abuse,” British Journal of Criminology 62, no. 6 (2022): 1341–58, source.
8. Security Hero, 2023 State of Deepfakes: Realities, Threats, and Impact (Security Hero, 2023), source.
9. Alexios Mantzarlis and Santiago Lakatos, “AI Nudifiers Continue to Reach Millions and Make Millions,” INDiCATOR, July 13, 2025, source.
10. Danielle Keats Citron, Hate Crimes in Cyberspace (Harvard University Press, 2016); Tarleton Gillespie, Custodians of the Internet: Platforms, Content Moderation, and the Hidden Decisions That Shape Social Media (Yale University Press, 2018); Asher Flynn, Jonathan Clough, and Talani Cooke, “Disrupting and Preventing Deepfake Abuse: Exploring Criminal Law Responses to AI-Facilitated Abuse,” in The Palgrave Handbook of Gendered Violence and Technology, ed. Anastasia Powell, Asher Flynn, and Lisa Sugiura (Palgrave Macmillan, 2021), 583–603.
11. “Nonconsensual Content Policy,” Pornhub, September 2024, source; “New Decision Addresses Meta’s Rules on Nonconsensual Deepfake Intimate Images,” Oversight Board, July 25, 2024, source; “Never Post Intimate or Sexually Explicit Media of Someone Without Their Consent,” Reddit, July 5, 2023, source; Noelle Martin, “Image-Based Sexual Abuse and Deepfakes: A Survivor Turned Activist’s Perspective,” in The Palgrave Handbook of Gendered Violence and Technology; Asher Flynn et al., “Deepfakes and Digitally Altered Imagery Abuse: A Cross-Country Exploration of an Emerging Form of Image-Based Sexual Abuse,” British Journal of Criminology 62, no. 6 (November 2022): 1341–58, source.
12. S. 146 – TAKE IT DOWN Act (2025), source; “CCRI Statement on the Passage of the TAKE IT DOWN Act (S. 146),” Cyber Civil Rights Initiative, April 28, 2025, source.
13. Cassidy Gibson et al., “Analyzing the AI Nudification Application Ecosystem,” arXiv.org, November 14, 2024, source.
14. Santiago Lakatos, A Revealing Picture (Graphika, December 8, 2023), source.
15. Gibson et al., “Analyzing the AI Nudification Application Ecosystem,” source.
16. Ashley Belanger, “Nudify App’s Plan to Dominate Deepfake Porn Hinges on Reddit, 4chan, and Telegram, Docs Show,” Ars Technica, July 1, 2025, source.
17. Emmet Lyons and Leigh Kiniry, “Meta’s Platforms Showed Hundreds of ‘Nudify’ Deepfake Ads, CBS News Investigation Finds,” CBS News, June 6, 2025, source; Belanger, “Nudify App’s Plan to Dominate Deepfake Porn,” source.
18. Gibson et al., “Analyzing the AI Nudification Application Ecosystem,” source.
19. Gibson et al., “Analyzing the AI Nudification Application Ecosystem,” source; Rachel Winter and Anastasia Salter, “DeepFakes: Uncovering Hardcore Open Source on GitHub,” Porn Studies 7, no. 4 (2020): 382–97, source.
20. Will Hawkins, Brent Mittelstadt, and Chris Russell, “Deepfakes on Demand: The Rise of Accessible Nonconsensual Deepfake Image Generators,” FAccT ’25: Proceedings of the 2025 ACM Conference on Fairness, Accountability, and Transparency (June 23, 2025), source.
21. Hawkins, Mittelstadt, and Russell, “Deepfakes on Demand,” source.
22. Emanuel Maiberg, “‘Configuration Issue’ Allows Civitai Users to AI Generate Nonconsensual Porn Videos,” 404 Media, May 20, 2025, source.
23. “Re: GitHub Hosting Source Code for Sexually Exploitative Technology, Facilitating Image-Based Sexual Abuse (IBSA), Sexual Exploitation, and Promoting the Dangerous Use of Generative-AI,” National Center on Sexual Exploitation, April 28, 2023, source.
24. Mantzarlis and Lakatos, “AI Nudifiers Continue to Reach Millions and Make Millions,” source.
25. Maiberg, “‘Configuration Issue’ Allows Civitai Users to AI Generate Nonconsensual Porn Videos,” source; Hawkins, Mittelstadt, and Russell, “Deepfakes on Demand,” source.
26. Emanuel Maiberg, “Hugging Face Is Hosting 5,000 Nonconsensual AI Models of Real People,” 404 Media, July 15, 2025, source.
27. Maiberg, “‘Configuration Issue’ Allows Civitai Users to AI Generate Nonconsensual Porn Videos,” source.
28. Beatriz Kira, “When Non-Consensual Intimate Deepfakes Go Viral: The Insufficiency of the U.K. Online Safety Act,” Computer Law & Security Review 54 (September 2024), source.
29. Kat Tenbarge, “Nude Deepfakes Images of Taylor Swift Went Viral on X, Evading Moderation and Sparking Outrage,” NBC News, January 25, 2024, source.
30. Lakatos, A Revealing Picture, source.
31. “Taking Action Against ‘Nudify’ Apps,” Meta, June 12, 2025, source; Kolina Koltai and Melissa Zhu, “Meta’s Suit Against Hong Kong Firm Was Just the Beginning–More Companies Linked to CrushAI ‘Nudify’ Apps,” Bellingcat, June 18, 2025, source.
32. Brian Timmerman et al., “Studying the Online Deepfake Community,” Online Trust and Safety 2, no. 1 (2023), source.
33. “Deepfake Abuse: Landscape Analysis (The Exponential Rise of Deepfake Abuse in 2023–2024),” #MyImageMyChoice, 2024, source.
34. Karen Hao, “A Deepfake Bot Is Being Used to ‘Undress’ Underage Girls,” MIT Technology Review, October 20, 2020, source.
35. Sammi Carmela, “‘Nudify’ Deepfake Bots on Telegram Are up to 4 Million Monthly Users,” Vice, October 16, 2024, source.
36. James Vincent, “Deepfake Bots on Telegram Make the Work of Creating Fake Nudes Dangerously Easy,” The Verge, October 20, 2020, source.
37. Maiberg, “‘Configuration Issue’ Allows Civitai Users to AI Generate Nonconsensual Porn Videos,” source.
38. Cecilia D’Anastasio and Davey Alba, “Google and Microsoft Are Supercharging AI Deepfake Porn,” Bloomberg News, August 24, 2023, source.
39. “Deepfake Abuse: Landscape Analysis,” source.
40. Layla Ferris, “AI-Generated Porn Site Mr. Deepfakes Shuts Down After Service Provider Pulls Support,” CBS News, May 5, 2025, source.
41. Emily B. Laidlaw, “Mechanisms of Information Control: ISPs,” in Regulating Speech in Cyberspace: Gatekeepers, Human Rights, and Corporate Responsibility (Cambridge University Press, 2015).
42. Kolina Koltai, “AnyDream: Secretive AI Platform Broke Stripe Rules to Rake in Money from Nonconsensual Pornographic Deepfakes,” Bellingcat, November 27, 2023, source; Kolina Koltai, “Behind a Secretive Global Network of Non-Consensual Deepfake Pornography,” Bellingcat, February 23, 2024, source.
43. Gibson et al., “Analyzing the AI Nudification Application Ecosystem,” source.
44. Maiberg, “‘Configuration Issue’ Allows Civitai Users to AI Generate Nonconsensual Porn Videos,” source.