
Defining Nonconsensual Synthetic Intimate Imagery

Artificial intelligence (AI) has revolutionized the creation of nonconsensual synthetic intimate imagery (NSII) by increasing the scale and speed with which perpetrators can create explicit images. Using computer vision and generative AI techniques, "nudification" applications can digitally remove clothing from photographs or videos of real people. These tools enable users without technical or even photo editing skills to generate intimate imagery of an individual without their consent.

NSII encompasses any digitally altered intimate images of a person created without the consent of the image subject.1 NSII falls under the broader category of synthetic media, which refers to any form of media that has been digitally manipulated or created to represent something that does not exist in reality.2 These images or videos are created using technologies ranging from traditional photo editing software to sophisticated AI algorithms. NSII represents a form of image-based sexual abuse involving the nonconsensual creation or distribution of nude or sexual imagery, or the threat to do so, often as a form of control, power, or harassment.3

NSII is produced through several methods: face-swapping technology that superimposes someone's face onto existing adult content or live sexual videos; digital image manipulation that alters photographs to make clothed individuals appear undressed; or artificial intelligence generation that creates entirely new images depicting people in nude or sexual situations.4 These technologies can be used for creative and positive purposes. For example, educators can use AI image generators to create infographics and other instructional material.5 However, the availability of these tools makes their use more widespread and the potential for harm significant, transforming what was once a technically complex process requiring specialized skills into something virtually anyone with internet access can do.

The term "deepfake" is commonly used to describe digitally altered content designed to make it appear that a person is doing or saying something that they never actually did or said. The history of deepfakes is firmly rooted in the harassment of women: The term itself was coined in 2017 by a Reddit moderator who created a now-removed subreddit for users to exchange nonconsensual, falsified sexual videos of female celebrities.6 Survey data suggested a serious problem back in 2019: 14.1 percent of 864 total respondents in a nonrepresentative survey across the United Kingdom, Australia, and New Zealand reported experiencing the creation, distribution, or threat of distribution of a digitally altered sexualized image of them.7 By 2023, the top 10 explicit deepfake websites attracted over 34 million monthly visitors. An oft-cited 2023 report estimated that 98 percent of deepfake videos online were "pornographic" and that 99 percent of those pornographic deepfake videos targeted women, the vast majority of whom were prominent actresses and musicians.8

An Enabling Ecosystem

Both the creation and distribution of NSII are facilitated by a complex internet infrastructure encompassing payment providers, search engines, app stores, AI model-hosting platforms, and mainstream social media sites. This ecosystem has commercialized the nonconsensual sexualization of women's bodies, with numerous "nudify" applications operating as profitable businesses that monetize gender-based abuse.9

Historically, combating NSII has been complicated by fragmented approaches across the internet ecosystem, including divergent platform policies.10 While some platforms proactively banned the sharing of sexual deepfakes and digitally altered content, many others were slow to respond, or required victims to self-identify and report the imagery for it to be removed, placing the enforcement burden on those being harmed.11

NSII is, by definition, nonconsensual. While deepfake technology has applications protected by free speech laws in many countries, such as entertainment and parody, the same tools become harmful when used to generate intimate imagery without consent. This dual-use nature creates enforcement challenges for technology companies that enable deepfake creation, which must distinguish between legitimate and harmful uses of the same underlying tools.

As governments increasingly enact legislation to address NSII, such as the TAKE IT DOWN Act, enacted in May 2025 in the United States, platforms face growing legal pressure to remove such content. For example, a provision of the TAKE IT DOWN Act requires removal of nonconsensual intimate imagery (AI-generated or otherwise) within 48 hours of a verified request.12 The distributed nature of the internet creates multiple intervention points where different stakeholders, from AI developers to payment processors, can disrupt the NSII pipeline. Despite this, significant challenges to curbing the creation and spread of NSII remain due to the financial incentives driving this ecosystem.

Creation Infrastructure

AI Developers: AI models power nudification websites. Historically, AI developers have released open-source models, including popular image generators like Stable Diffusion and Flux, that offer the functionality to create deepfake nudes. These models require limited technical expertise from users, who can easily deploy them to create AI nudification websites or apps.13 A 2023 Graphika report estimates that the increasing capability and accessibility of open-source AI image diffusion models are the primary driver of growth in NSII services.14

Apps: The proliferation of bad-faith "nudify" applications, which allow users to upload a photo and receive back a "nude" version of the subject, has fueled a market based on exploiting women's images. These apps typically sell various nudification features with very limited free functionality.15 One popular app has an annual budget of $3.5 million, according to a whistleblower.16 Nudify apps advertise their services as creating fake nonconsensual nude or sexually explicit images of women, in some cases specifically marketing to young men and boys.17

Most of these apps use a machine learning model trained to predict how an image subject would look naked and then alter the image to depict that predicted nude body. Other apps leverage AI face-swapping to morph the subject's face onto another person's body. Some apps offer additional features that place the subject in sexual scenes. One study found that the vast majority of the apps studied (19 out of 20) explicitly specialize in the undressing of women, while only half mention that they expect the user to have the image subject's consent, and fewer ask for affirmation that consent has been obtained.18

Model and App Hosting Platforms: The proliferation of AI tools capable of creating NSII has created enforcement challenges across two key channels: model-hosting platforms and mobile app stores.

Model-hosting platforms like Civitai, Hugging Face, and GitHub have become primary repositories for AI models designed for nudification and deepfake creation, enabling both at commercial scale.19 One study found a sharp rise in easily accessible deepfake models on model-hosting platforms, particularly Civitai.20 Over 34,000 deepfake model variants, many of which indicate an intention or capability to generate NSII, were available on popular repositories and have been downloaded almost 15 million times since 2022.21 Models hosted by these platforms allow users to generate pornographic videos of anyone they have an image of.22 In some cases, models hosted on these platforms power some of the most prolific NSII creation websites and services.23

Simultaneously, app stores such as the Google Play Store and Apple App Store serve as critical hosts for mobile applications that specialize in the nonconsensual "undressing" of women. These apps often utilize the underlying models hosted on the platforms, creating an interconnected ecosystem where model repositories provide the technical foundation and app stores provide user-friendly access points. While the exact number of nudify apps remains unclear, research from July 2025 examined 85 nudification services and found they collectively attracted an average of 18.5 million visitors over six months, with the potential to generate up to $36 million annually.24

Even when models violate terms of service, model-hosting platforms have found it difficult to prevent abuse of these tools.25 After Civitai banned 50,000 models that were being used to generate NSII, users migrated thousands of these models to another popular model-hosting platform as part of a concerted community effort to preserve them.26 Even when they attempt to remove bad actors, these platforms struggle with enforcement and face diverse technical challenges. For example, for text-to-image generators, platforms can develop safeguards that refuse to generate images based on an inappropriate written prompt, but it is more challenging to build such protections into tools that generate videos based on images.27

Distribution Networks

Social Media Platforms: Online platforms play a pivotal role in the NSII ecosystem by allowing users to create NSII through bots, directing users to nudify sites via advertisements, and enabling the circulation of NSII. In the context of NSII targeting public officials, perpetrators often leverage mainstream social media platforms to give the explicit content a broader audience. The harms of NSII are magnified when such content spreads widely.28 Though most mainstream social media platforms actively prohibit NSII, enforcement is often insufficient, as demonstrated by a recent case involving Taylor Swift, in which such content was viewed 27 million times in 19 hours.29

The role of social media platforms as a marketing tool for NSII services is growing. NSII providers leverage mainstream platforms to advertise their capabilities or direct users to their own websites via referral link spam. A 2023 report estimated that the volume of referral link spam for nudify services on platforms including Reddit and X increased by more than 2,000 percent from January to December 2023.30 In June 2025, Meta sued one such company that had advertised on Facebook and Instagram.31

Deepfake Platforms: Following their widespread deplatforming from mainstream social media, portions of the deepfake community migrated to dedicated platforms to continue discussing deepfake technology and sharing their creations. These platforms host forums explicitly devoted to technical assistance, dataset sharing, and the deepfake market.32 A 2024 analysis estimated that 94 percent of NSII material is hosted on sites dedicated to the practice.33 One of the most prominent of these platforms received 17 million visitors a month before it was shut down by its internet service provider.

Bots: Some online communities center on sharing and trading nonconsensual intimate images. One Telegram channel with over 45,000 unique members hosted bots that allow users to submit a photo and receive a nude version back within minutes.34 A 2024 investigation found 50 nudify bots on Telegram that had reached over 4 million monthly users combined.35 Even after the bots are removed, the software that powers them can be found on open-source repositories and torrenting websites.36 These communities also share tips and tricks for generating the same type of content without the bots.37

Supporting Infrastructure

Search Engines and App Stores: Platforms that support discovery of deepfake platforms or apps through search play an important role in the visibility of NSII. In 2023, Google was the single largest driver of traffic to deepfake porn websites.38 In recent years, intimate deepfake videos of women could be found at the top of Google search results.39

Internet Service Providers: Internet service providers (ISPs) offer the infrastructure on which nudify apps or deepfake platforms rely. In some cases, ISPs do act to remove websites that violate their terms of service. A service provider withdrew its support for one of the most prominent and mainstream marketplaces for intimate deepfakes after facing mounting scrutiny.40 As an internet governance mechanism, however, ISP removals are a particularly blunt instrument that can be abused to wipe entire websites accused of hosting obscene content off the internet.41

Online Payment Providers: Payment providers are central to the NSII ecosystem, as most nudification applications or platforms operate as commercial enterprises. Many of these applications rely on third-party payment processors or cryptocurrency transactions.42 One study found cryptocurrency to be the most popular payment avenue for nudify apps. Nevertheless, many platforms still attempt to use mainstream payment providers like PayPal, despite policies that typically ban processing payments for NSII services.43 The potential power of payment processors to influence platform behavior became evident when they threatened to stop processing payments for Civitai unless the platform updated its rules to prevent hosting models that could be abused to create NSII.44

Citations
  1. Rebecca Umbach et al., "Non-Consensual Synthetic Intimate Imagery: Prevalence, Attitudes, and Knowledge in 10 Countries," CHI '24: Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems (May 11, 2024).
  2. Suzie Dunn, "Legal Definitions of Intimate Images in the Age of Sexual Deepfakes and Generative AI," McGill Law Journal 69, no. 4 (October 2024).
  3. Nicola Henry et al., Image-Based Sexual Abuse: A Study on the Causes and Consequences of Nonconsensual Nude or Sexual Imagery (Routledge, 2020), 4–5.
  4. Dunn, "Legal Definitions of Intimate Images in the Age of Sexual Deepfakes and Generative AI."
  5. Jillian Rubman, "Supporting Learning with AI-Generated Images: A Research-Backed Guide," MIT Sloan Teaching & Learning Technologies, March 6, 2024.
  6. Samantha Cole, "AI-Assisted Fake Porn Is Here and We're All Fucked," Vice, December 11, 2017.
  7. Asher Flynn et al., "Deepfakes and Digitally Altered Imagery Abuse: A Cross-Country Exploration of an Emerging Form of Image-Based Sexual Abuse," British Journal of Criminology 62, no. 6 (2022): 1341–58.
  8. Security Hero, 2023 State of Deepfakes: Realities, Threats, and Impact (Security Hero, 2023).
  9. Alexios Mantzarlis and Santiago Lakatos, "AI Nudifiers Continue to Reach Millions and Make Millions," INDiCATOR, July 13, 2025.
  10. Danielle Keats Citron, Hate Crimes in Cyberspace (Harvard University Press, 2016); Tarleton Gillespie, Custodians of the Internet: Platforms, Content Moderation, and the Hidden Decisions That Shape Social Media (Yale University Press, 2018); Asher Flynn, Jonathan Clough, and Talani Cooke, "Disrupting and Preventing Deepfake Abuse: Exploring Criminal Law Responses to AI-Facilitated Abuse," in The Palgrave Handbook of Gendered Violence and Technology, ed. Anastasia Powell, Asher Flynn, and Lisa Sugiura (Palgrave Macmillan, 2021), 583–603.
  11. "Nonconsensual Content Policy," Pornhub, September 2024; "New Decision Addresses Meta's Rules on Nonconsensual Deepfake Intimate Images," Oversight Board, July 25, 2024; "Never Post Intimate or Sexually Explicit Media of Someone Without Their Consent," Reddit, July 5, 2023; Noelle Martin, "Image-Based Sexual Abuse and Deepfakes: A Survivor Turned Activist's Perspective," in The Palgrave Handbook of Gendered Violence and Technology; Flynn et al., "Deepfakes and Digitally Altered Imagery Abuse," 1341–58.
  12. S. 146 – TAKE IT DOWN Act (2025); "CCRI Statement on the Passage of the TAKE IT DOWN Act (S. 146)," Cyber Civil Rights Initiative, April 28, 2025.
  13. Cassidy Gibson et al., "Analyzing the AI Nudification Application Ecosystem," arXiv.org, November 14, 2024.
  14. Santiago Lakatos, A Revealing Picture (Graphika, December 8, 2023).
  15. Gibson et al., "Analyzing the AI Nudification Application Ecosystem."
  16. Ashley Belanger, "Nudify App's Plan to Dominate Deepfake Porn Hinges on Reddit, 4chan, and Telegram, Docs Show," Ars Technica, July 1, 2025.
  17. Emmet Lyons and Leigh Kiniry, "Meta's Platforms Showed Hundreds of 'Nudify' Deepfake Ads, CBS News Investigation Finds," CBS News, June 6, 2025; Belanger, "Nudify App's Plan to Dominate Deepfake Porn."
  18. Gibson et al., "Analyzing the AI Nudification Application Ecosystem."
  19. Gibson et al., "Analyzing the AI Nudification Application Ecosystem"; Rachel Winter and Anastasia Salter, "DeepFakes: Uncovering Hardcore Open Source on GitHub," Porn Studies 7, no. 4 (2020): 382–97.
  20. Will Hawkins, Brent Mittelstadt, and Chris Russell, "Deepfakes on Demand: The Rise of Accessible Nonconsensual Deepfake Image Generators," FAccT '25: Proceedings of the 2025 ACM Conference on Fairness, Accountability, and Transparency (June 23, 2025).
  21. Hawkins, Mittelstadt, and Russell, "Deepfakes on Demand."
  22. Emanuel Maiberg, "'Configuration Issue' Allows Civitai Users to AI Generate Nonconsensual Porn Videos," 404 Media, May 20, 2025.
  23. "Re: GitHub Hosting Source Code for Sexually Exploitative Technology, Facilitating Image-Based Sexual Abuse (IBSA), Sexual Exploitation, and Promoting the Dangerous Use of Generative-AI," National Center on Sexual Exploitation, April 28, 2023.
  24. Mantzarlis and Lakatos, "AI Nudifiers Continue to Reach Millions and Make Millions."
  25. Maiberg, "'Configuration Issue' Allows Civitai Users to AI Generate Nonconsensual Porn Videos"; Hawkins, Mittelstadt, and Russell, "Deepfakes on Demand."
  26. Emanuel Maiberg, "Hugging Face Is Hosting 5,000 Nonconsensual AI Models of Real People," 404 Media, July 15, 2025.
  27. Maiberg, "'Configuration Issue' Allows Civitai Users to AI Generate Nonconsensual Porn Videos."
  28. Beatriz Kira, "When Non-Consensual Intimate Deepfakes Go Viral: The Insufficiency of the U.K. Online Safety Act," Computer Law & Security Review 54 (September 2024).
  29. Kat Tenbarge, "Nude Deepfakes Images of Taylor Swift Went Viral on X, Evading Moderation and Sparking Outrage," NBC News, January 25, 2024.
  30. Lakatos, A Revealing Picture.
  31. "Taking Action Against 'Nudify' Apps," Meta, June 12, 2025; Kolina Koltai and Melissa Zhu, "Meta's Suit Against Hong Kong Firm Was Just the Beginning–More Companies Linked to CrushAI 'Nudify' Apps," Bellingcat, June 18, 2025.
  32. Brian Timmerman et al., "Studying the Online Deepfake Community," Journal of Online Trust and Safety 2, no. 1 (2023).
  33. "Deepfake Abuse: Landscape Analysis (The Exponential Rise of Deepfake Abuse in 2023–2024)," #MyImageMyChoice, 2024.
  34. Karen Hao, "A Deepfake Bot Is Being Used to 'Undress' Underage Girls," MIT Technology Review, October 20, 2020.
  35. Sammi Carmela, "'Nudify' Deepfake Bots on Telegram Are up to 4 Million Monthly Users," Vice, October 16, 2024.
  36. James Vincent, "Deepfake Bots on Telegram Make the Work of Creating Fake Nudes Dangerously Easy," The Verge, October 20, 2020.
  37. Maiberg, "'Configuration Issue' Allows Civitai Users to AI Generate Nonconsensual Porn Videos."
  38. Cecilia D'Anastasio and Davey Alba, "Google and Microsoft Are Supercharging AI Deepfake Porn," Bloomberg News, August 24, 2023.
  39. "Deepfake Abuse: Landscape Analysis."
  40. Layla Ferris, "AI-Generated Porn Site Mr. Deepfakes Shuts Down After Service Provider Pulls Support," CBS News, May 5, 2025.
  41. Emily B. Laidlaw, "Mechanisms of Information Control: ISPs," in Regulating Speech in Cyberspace: Gatekeepers, Human Rights, and Corporate Responsibility (Cambridge University Press, 2015).
  42. Kolina Koltai, "AnyDream: Secretive AI Platform Broke Stripe Rules to Rake in Money from Nonconsensual Pornographic Deepfakes," Bellingcat, November 27, 2023; Kolina Koltai, "Behind a Secretive Global Network of Non-Consensual Deepfake Pornography," Bellingcat, February 23, 2024.
  43. Gibson et al., "Analyzing the AI Nudification Application Ecosystem."
  44. Maiberg, "'Configuration Issue' Allows Civitai Users to AI Generate Nonconsensual Porn Videos."