The Targeting of Female Public Officials

Internet-based platforms can allow female politicians to communicate directly with their constituents, overcoming the marginalization and bias they might face in traditional media outlets. Yet these same platforms expose them to alarming levels of sexism, harassment, and threats, which can have damaging effects on young women’s political ambitions.1

University of Virginia law professor Danielle Citron argues that women are often “canaries in the coal mine” for new digital technologies, offering early warning of how such tools will be misused.2 In 2007, online harassers targeted a software developer and prominent blogger with rape and death threats, doctored images, and doxxing, posting her Social Security number and home address online. Although her blog, “Creating Passionate Users,” was building her reputation in the tech community, she suspended it and canceled public appearances.3

Highly visible women, including media personas, human rights defenders, and public officials, often become early targets of technology-facilitated abuse, serving as harbingers of broader patterns to come. As one female parliamentarian observed about nonconsensual synthetic intimate imagery (NSII): “Whatever new things come up, it’s always used against the women first. They are the victim in every case. AI is not an exception in any way.”4 #MyImageMyChoice, a nonprofit that researches NSII prevalence, notes that one of the groups of women disproportionately targeted is those connected to politics.5

Women politicians report deep concern that pervasive gender-based abuse in the digital space poses a real barrier and a serious disincentive for young women considering a political career.6 In a recent global survey of 14,000 girls and young women, half of the respondents had experienced harassment for voicing political opinions. Of those who experienced harassment, 20 percent self-censored online as a result.7 One study found strong evidence that in Kenya and Colombia, online harassment and threats decreased politically active women’s willingness to voice political opinions online.8 More broadly, experts worry that NSII will carry a higher cost for female politicians, who already face more harassment and defamation than men in the field.9

Case Studies and Key Insights

The following analysis focuses specifically on the targeting of public officials with NSII, examining the patterns and tactics used in 100 incidents across 14 countries in which NSII was weaponized against public officials. To surface documented cases, this research used keyword searches for English-language reports of NSII targeting politicians or public officials, drawing on media reporting, artificial intelligence (AI) incident databases, academic papers, and think tank reports.10

The data collection took place between February and June 2025. The research surfaced reports of NSII targeting public officials dating from 2017 to 2025. The vast majority of examples (92 of 100) took place between 2022 and 2025, which may reflect the widespread availability of AI tools, increased media awareness of NSII, or other factors. This research had no specific geographic focus; it was intended to surface any English-language report of NSII targeting a public official.

There are several limitations to the methodology. As sourcing involved English-language AI incident databases and keyword searches, predominantly English-language sources surfaced.11 This likely leaves crucial data gaps from non-English language sources. In addition, the analysis relied on the reporting of third parties or self-reporting, which naturally yielded data gaps, in particular on which platforms NSII surfaced, how much user engagement it received, and how quickly platforms acted to limit its spread. Given the reliance on third parties, it was not possible to independently verify the accuracy of reports.

The observations presented here emerge from a limited set of case studies and constitute a qualitative analysis of available data. Given the limitations of this analysis, it is impossible to broadly generalize about the prevalence of NSII globally, the tactics employed, or the types of officials most frequently targeted. Another important caveat is that similar to other forms of image-based sexual abuse, NSII is likely a significantly underreported phenomenon. The limited number of documented cases should not be interpreted as evidence of low prevalence, but rather as a reflection of the challenges in documenting and reporting these attacks. When researchers do investigate, they find ample examples. In the United States, one report found tens of thousands of sexualized deepfakes depicting 26 members of Congress (25 women and one man).12

Demographics

NSII attacks disproportionately target female officials, with women comprising the vast majority of victims in documented cases. Out of 100 examples, only three involved male public officials: the governor of São Paulo, a former South Korean president, and one member of the U.S. Congress (see Figure 1). Female targets span the entire spectrum of public officials, from high-profile presidential candidates to local city commissioners, ambassadors, and civil servants.

Electoral Timing

In at least 42 of the documented cases, women were targeted during critical political moments, including during campaign periods and the weeks immediately prior to an election. The strategic timing of these attacks points toward NSII as a tool of political interference. By targeting women during high-visibility campaign periods and elections, perpetrators can maximize both the psychological impact on victims and the potential to influence voter perceptions at crucial moments. This timing pattern demonstrates that NSII can undermine women’s political representation precisely when their voices and leadership are most visible and consequential.

Case Study

Cara Hunter, a politician in Northern Ireland, was weeks away from the 2022 national legislative elections when her face was superimposed onto the body of another person in a pornographic deepfake video.13 According to the politician, the video was shared thousands of times across WhatsApp and social media.14 The timing created particularly difficult choices: She had to weigh whether to issue a public rebuttal immediately or wait until after voting concluded. As she reflected, “Years of building trust in my community and in one fell swoop, it meant [nothing].”15

Targeting Political Opposition

In five documented cases, perpetrators weaponized NSII against challengers to political incumbents and opposition figures across different regions. By targeting women who challenge existing power structures, these attacks serve to delegitimize political opposition and maintain the status quo. NSII targeting opposition figures aligns with broader trends showing that opposition politicians are targeted with abuse at higher rates than ruling party politicians.16

Case Study

In the Republic of Georgia, Salomé Zourabichvili, the country’s first elected female president, faced backlash after she pardoned the founder and former director of an opposition television channel. Following the pardon, manipulated images spread on Facebook falsely depicting her as a prostitute and demanding her removal from office.17 The president, though initially backed by the ruling party, lost its support, and the conflict escalated until the ruling party initiated impeachment proceedings against her three months after she issued the pardon.18

State-Backed Influence Operations

The integration of NSII into broader state-backed influence campaigns represents an escalation that combines traditional information operations with gendered attacks designed to exploit misogyny. It fits a documented pattern in which Russian influence actors disproportionately target female candidates with gender-specific disinformation.19

In one case, NSII targeting Ukrainian parliamentarian Svitlana Zalishchuk had suspected Russian origins: Tweets containing crudely manipulated nude images of her began circulating during the early months of Russia’s 2014 invasion of Crimea, immediately after she delivered a high-profile speech at the United Nations on how the conflict with Russia was affecting Ukrainian women.20 In another case, pro-Kremlin sources shared NSII content targeting the Georgian president.21 Though these cases could not be confirmed to originate from Russian state actors, five cases involved women leaders in countries that Russia had previously invaded, including the Republic of Georgia and Ukraine.22 Modern conflicts increasingly involve information campaigns, and sexual deepfakes can serve as powerful propaganda tools to undermine women leaders’ authority and distract from substantive political issues.

Case Study

A pornographic video falsely depicting U.S. presidential candidate Hillary Clinton in a sex act was traced to a Reddit account with suspected links to the Russian government-affiliated Internet Research Agency (IRA). The account, which Reddit later confirmed was the IRA’s most popular on the site, had accumulated almost 100,000 upvotes before it was removed. It posted the video on several platforms before the 2016 election, and the same video was subsequently posted to an American pornography website, where it was viewed 250,000 times.23

Cultural Sensitivities

NSII attacks can be particularly devastating for women politicians and public officials in conservative societies and traditional communities where cultural shame carries severe consequences.24 In four documented instances, perpetrators exploited local cultural sensitivities by portraying female public officials as violating moral codes. In some countries, women can be murdered for violating cultural expectations of modesty.25 NSII perpetrators exploit cultural taboos, transforming digital harassment into potentially physical danger.

Case Study

Azma Bukhari, the information minister of Pakistan’s most populous province, had her face digitally superimposed onto the sexualized body of an actor in a fabricated video. The minister, one of the few women leaders in the country, became aware of the video when it quickly spread on social media.26 She described being “shattered” by the deepfake and withdrew from public appearances for days after the video surfaced online. After initially hesitating, she ultimately brought her case to Lahore’s High Court.

Tactics

Face-swapping was the most frequently cited NSII creation method, with perpetrators superimposing women’s faces onto existing, often pornographic, content in the majority of documented cases. Technical sophistication ranged widely, from crude photo manipulation to advanced AI-generated deepfakes, potentially created through “nudify” applications. Traditional, pre-AI editing techniques such as Photoshop remain prevalent (in at least 21 cases) alongside newer AI tools (72 cases referenced “deepfakes” or AI creation tools), making it challenging to distinguish among fully AI-generated content, AI-assisted alterations, and content created using legacy manipulation methods. Though not included in this analysis, the research revealed several instances in which perpetrators misattributed real images of other women as portraying public officials. In one example, images of an American actress in a bikini went viral after being inaccurately described as depicting Croatia’s first post-independence female head of state.27

Several of the documented cases involved crudely manipulated images or videos that were easily identifiable as fabricated. The obvious artificiality of the content, however, did not diminish its capacity to harm. Rather than aiming for deceptive realism, perpetrators prioritized content that would delegitimize or demean female officials, a goal achieved regardless of believability.

The persistence of harm from clearly fake content demonstrates that the damage stems less from the potential deception of viewers than from the act of sexualization itself. When women officials are depicted in explicit scenarios, even obviously fabricated ones, the content reduces them from serious political actors to sexual objects in the public consciousness. The threat landscape also includes several sophisticated examples, such as seamlessly integrated deepfake videos of Italy’s prime minister that were uploaded to American adult websites, showing the technology’s evolution toward increasingly convincing forgeries.28

Perpetrators drew from diverse image sources, ranging from official government photographs to personal social media content. In one notable case, Sabrina Javellana, a local city commissioner in Florida, was targeted using photographs sourced from her personal Instagram account, posted three years before she assumed office, demonstrating how perpetrators mine targets’ digital histories for usable material.

In the limited number of cases with data on the specific accounts circulating NSII, perpetrators often distributed the images or videos through fake accounts designed to obscure the true source. These “sock puppet” accounts impersonated credible figures, such as journalists, to lend legitimacy to the malicious content. In most instances, platforms eventually removed the offending posts. But NSII rarely remained confined to a single platform: Media reporting suggests the content circulated across multiple mainstream social media sites, maximizing its potential reach and, due to platform fragmentation, making removal more difficult for victims.

Impact Analysis

NSII inflicts both individual and systemic harms that extend far beyond personal violations. Individual public officials targeted by sexualized deepfakes experience direct psychological trauma, reputational damage, and the added burden of mounting legal and administrative responses to these attacks. The broader implications, however, pose an even greater threat to democratic participation. NSII functions as a weapon designed to undermine women’s political engagement at every level. It weakens the standing of current female candidates and officials, discourages women from entering politics, and pressures those already in office to withdraw from public service entirely. By weaponizing sexuality against women in politics, NSII threatens to reshape political participation.

Individual-Level Impacts

NSII inflicts psychological and professional damage on targeted women, undermining both their mental health and their credibility as public figures. The psychological toll can be significant. Azma Bukhari noted being “depressed” after being targeted, while Cara Hunter called the experience “the most horrific and stressful time of my entire life,” trauma that still impacted her life three years after the initial abuse.29

Beyond personal suffering, NSII undermines women’s policy positions and expertise. As Hunter explained, “It was a campaign to undermine me politically. It has left a tarnished perception of me that I can’t control. I’ll have to pay for the repercussions of this for the rest of my life.”30 Svitlana Zalishchuk noted, “It was all intended to discredit me as a personality, to devalue me and what I’m saying.” Months after the social media campaign, a journalist asked the Ukrainian politician if it was true that she ran naked through the streets of Kyiv.31

Being targeted by NSII forced female public officials to shoulder legal and administrative burdens, often while juggling campaign responsibilities or official duties. Several filed formal complaints with police or electoral courts, while others felt compelled to publicly debunk the content on social media platforms. Two Brazilian candidates, Letícia Arsenio and Loreny Caetano, posted rebuttal videos to their social media pages to debunk and condemn the content.32

The legal responses varied by jurisdiction but consistently demanded time and resources from victims. In Brazil, multiple candidates filed police reports during election season, including a city council candidate and two mayoral candidates, while another mayoral candidate pursued a criminal complaint through the electoral court.33 Italy’s prime minister, Giorgia Meloni, filed a civil defamation suit against two men who circulated explicit deepfake videos of her and ultimately had to take time away from her official responsibilities to testify during the trial.34

Collective Harms

NSII targeting of female public officials can threaten fundamental democratic principles by deliberately attempting to influence voting behavior and undermine support for women candidates. This manipulation of electoral processes through sexualized attacks represents a direct assault on democratic participation and fair representation. In addition, the knowledge that one’s likeness could be artificially injected into pornographic content may discourage women from engaging with online platforms or pursuing public roles altogether.

Perhaps most concerning is the resulting behavioral change among women in politics. Some may be deterred from seeking office in the first place, while others already in positions of power may choose to self-censor their views or limit their public visibility to avoid becoming targets. Evidence from individuals targeted by NSII reveals both behavioral changes and devastating psychological consequences, including depression and post-traumatic stress disorder.35 These effects can be extensive: While in office as one of the youngest elected officials in Florida’s history, Sabrina Javellana made her social media accounts private, changed how she dressed for public appearances, and stopped walking alone at night.36

The psychological, legal, and administrative burdens can prove so overwhelming that some women abandon their political careers entirely. Javellana left her career as a politician after being targeted by NSII, overwhelmed by the volume of material created about her and the prohibitive cost of hiring a lawyer to pursue each perpetrator. She decided not to seek reelection, despite being confident she could win, feeling safer outside the public eye.37

The Florida case highlights critical vulnerabilities for local and younger politicians who lack the resources available to high-profile national figures when combating NSII. With limited budgets and staff, they may simply decide that the personal and financial costs of public service have become too high. The result is a chilling effect that could drive women away from local politics, where they may begin their political careers and where women are still underrepresented.38

This systematic intimidation has the potential to reduce women’s full participation in democratic governance and limit the diversity of voices in political discourse. This is not merely a hypothetical concern: Several female British lawmakers quit ahead of the United Kingdom’s 2019 general election, with some explicitly citing vicious abuse and intimidation as reasons for stepping down.39

Citations
  1. Lucina Di Meco, “Online Threats to Women’s Political Participation and the Need for a Multi-Stakeholder, Cohesive Approach to Address Them,” paper presented at the 65th session of the Commission on the Status of Women (CSW 65), UN Women Expert Group Meeting, New York, October 5–8, 2020, source; Maarja Lühiste et al., “When Does Fame Not Matter? Examining Gender Differences in Politicians’ Social Media Experiences,” Politics & Gender (July 30, 2025), source.
  2. Robert Chesney, Danielle Citron, and Hany Farid, “All’s Clear for Deepfakes: Think Again,” Lawfare, May 11, 2020, source; Danielle Citron, “Cyber Civil Rights,” Boston University Law Review 89, no. 1 (2009): 61, source.
  3. Citron, “Cyber Civil Rights,” 61, 64–65, source.
  4. Pranshu Verma and Cat Zakrzewski, “AI Deepfakes Threaten to Upend Global Elections. No One Can Stop Them,” Washington Post, April 23, 2024, source.
  5. #MyImageMyChoice, “Deepfake Abuse: Landscape Analysis,” source.
  6. Di Meco, “Online Threats to Women’s Political Participation,” source.
  7. Plan International, State of the World’s Girls 2020: Free to Be Online? (Plan International, 2020), source.
  8. National Democratic Institute (NDI), Tweets That Chill: Analyzing Online Violence Against Women in Politics (NDI, 2019), source.
  9. Vandinika Shukla, “Deepfakes and Elections: The Risk to Women’s Political Participation,” Tech Policy Press, February 29, 2024, source.
  10. The author used the Partnership on AI’s AI Incident Database, the OECD AI Incident Database, the Rest of World 2024 AI Elections Tracker, the Resemble.AI Deepfake Incident Database, and the Political Deepfakes Incidents Database.
  11. The author used the following Boolean operation: (“artificial intelligence-generated image” OR “artificial intelligence-generated video” OR “deepfake” OR “deepfake porn” OR “nonconsensual intimate image” OR “synthetic media”) AND (“public official” OR “politician” OR “candidate”).
  12. American Sunlight Project (ASP), Deepfake Pornography Goes to Washington: Measuring the Prevalence of AI-Generated Non-Consensual Intimate Imagery Targeting Congress (ASP, December 11, 2024), source.
  13. “Creator of Deepfake Images of MLA Yet to Be Found,” BBC News, January 23, 2025, source.
  14. Rebecca Black, “Stormont MLA Targeted by Deepfake Video Urges Legal Clampdown,” The Standard, January 14, 2025, source.
  15. Mark Scott, “Deepfake Porn Is Political Violence,” Politico, February 8, 2024, source.
  16. Amnesty International India, Troll Patrol India: Exposing Online Abuse Faced by Women Politicians in India (Amnesty International India, 2020), source.
  17. “Manipulated Photos Depicting Salome Zourabichvili Have Been Circulating On Facebook,” Myth Detectors, June 27, 2023, source.
  18. “Georgian Dream Launches Impeachment Proceedings Against President,” Civil Georgia, September 1, 2023, source.
  19. Julia Smirnova et al., Digitale Gewalt und Desinformation gegen Spitzenkandidat:innen vor der Bundestagswahl 2021 [Digital Violence and Disinformation Against Leading Candidates Before the 2021 Bundestag Election] (Institute for Strategic Dialogue, September 2021), source; Samantha Bradshaw and Amélie Henle, “The Gender Dimensions of Foreign Influence Operations,” International Journal of Communication 15 (2021), source.
  20. Nina Jankowicz, “How Disinformation Became a New Threat to Women,” Coda, December 11, 2017, source.
  21. Natia Kekenadze, Tina Gogoladze, and Salome Giunashvili, Sexist Language and Gendered Disinformation 2023 (Media Development Foundation, 2023), source.
  22. Jankowicz, “How Disinformation Became a New Threat to Women,” source; “Manipulated Photos Depicting Salome Zourabichvili Have Been Circulating On Facebook,” Myth Detectors, June 27, 2023, source; “A Photo of Vera Kobalia Is Being Circulated on Social Media,” Myth Detectors, January 11, 2023, source; Kekenadze, Gogoladze, and Giunashvili, Sexist Language and Gendered Disinformation 2023, source.
  23. Ben Collins, “Russia-Linked Account Pushed Fake Hillary Clinton Sex Video,” NBC News, April 10, 2018, source.
  24. Inter-Parliamentary Union (IPU), Sexism, Harassment, and Violence Against Women in Parliaments in the Asia-Pacific Region (IPU, March 2025), source.
  25. Kelly Ng, “Pakistan: U.S. Teen Shot Dead by Father over TikTok Videos,” BBC News, January 30, 2025, source.
  26. AFP, “Deepfakes Weaponised to Target Pakistan’s Women Leaders,” France 24, December 3, 2024, source.
  27. Louis Baudoin-Laarman, Jean-Gabriel Fernandez, and Pedro Noel, “Images of the Croatian President in a Bikini: Beware the Fakes,” AFP, July 14, 2018, source.
  28. Seb Starcevic, “Italy’s Giorgia Meloni Called to Testify in Deepfake Porn Case,” Politico, March 21, 2024, source.
  29. AFP, “Deepfakes Weaponised to Target Pakistan’s Women Leaders,” source; “Creator of Deepfake Images of MLA Yet to Be Found,” source.
  30. Scott, “Deepfake Porn Is Political Violence,” source.
  31. Jankowicz, “How Disinformation Became a New Threat to Women,” source.
  32. Marcelo Macedo Soares, “Candidata Aciona PF Após ter Fotos e Video Manipulados por Inteligência Artificial,” Agenda do Poder, September 7, 2024, source; Juliana Causin, “‘Nudes’ Falsos, Deepfake e Jingles Sintéticos Marcam uso da IA no Primeiro Turno e Apontam Desafios para 2026,” O Globo, October 15, 2024, source.
  33. Soares, “Candidata Aciona PF Após ter Fotos e Video Manipulados por Inteligência Artificial,” source; Beatriz Farrugia, “Brazil’s Electoral Deepfake Law Tested as AI-Generated Content Targeted Local Elections,” DFRLab, November 26, 2024, source; Dauer, “É #FAKE Foto de Tabata Amaral em Pose Sensual; Trata-se de Deepfake,” g1, September 15, 2024, source; Causin, “‘Nudes’ Falsos, Deepfake e Jingles Sintéticos Marcam uso da IA no Primeiro Turno e Apontam Desafios para 2026,” source; Paulo Piassi, “Prefeita de Bauru Registra Boletim de Ocorrência Contra Veiculação de Deepfake com seu Rosto Sobre Corpo Nu,” g1, September 19, 2024, source.
  34. Starcevic, “Italy’s Giorgia Meloni Called to Testify in Deepfake Porn Case,” source.
  35. Nicole Krättli, “Fake Porn—Real Victims,” video, Neue Zürcher Zeitung, September 1, 2023, source; Danielle Keats Citron, The Fight for Privacy: Protecting Dignity, Identity, and Love in the Digital Age (W. W. Norton & Company, 2022); Asia A. Eaton and Clare McGlynn, “The Psychology of Nonconsensual Porn: Understanding and Addressing a Growing Form of Sexual Violence,” Policy Insights from the Behavioral and Brain Sciences 7, no. 2 (2020), source.
  36. Coralie Kraft, “Trolls Used Her Face to Make Fake Porn. There Was Nothing She Could Do,” New York Times Magazine, July 31, 2024, source.
  37. Kraft, “Trolls Used Her Face to Make Fake Porn,” source.
  38. Justin de Benedictis-Kessner, “Women Are Still Underrepresented in Local Government, Despite a Woman Running for President,” Ash Center for Democratic Governance and Innovation, September 20, 2024, source.
  39. Bianca Britton, “There Were Never More Women in U.K. Parliament. Now There’s an Exodus,” CNN, October 31, 2019, source.