Chapter III: Trends in Modern War Driving Adaptation in Military Deception

War is primordial, and its nature remains the same. Yet because of changes in the strategic environment and the impact of new technologies, the character of war—how it manifests in the real world—is evolving. This report proposes that six significant trends in contemporary military affairs are not just driving changes in the character of war but will specifically shape an evolved approach to military deception.

Trend 1: Democratized Battlespace Awareness and the Signature Battle

This trend sits at the confluence of three technological influences explored earlier: artificial intelligence (AI), commercial sensing, and uncrewed systems. The significant enhancements in battlespace awareness provided to military commanders by this convergence are perhaps the most important influence on the conduct of military deception.

Every item of military hardware possesses multiple signatures. These might be visual, aural, or in the electromagnetic spectrum. Military units, at different levels, also possess signatures. This can include patterns of operations and exercise schedules as well as indicators for impending military activity. For example, large formations of ground vehicles generate dust, noise, heat, and exhaust signatures that can be detected by a range of different sensors. Individual ships, and naval task forces, generate visual, electromagnetic, sound, and wake signatures. Aircraft and missiles—even those with stealth properties—also have unique signatures that can be detected. Future military organizations must be able to minimize their tactical and strategic signatures, use recorded signatures to deceive, and be able to detect and exploit adversary signatures—across all the domains in which humans compete and fight.

Finally, while each individual person can be seen, heard, smelled, etc., some individuals, such as senior commanders, have more prominent signatures. This might include concentrated communications networks around them or even signature styles of tactical decision-making, as well as individual and organizational patterns that might be identified in operational or strategic decision-making.

The aggregation of all of these kinds of signatures is an ongoing and adaptive signature battle. The technological sophistication of potential adversaries, their mass, and their presence in every domain of war mean that the battle of military signatures will be one of the defining aspects of warfare in the twenty-first century. Those institutions that can collect information on the various signatures of military organizations and turn this information into timely, actionable products will possess a decisive advantage.1

A major shift arising from these trends, examined in the previous chapter, is the growing importance of signature detection using large numbers of uncrewed systems and commercial sensors. For example, many commentators and analysts now use the U.S. National Aeronautics and Space Administration (NASA) Fire Information and Resource Management System (FIRMS), freely available online, to assess the level of artillery fire missions in Ukraine.2 And because Russians have often resorted to insecure communications during their invasion, civilian operators as well as military intelligence agencies have been able to intercept, analyze, and share sensitive military discussions.3
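To make concrete how this kind of open-source analysis works, the sketch below counts FIRMS-style thermal detections falling inside a geographic bounding box. The column names mirror the public FIRMS CSV products, but the exact schema, the sample rows, and the bounding box used here are illustrative assumptions rather than real data; anyone replicating this approach should consult the current FIRMS documentation for the live API and attribute definitions.

```python
import csv
import io

# Illustrative FIRMS-style CSV (real FIRMS products use similar columns:
# latitude, longitude, acq_date, confidence). These rows are made up.
SAMPLE_FIRMS_CSV = """latitude,longitude,acq_date,confidence
48.58,38.00,2024-05-01,nominal
48.60,38.02,2024-05-01,high
50.45,30.52,2024-05-01,low
48.55,37.95,2024-05-02,nominal
"""

def count_detections(csv_text, lat_min, lat_max, lon_min, lon_max):
    """Count thermal detections per acquisition date inside a bounding box."""
    counts = {}
    for row in csv.DictReader(io.StringIO(csv_text)):
        lat, lon = float(row["latitude"]), float(row["longitude"])
        if lat_min <= lat <= lat_max and lon_min <= lon <= lon_max:
            counts[row["acq_date"]] = counts.get(row["acq_date"], 0) + 1
    return counts

# Hypothetical bounding box loosely covering part of eastern Ukraine.
daily = count_detections(SAMPLE_FIRMS_CSV, 47.0, 50.0, 36.0, 40.0)
print(daily)  # {'2024-05-01': 2, '2024-05-02': 1}
```

Analysts layer trend lines like these over other reporting; a spike in detections along a front-line sector is only suggestive of artillery activity until corroborated by additional sources.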

Data from civilian technologies can provide a signature that might be exploited by unfriendly actors. The Strava information leak, in which fitness data led to the mapping of several U.S. facilities overseas, is just one example of consumer device signatures having an impact on military activities—and the challenge for military forces to minimize their signatures.

These open-source sensors often use older versions of contemporary military technology. And as NASA FIRMS demonstrates, they provide only rough approximations of events that might be related to military activity. But intelligence activities—and penetrating deception measures—are about assembling multiple layers of information to build a complete picture or hypothesis of what those layers mean. The ability of open-source sensors, often used by civilian organizations such as Bellingcat, to detect military activity means that military institutions must be even more careful and clever with their signatures in the twenty-first century if they hope to achieve any deception of their foes.

Open-source analytical capacity provides an important way to see more of an enemy’s operating system and break through its deception measures. This can take the form of commercial AI and data services assisting classified analysis, or the blending of commercial analysis into intelligence products (such as the daily reports on Ukraine, the Middle East, and Taiwan produced by the Institute for the Study of War). Perhaps the most promising capacity in the ecosystem of capabilities that make up democratized intelligence is artificial intelligence.

The war in Ukraine has become the first international conflict in which the opposing sides have actively developed and used AI for military purposes. During the war, Ukraine has benefited from allies and partners providing AI technologies. A key element of the war has been the massive amounts of data generated by a plethora of sources. The huge and growing volume of data is larger than humans can analyze quickly and accurately. AI has therefore become an increasingly useful capability for data analysis to aid Ukrainian and Russian decision-making in this war.

Between 2014 and 2022, Ukraine’s tech-savvy workforce developed and introduced multiple new situational awareness and battlefield management systems to the Ukrainian military. Many were unofficial and unsanctioned, but the volunteer groups had direct communication with front-line operational forces, allowing them to focus their development efforts on high-priority military needs.

One of the initiatives, the situational awareness system Delta, was eventually adopted and formally integrated into the Ukrainian military and also achieved NATO certification. Delta has recently been enhanced with AI/machine learning-enabled capabilities.4 The use of Delta, and the tactical equivalent called Kropyva, has now been normalized at every level of military activities in the Ukrainian Armed Forces.

The support for AI development in Ukraine has transformed since 2022. Many government agencies and institutions have shifted from initially neglecting AI to actively creating specialized departments and units dedicated to developing AI capabilities. This transformation has been driven largely by the pressing demands of the ongoing war against Russia, in which AI technologies have repeatedly demonstrated the potential to provide an advantage on the battlefield.

In a recent report from the Center for Strategic and International Studies, Kateryna Bondar explores the expansion of military AI into six major applications:

  1. Autonomy. The most significant advancements have been in autonomous systems, where Ukraine is making strides in areas such as GPS-denied navigation and swarm operations.
  2. Open-source intelligence and fighting disinformation. AI helps to analyze large volumes of digital content from media and social networks and to identify Russian narratives, propaganda, and information campaigns spreading disinformation.
  3. Situational awareness and command and control. AI enhances situational awareness with numerous software platforms used by the military to analyze battlefield and intelligence data and to facilitate real-time, efficient decision-making.
  4. Demining. AI-powered analytic software and AI-enabled unmanned ground vehicles improve the efficiency and safety of mine clearance.
  5. Training and simulation. AI-driven training simulations help soldiers adapt to complex battlefield conditions by playing close-to-real combat scenarios with AI adjustments to address warfighters’ skill gaps.
  6. Damage assessment. AI is crucial in damage assessment, utilizing satellite data and drone imagery to analyze damage, losses, and devastation, and to estimate future recovery efforts.5

The Russians are equally looking to exploit AI for military benefit. The Russian Ministry of Defense seeks to employ AI to provide data analysis and decision-making capacity to military forces in a “human in the loop” approach to improve the effectiveness of military operations. As Sam Bendett notes in a May 2024 report on Russian AI: “Russian military discourse emphasizes that in the long term, there will be an eventual point where technologies subsume and then replace human involvement in military operations—yet in the near term, Russian military thinking affirms that humans must remain firmly in the loop.”6

There are five principal implications of this new era’s democratization of battlespace awareness and the accompanying signature battle for military deception operations.

First, adversaries may be able to harness advanced sensor networks and open-source information to subvert deception operations much more rapidly. Both Ukraine and Russia have used military and commercial sensor information throughout the war, providing a real-time demonstration of this capability to an array of state and non-state actors globally. It is therefore likely to become harder to generate surprise against enemy forces. Military organizations will need to improve their understanding of the wide range of open-source sensors and information available, and develop methods to ingest this material and blend it with military intelligence, both to improve their ability to detect threats and to degrade enemy capacity to detect friendly forces.

Second, there has been increasing use of the term “battlespace transparency” in discourse about the war in Ukraine and future war, among lay commentators as well as in professional military forums.7 While the convergence of military and commercial sensing and analysis has significantly enhanced tactical visibility of the battlespace, “battlespace transparency” is probably an overstatement of what is taking place, and sensible skepticism should be applied. Despite the increasingly ubiquitous and pervasive nature of sensors—military and civilian—that allow many military signatures to be detected in near real time, there is unlikely to ever be an “unblinking eye” on all human activities and intentions. As Ukraine has demonstrated, if the battlefield were truly transparent, surprise would by definition be impossible. That has not been the case in the past three years.

Third, deception operations are clearly possible despite the enhanced (but not transparent) visibility of the battlespace and the availability of rapid, massed precision-attack capabilities. New types of adaptive camouflage, vehicle shaping, cyber warfare, and interference will need to be developed to counter the new and integrated sensor technologies. At the same time, for effective military deception operations in the technical realm, space-based systems will need to be spoofed, jammed, or otherwise made unable to undertake the sensing and communications that can compromise deception and the generation of surprise (and force preservation) in military operations.

Fourth, such measures can enhance the practice of signature management. Insurgents and terrorist groups in the past two decades have especially learned to mask their various signatures to generate a level of stealth in how they operated, where and when they attacked, as well as their sources of support.8

Signatures must not only be detected but also measured and recorded. Recorded signatures can help AI systems detect similar adversary signatures, or be projected by friendly forces to confuse enemy sensors and analytical efforts. At the same time, signature management does not only support tactical deception. It also requires attention at the strategic level: at home bases, military headquarters, and other critical infrastructure.

Using signatures that cannot be hidden to deceive can also be effective. In the 1973 Yom Kippur War, the Egyptian military successfully convinced the Israelis that its preparations for war were merely large-scale routine exercises. The Egyptians created an alternative truth to deceive the Israelis and retain the element of surprise. In 2024, Ukraine did the same by deploying a large number of troops to Sumy Oblast, ostensibly to defend against a Russian incursion there. In reality, this was the assault force allocated to strike into Kursk. Accordingly, in a battle of signatures—and of algorithmic decision support—friendly systems must be trained to see through deception measures produced by well-planned feints like these.

Fifth, despite the importance of technology, good training and leadership development processes are even more crucial. The key to military success is generating more uncertainty in the minds of enemy commanders and staff than exists in our own. Humans, supported by improved staff processes and better AI decision support, are a non-discretionary element of this. Training commanders to employ and adapt new masking techniques that minimize their probability of detection will be increasingly important in pre-war and wartime activities. As discussed earlier in this report, deception planning in the People’s Liberation Army (PLA) is a primary responsibility of commanders, not staff. Consideration of how this might be implemented in U.S. and NATO military institutions is required.

Trend 2: New Era Mobilization and Mass

New approaches to mass collection and analysis of personal information, misinformation and disinformation, manufacturing using 3D printing, and the ubiquity of massed, uncrewed systems across the land, sea, air, and space domains are resulting in a new era of mass warfare. This involves the concurrent use of large-scale conventional forces and massed autonomous systems, and the wide-scale use of influence operations, including sophisticated algorithms.9

The rise of unmanned systems has been chronicled elsewhere and is covered in more detail in Chapter II. While debate continues over whether robotic systems will replace or complement humans on the battlefield, military forces are racing to enhance the mass of combat power they can generate through the use of autonomous systems and clever algorithms. In the past three years, this has seen both Ukraine and Russia significantly expand the scope of precision attack across the battlespace and increase its pace.

Future military organizations will need to generate forces with an optimal balance of expensive platforms and cheaper, smaller autonomous systems that will be quicker, more adaptable to different missions, and more widely available. This balanced force—which generates mass through crewed systems, autonomous capabilities, and influence activities—must be employed using new twenty-first-century warfighting concepts and strategies by people whose training and education feature the integrated application of human and machine capabilities.

Mass influence operations are an important element of the new operational environment. Algorithms, machine learning, and massive datasets can, and do, assist military and government organizations to undertake wide-scale—yet precisely targeted—influence operations in a way that was impossible even a decade ago. This form of mass will continue to develop as institutions learn the lessons of Chinese coercive activities in the Indo-Pacific and the activities of the belligerents in the Russo–Ukraine War, as well as the application of global information operations by terrorist organizations such as Hamas.10

As the war in Ukraine demonstrates, the convergence of this trend with the enhanced visibility of new-era meshed civil–military sensor networks means that massing forces can also lead to disaster. This is not a new trend, however. Since the lethality of large military forces began to significantly improve in the wake of the first industrial revolution, military doctrine has emphasized dispersion (and deception) as a response to the more lethal battlespace environment.

The harsh reality for military commanders is that they now face a twenty-first-century massing-versus-dispersion predicament. The new meshed civil–military sensor frameworks have created an environment in which nearly all signatures of military equipment, personnel, and collective forces can be detected more accurately and more rapidly. Linked to the outcomes of the precision revolution of the past three decades, this compresses the detection-to-destruction gap (or kill chain) in military operations to very short periods of time.

The consequence is that massing military forces for ground combat operations, large-scale aerial attacks, or naval operations has become tactically and even strategically higher-risk than in previous eras. Even if newly developed hard and soft kill measures can better protect forces when they mass for decisive events, they are almost assured of detection, which makes achieving surprise very difficult.

This lesson is driven home by the failed Ukrainian counteroffensive of 2023. It was clear to the Russians, well before H-Hour, where the Ukrainian main effort would fall. As one report notes:

“At the strategic level, leaks of top-secret information from Ukraine’s international partners gave the Russians a precise picture of the Ukrainian assault force’s structure, anticipated capabilities, limitations, and options for axes of advance. Furthermore, the public messaging from the Ukrainian government, and public discourse from partners, gave Russia a clear understanding of the timing of a likely assault and informed AFRF [Armed Forces of the Russian Federation] planning. Russian penetration of Ukrainian communications systems enabled capture of a range of materials. The result was that when the offensive started, Ukrainian efforts to compartmentalise planning often left friendly forces with less understanding of the wider plan than Russian commanders. The lesson is clear: Future operations must be accompanied by appropriate deception and more effective operational security.”11

Consequently, the Russians were able to ensure the optimum deployment of their ground forces and fires to blunt the initial Ukrainian attacks while permitting their forces greater freedom of action elsewhere. This signature battle is a critical aspect of the operational problem. In essence, modern military forces must be equally capable of operating in dispersed and massed forms, while minimizing detection when they do mass in a way that improves their chances of achieving surprise and landing a decisive blow against an adversary.

This has multiple implications for the conduct of military deception operations:

First, the mass use of drones and other sensors, as seen throughout the conflict in Ukraine, has saturated the battlespace. Concentrations of military forces are therefore difficult to hide, as are their support systems, such as logistics, transport networks, and communications networks. One Russian response to this environment is the use of smaller tactical teams conducting infiltration tactics at boundaries between Ukrainian units and during their rotations. It is not foolproof, but it is successful enough given the current disparities in manpower.12

Second, technologies such as additive manufacturing (at home and in the field) may permit the rapid construction of dummies and decoys to saturate enemy sensors. This might include cheap, plentiful emitters that replicate friendly signatures and saturate the enemy’s capacity to undertake electronic warfare activities. While such technology has not been widely used in Ukraine, it offers one potential pathway for deployed deception operations in areas without ready access to local industry to construct decoys.

Third, technologies to spoof enemy sensors and databases about friendly locations and intentions are crucial. This includes the mass production of noise to overwhelm enemy counter-deception measures, akin to the use of chaff to confuse air defense systems in the Second World War, the Gulf Wars, and Ukraine. It will also probably demand the wide-scale deployment of decoy drones to overwhelm enemy sensor systems during strike operations. Both Ukraine and Russia have employed this technique to improve the chances of penetrating sophisticated air defense systems.13 To complement these decoys, the mass application of cyber capabilities and AI might be used to detect and interfere with sensors and the networks that transmit data about friendly-force locations.

Fourth, the ability to disperse, concentrate, and re-disperse is critical to deception. Not only does this confuse an adversary about the direction from which a physical main effort might emanate, but it also makes the identification and destruction of high-value military targets a more difficult (and time-consuming) undertaking. After the hard lessons of the introduction of the High Mobility Artillery Rocket System (HIMARS) in 2022 and the targeting of their Black Sea Fleet, the Russians have adapted over the past three years to implement more dispersed concepts of operations on the ground and at sea. The concept of distributed operations, which has been explored by several Western military institutions, is crucial to dispersion and is examined in the final chapter of this report.

Trend 3: Cheaper, More Precise Deep Strike

Long-range strike has been a key development for the Ukrainian Armed Forces, as well as military and non-state actors in the Middle East, over the past several years. Beginning the war with almost no capacity to hit Russian strategic targets, the Ukrainians have demonstrated an evolved approach to long-range strike that embraces a high-end/low-end mix of weapons and combines foreign and indigenous solutions. This strike capability has been constructed from a combination of ground-based rocket launchers, armed drones, cruise missiles, and uncrewed maritime strike vessels. This long-range strike complex is not just a military capability—it is a political necessity.

The development of a lower-cost strategic strike complex in Ukraine and elsewhere has been underpinned by the technological influences of uncrewed weapons’ availability and cost, as well as meshed military and civilian sensor networks.14 A good example of a new long-range strike capability is the Ukrainian 14th Unmanned Aerial Vehicle (UAV) Regiment, which is part of the Unmanned Systems Force. It conducts strikes, employing integrated planning from a range of military and national intelligence organizations, at ranges out to 2000 kilometers. Potential adversaries will have increasing access to a wider range of long-range systems over the coming decade.

This lower bar for strategic strike capacity is forcing a new conversation among the most senior military and political leaders about the appropriate balance between long-range strike and close combat capabilities. Many nations have begun to increase their investment in longer-range strike systems. But while long-range strike capabilities are important components of military arsenals, they are not a silver bullet.

The planning, conduct, assessment, and adaptation of long-range strike across domains must be carefully balanced with investment in close combat and other capabilities. There are several reasons for this. First, a balanced force compels adversaries to make difficult choices about which military capabilities to develop and deploy, generating uncertainty. Balancing the deep and close fights also provides redundancy in conventional deterrence. An enemy might penetrate a long-range strike complex but would still have to close with and fight friendly forces in combat. Not every nation wants to do this. And as Israel discovered on October 7, 2023, placing too much emphasis on remote, long-range recon-strike complexes and insufficient emphasis on close combat capacity can lead to devastating failure.

The wide availability of long-range strike capabilities among state and non-state actors has the following implications for military deception:

First, Ukraine has reinforced that deception is a necessary part of strike planning and execution. Route planning for weapons, as well as for preparatory intelligence collection, must deceive the enemy about likely targets, which will then influence their deployment of anti-drone and missile sensors and attack systems. Decoys during the execution of strike operations are critical, and this has been a feature of nearly all Ukrainian and Russian strike operations in the past couple of years.

Second, deception is key to protecting strike infrastructure. While the obvious challenge is to ensure an enemy cannot strike and destroy storage depots for munitions and delivery platforms, key planning staff locations must also be protected and potentially included in deception plans. This might involve operations to degrade the capacity of enemy sensors to collect imagery and other information for target mensuration. The Russians, for example, commonly use smoke and mist generators to degrade satellite collection of imagery over critical infrastructure in Russia. They, and the Ukrainians, also frequently move air defense units to improve their survivability and complicate strike planning for their adversaries.15

Finally, delivery of new weapon systems during wartime should be accompanied by preconceived deception and operational security plans so that the adversary cannot prepare counters before the systems are used. This was the case for the arrival of the Army Tactical Missile System (ATACMS) in Ukraine in 2023, which enabled several important strikes on Russian force concentrations that were unaware they were at risk of such strikes.16

Trend 4: Strategic Influence and Cognitive Dominance Activities

A Ukrainian soldier waves the Ukraine flag during Russia’s invasion attempt in 2022.
Alexey Furman via Getty Images

War has always involved a complex balance of physical, intellectual, and moral forces.17 Influencing the thinking and actions of enemy commanders and political leaders is as old as war itself. Winning in war, whether ancient or modern, means winning both the fighting and the story of the war. Influence activities undertaken before and during war include disinformation and—importantly for our purposes here—military deception.

Disruptive twenty-first-century technologies have not only enhanced the lethality of military forces at greater distance, but they also now provide the technological means to target and influence various populations (enemy and friendly) in a way that has not been possible before. The ability of modern states, corporations, and non-state actors to devise and test strategic messages—targeting different groups with different narratives—through the internet and social media, adjust those messages, and access hundreds of millions of users almost instantly is unprecedented. Leaders, their advisors, military service members, and civilian populations can all be targeted in ways that are both precisely directed and at scale.

Social media has revolutionized global communication, social interaction, marketing, and professional discourse. It has demonstrated a historically unprecedented capacity to penetrate societies and influence human perceptions, particularly when compared to other means of communication.

Social media also differs from other forms of media for two principal reasons. First, social media is viral; users are both targets and potential fellow combatants who can be enlisted to share content across their own social networks. As the 2018 book LikeWar proposes, “You are now what you share.”18

Second, social media users with smartphones are media consumers who are highly mobile and increasingly omnipresent in most settings.19 As of January 2025, just over 63 percent of people worldwide are classified as social media users, and social networks are either the most or second-most visited types of sites for every age group between 16 and 64.20 Although there are limitations placed on content in many countries—China has a state-of-the-art censorship regime—even authoritarian nations like Russia and China have their own versions of social media, which are often used as tools of influence by the state and corporations. These regimes have become experts in leveraging social media to spread disinformation and undertake external influence campaigns.21

As the war in Ukraine has demonstrated, the “influence playing field” is not entirely dominated by authoritarian regimes. In the lead-up to the invasion of Ukraine on February 24, 2022, U.S. intelligence agencies were able to use sensitive sources and reporting to not just discover but also attempt to preempt Russian operations.22 These releases of information discredited Russian narratives about the war, crowded the information space to degrade the impact of Russian influence campaigns, and directly assisted the Ukrainian development of military strategy to defend their nation.23 Once the war began, President Volodymyr Zelenskyy masterfully leveraged social media to ensure his nation received military, economic, intelligence, political, and humanitarian aid from the West.

Influence activities undertaken by the Russians, as well as other state and non-state actors, have also had a significant impact on the war. Social media, with its attendant influencers and podcasters, shaped the congressional debate over American support for the war in 2024. In September 2024, U.S. prosecutors also indicted two Russian state media employees for covertly funding American podcasters as part of an effort to influence American opinion on the war and voting in the 2024 presidential election.24

The effect is also shaping the reporting and perception of war. The rise of citizen social media war commentators began more than a decade ago, as noted in a 2013 paper titled “The New War Correspondents.”25 But this approach has exploded during the Russo–Ukraine War. Hundreds, if not thousands, of online commentators—some well-credentialed and experienced, some not experienced at all—have gained a high level of influence among populations in Russia, Ukraine, and many Western nations. Analysts have developed sophisticated mapping products to track progress in the war and have shared countless images and videos, all of which exert some level of influence on those who view them.

This role has traditionally been reserved for a small number of specialist journalists, otherwise known as war correspondents. The older generation of war correspondents has adapted quickly, combining traditional reporting with the exploitation (and verification) of social media geolocation and other data to report on the war.26 But given the demands of rapid reporting, over-reliance on social media information (which is more difficult to verify and very open to lies and deception) is a risk for policymakers and senior military leaders. As one report on the use of social media in Ukraine notes, “We’re still working out just what ‘truth is the first casualty’ means in the social media age. It’s no longer the case that the truth just isn’t out there, or that it’s only available pre-glossed with propaganda. It’s that there’s too much information.”27

The influence of generative AI is also reaching the war in Ukraine and the conduct of deception operations. Deepfake technologies can be employed to generate fake news and fake videos of credible spokespersons, with the aim of degrading trust in politicians and government-provided information about the war. At the same time, as a recent Lawfare article notes, deepfakes can create a “liar’s dividend,” whereby those confronted with evidence of corruption and abuses of power can sow uncertainty and avoid accountability by claiming, “It’s fake.”28

While China and Russia have demonstrated an advanced capability to undertake strategic influence operations, the recent war in Ukraine demonstrates that it is possible for democracies to also generate strategic influence. This can be done by military and government institutions, as well as by professional and amateur war correspondents. War and competition in the twenty-first century will see an increasingly sophisticated approach to influence operations—at every level—and an increasing quantity of them.

A final element of this trend is the increasing attention being paid by military institutions to cognitive warfare, a concept with growing relevance in the modern security environment. Enemy states seek to continuously undermine the integrity of political processes in democratic societies, as well as related military-strategic aims, through the application of integrated strategies that coordinate political, military, economic, and information activities.29

It is important to draw a distinction here between traditional information operations and cognitive warfare.30 As Christoph Deppe has written, “Whereas information warfare centers on controlling the dissemination of information, cognitive warfare strategically aims to shape and manage the reactions of individuals and groups to that information.”31

It is an approach that China has been studying for some time. The most recent report from the U.S. Department of Defense about Chinese military capability notes that: “The PLA concept of cognitive domain operations (CDO) combines psychological warfare with cyber operations to shape adversary behavior and decision-making. Since at least the mid-2010s, the PLA has been incorporating the concept of cognitive warfare or CDO into PLA frameworks for conducting influence operations. While the concept of cognitive warfare appears to be PLA-specific, overall PRC [People’s Republic of China] influence operations reflect a whole-of-government approach to shaping the information environment.”32

The efforts described in Chinese literature as cognitive dominance operations have the purpose of achieving “mind dominance.” Given that the primary objective of military deception is to influence how enemy decision makers think, the concept of cognitive warfare is likely to play an increasingly important function in planning and achieving military deception aims. To build a capability for implementing this approach to war, China has invested in a range of human sciences, as the 2024 Pentagon report notes: “The PLA is exploring a range of ‘neurocognitive warfare’ capabilities that exploit adversaries using neuroscience and psychology.”33 It also employs new technologies such as AI, social media, and deepfakes.

Russia has also invested in cognitive operations. As Dima Adamsky describes in The Russian Way of Deterrence, Russia’s operations in the information sphere have traditionally been divided into cognitive-psychological (CP) and digital-technological (DT) activities, but there is an increasing convergence of the two forms of operation. The aim of these activities is to achieve “informational deterrence” and, in Adamsky’s term, “cumulative coercion.”34

One of the most chilling possibilities of this approach to warfare is that potential adversaries might be able to entirely circumvent military operations by using cognitive warfare to coerce entire societies into believing either that a war cannot be won or that their nation has no interest in challenging the strategic objectives of an adversary state.

While mass, precision-strike, and meshed civil–military sensor networks are crucial elements of contemporary warfare, the ability to influence and shape decision-making at all levels is also critical. This has been accepted by many Western military organizations, and some of the leading work outside China and Russia is being undertaken at NATO Allied Command Transformation. It has developed a cognitive warfare concept, explaining “cognitive” as referring to “the mental action or process of understanding, encompassing all aspects of intellectual function, including the subconscious and emotional aspects that drive a majority of human decision-making.”35

But in Western military organizations, cognitive warfare still presents multiple challenges. While it may provide military and national security decision makers with a method to navigate the complexities of modern warfare, it may also require new methods of recruiting, training, and developing individuals, as well as an evolution in how strategic and operational planning is conducted. Importantly, the assessment of strategic risk and opportunity in these kinds of operations remains underdeveloped in Western military organizations.

Military deception is an important component of contemporary strategic influence and cognitive dominance operations. The application of military deception seeks to influence the thinking and decision-making of enemy leaders. Consequently, the interaction of strategic influence, cognitive dominance activities, and military deception has the following implications:

First, cognitive warfare is likely to increasingly influence the planning and conduct of military deception and will eventually become the principal influence in these activities. Further work to develop the concept of cognitive warfare is required, as is research and doctrine development on the role of cognitive warfare in the conduct of military deception.

Second, the pervasiveness of social media users in many of the environments in which humans will fight means military commanders and planners must incorporate the threats and opportunities of social media into planning processes. Assumptions must be made in formal planning and decision-making about the level of visibility that social media might provide over military operations, and the levels of deception that might be required to prevent or minimize it. The war in Ukraine likely represents a peak in social media commentary about a single war.

Citizen commentators and influencers are now a prolific, constant presence in national and global discourse and the battle of ideas. The opinions of social media commentators, regardless of their expertise, can shift opinions in portions of the less-informed citizenry of democratic and authoritarian nations alike. These commentators can be deceived, however. In addition to the rise of citizen war correspondents using social media, there has been an emergence of curators. As “The New War Correspondents” notes, “People have taken the role of both aggregating and disseminating information to a large number of people in the city who follow them and who even send reports to them.”36 Further study of this phenomenon and its application for military deception operations might start with these curators and super-aggregators.

Third, the application of deepfakes by potential adversaries could impact the credibility of national leaders, and their communication with their citizens, in situations of competition and conflict. The Russians, early in this war, attempted to use a deepfake of President Zelenskyy surrendering to influence opinion in Ukraine and among its supporters.37 In 2023, Russian President Putin even addressed a deepfake version of himself at an event in Russia.38 The use of advanced AI to create such fakes will necessitate counter-deepfake technologies in military and other national security institutions. These will be needed to ensure that flooding the internet with deepfakes does not blunt the “cut through” of actual national strategic messaging and thereby dissuade populations from supporting national competition or war efforts.

Finally, given the propensity of potential adversaries to engage in deception, the conduct of counter-deception operations must assume a more central place in military operations. The capacity to recognize adversary deception mechanisms must be enhanced if the ability of authoritarian regimes to bluff and deceive democracies is to be degraded.

Trend 5: Ubiquitous Air, Sea, and Land Autonomy and Human-Machine Integration

An important principle in the design of future military organizations is to augment human physical and cognitive capabilities to generate greater mass, more lethal deterrent capabilities, more rapid decision-making, and more effective integration in the battlespace. At the same time, the marriage of human and robotic/algorithmic capacity might also result in more efficient training and education; improved strategy development; better development, execution, and measurement of influence campaigns; and better experimentation and testing of future force models, battlefield options, and other institutional challenges.39

Robotic systems, big data, high-performance computing, and algorithms are already being developed and deployed by military organizations in increasing numbers. Recently, the application of autonomous systems within existing human organizations and tactical approaches has been seen in Ukraine and Israel. From its drone operations, Ukraine has amassed a huge trove of video data, which is now being applied to train AI targeting algorithms to improve the success rates of drone strikes on the battlefield.40 Ukraine has also been employing AI for imagery analysis in strategic intelligence organizations, although, as the Chief of Ukraine’s military intelligence General Kyrylo Budanov has made clear, it still requires improvement.41 Israel, on the other hand, appears to have advanced even further. Before its war in Gaza, it developed AI models, called Habsora (the Gospel), which were able to rapidly generate hundreds of additional targets compared to human targeting processes.42

It is also apparent that both Ukraine and Russia are using humans teamed with algorithms in their domestic and international influence campaigns. These are activities that, because of their scale and sophistication in targeting, require both human inventiveness and machine power for generating mass influence.43

It is important to note that these new and disruptive autonomous systems (robots) and algorithms (AI) will not just be tools used by humans. In many cases, these technologies might act as full partners of human beings in the conduct of military missions. This will necessitate a change to many of the extant training and education philosophies in military organizations. This trend has two implications for military deception.

First, the marriage of human ingenuity and creativity with robots and algorithms in places like Ukraine, Russia, and Israel in the past three years portends significant improvements in the speed and quality of decision-making in future operations; this includes deception and counter-deception activities. The advantages of this approach will accrue to both friendly and enemy forces.

Second, the marriage of human and algorithmic cognitive abilities might be leveraged to develop more sophisticated (and possibly creative) approaches to military deception operations as well as the conduct of counter-deception activities against adversaries.

Trend 6: Faster and Better-Integrated Adaptation

Frank Hoffman proposes in Mars Adapting: Military Change During War that “the ultimate test of military preparation and effectiveness does not end once a war begins. On the contrary, history strongly reflects the enduring phenomena of learning and implementing change during war as well…The requirement that a force must adapt while it is in combat is built into the inherent nature of war.”44

Adaptation is a device to continuously build advantage in many areas while constantly negating enemy advantage. A systemic, strategic, and well-led approach to adaptation can give a nation, or alliance, greater power in both peace and war. At the same time, adaptation can be used against friendly forces to devastating effect if they do not understand it or fail to degrade the enemy’s ability to learn and adapt.

Since the beginning of the large-scale Russian invasion in February 2022, the most important individual and institutional behavior for Ukraine and Russia has been their ability to learn and adapt. This is an interactive fight because each side is learning based on the reactions of their adversary and then finding and implementing solutions to improve their effectiveness against that enemy. This process, which can be described as the Adaptation Battle, occurs at the tactical, operational, and strategic levels.

Adaptation has taken place in both the Ukrainian and Russian military institutions during the war, changing the structure, tactics, and training of both organizations. Both are now very different institutions from those that existed before the Russian invasion of 2022.

It has also taken place at multiple levels in both military institutions. Adaptation is not a singular or holistic process that takes place at one level of an institution. In any military institution, there will be multiple instances of adaptation occurring at any one time, and these will generally be occurring in different geographic areas (depending on threat) as well as at different levels within the hierarchical construct of a military force.

The effectiveness of adaptation that occurs in war, however, relies at least partially on the quality of pre-war foundations. While adaptation in combat is a natural reaction for military personnel who wish to “survive the next battle,” wider-scale adaptation is more effective where there are pre-existing processes and cultures for learning and sharing lessons within and between institutions. An institutional learning culture and an agreed method for learning and sharing lessons ensure that, once battlefield lessons begin to flow in wartime, there are ready pathways for employing them to improve training, education, tactics, strategy, equipment, and alliances, and thus the overall military effectiveness of an institution.

There is a relationship between adaptation and deception in at least two areas.

First, the speed of planning, decision-making, action, and assessment in many forms of military operations is increasing. Ukrainian battlefield commanders now describe an adaptation cycle on the battlefield in which drone operations evolve every two to three weeks and Russian ground tactics every two to three months.45 This is an outcome of the effects of hypersonic weapons, the potential for AI to speed up decision-making at multiple levels of command, and the increasingly rapid media cycles that influence political decisions. This also means an adversary will seek to speed up its learning and adaptation to overcome friendly forces’ efforts to generate a tempo of operations that overwhelms their decision-making capacity. Deception can play a role in slowing down the ability of the enemy to learn about friendly intentions, posture, locations, and capabilities, and therefore have an impact on their adaptive cycles.

Second, deception might be deliberately injected into enemy adaptation cycles. As Scott Gerwehr and Russell W. Glenn note in their 2003 report Unweaving the Web, “Covert/clandestine or other actions taken to introduce errors into the innovation process may significantly hamper adaptation.” Multiple actions might be part of this effort, including disinformation communicated to the enemy that purports to reveal a technical or tactical vulnerability. Or enemy deception measures may be allowed to go unchallenged so that flawed practices are replicated, hindering adaptation and providing friendly planners with a targetable vulnerability in the enemy system.46

For example, in early 2024 Ukraine injected fake stories about the reasons for its deployment of forces to Sumy, and about its inability to generate an offensive operation in 2024. As a result, the Russians focused on adapting to Ukrainian defensive initiatives on the eastern front rather than in the northeast near Kursk.

Before undertaking deception activities to interfere with enemy adaptation processes, an assessment would be required to gauge just how good friendly forces are at learning and adapting at the tactical and strategic levels. While Gerwehr and Glenn propose one model in Unweaving the Web, other models might be more appropriate in the circumstances faced by NATO forces in Europe.

In a recent discussion, a senior Ukrainian officer who had just completed four years in command of a tank brigade observed that his Russian enemy has become very good at deception. It is a rapidly evolving field, and what worked a year ago does not work now. He noted that while some commanders on both sides are poor at military deception, it is a crucial idea in how the Ukrainian Armed Forces operate. He paraphrased Sun Tzu in describing how Ukrainians think about deception: “If you are weak, make the enemy think you are strong.” For Ukraine, this has cognitive and technical elements.47

The rapid pace of learning and adaptation in modern operations has the following implications for the conduct of deception operations:

First, the rapid pace of adaptation in modern tactical operations will have an impact on measuring the success or failure of deception operations. The Ukrainians, Russians, Israelis, and Hezbollah have all demonstrated this faster learning and adaptation trait in the past three years, and it should be assumed that this is the new pace of learning for any adversary we face. Consideration should be given to rapid-assessment methodologies that enhance measurement of the impact of deception, and these might then be leveraged for measuring the success and failure of medium- and long-term deception activities.

Second, another implication of the quickening pace of tactical operations is that there will be minimal time for separate or sequential planning activities for supporting plans such as deception. Therefore, deception planning must be integrated fully within normal tactical planning processes.

Third, the shortening of the detection-to-destruction timeframe, as evidenced in the Russo–Ukraine War, increases the imperative for deception and concealment operations. It will demand the seeding of the battlespace with more dummy targets—including physical decoys and fake networks—than an adversary can deal with.

Fourth, strategic deception operations may take months or years to come to fruition. Building a culture of strategic patience in both political and strategic military leaders will be critical to the success of such deception activities.

Finally, contemporary military deception activities will demonstrate their own learning and adaptation cycles. Both Ukraine and Russia have learned this lesson during the past three years, with one example being that placing decoys alone no longer works. They must be accompanied by fake heat sources, periodic transmissions, and track marks to and from the decoy. This constitutes a deeper level of investment in deception and must be integrated into broader institutional learning and adaptation systems.

Adaptation is important for the cognitive aspects of war. It underpins learning about enemy misinformation and the impact it is having on friendly populations and political systems. Adaptation is crucial not just to winning the war, but also to winning the “story” about the war. It is also vital for military deception operations. As the next chapter describes, military organizations have a range of areas where they will need to adapt in order to be successful at military deception operations in the coming decades.

Citations
  1. Mick Ryan, War Transformed (U.S. Naval Institute Books, 2022), 93.
  2. “Fire Information for Resource Management System,” NASA, source.
  3. “Why Russian Radios in Ukraine Are Getting Spammed with Heavy Metal,” The Economist, March 28, 2022, source.
  4. Kateryna Bondar, Does Ukraine Already Have Functional CJADC2 Technology? (Center for Strategic and International Studies, 2024), source.
  5. Bondar, Understanding the Military AI Ecosystem in Ukraine, 2–3, source.
  6. Samuel Bendett, The Role of AI in Russia’s Confrontation with the West (Center for a New American Security, 2024), source.
  7. See Gina Cavallaro, “The Transparent Battlefield: Combat Training Centers Sharpen Unit Tactics for High-Tech Fight,” Association of the United States Army, June 25, 2024, source; Dorsel Boyer and ISC Robert K. Becker, “How Ukraine Overcame the Transparent Battlefield to Achieve Operational Surprise in Kursk,” TRADOC G2, September 19, 2024, source.
  8. An important example of this was the Hamas deception plan executed in the lead-up to its October 7, 2023, attack on southern Israel. See Yoav Zitun, “How Hamas Outsmarted Israel: The Deception That Led to the October 7 Intelligence Breakdown,” Ynet, February 28, 2025, source.
  9. Ryan, War Transformed, 83.
  10. Daniel Byman and Emma McCaleb, “Understanding Hamas’s and Hezbollah’s Uses of Information Technology,” Center for Strategic and International Studies, July 31, 2023, source.
  11. Jack Watling, Oleksandr V Danylyuk, and Nick Reynolds, Preliminary Lessons from Ukraine’s Offensive Operations, 2022–23 (Royal United Services Institute, July 2024), 31, source.
  12. This evolution in Russian ground tactics was raised multiple times by Ukrainian brigades during co-author Mick Ryan’s March 2025 research visit to Ukraine. See also Michael Kofman, Assessing Russian Military Adaptation in 2023 (Carnegie Endowment for International Peace, October 2024), source; and U.S. Army, ATP 7-100.1: Russian Tactics (February 2024), source.
  13. Giorgio Di Mizio and Michael Gjerstad, “Ukraine’s Ground-Based Air Defence: Evolution, Resilience, and Pressure,” International Institute for Strategic Studies, February 24, 2024, source.
  14. Roman Romaniuk, “They Set Targets Deep Inside Russia on Fire: The Untold Story of the 14th Unmanned Aerial Vehicle Regiment,” Ukrainska Pravda, March 17, 2025, source.
  15. A description of these Russian counter-strike activities was provided by Ukrainian military intelligence chief, General Kyrylo Budanov, during a research visit to Ukraine in March 2025.
  16. Lara Seligman, Paul McLeary, Alexander Ward, and Veronika Melkozerova, “Ukraine Uses Secretly Shipped U.S. Missiles to Launch Surprise Strike,” Politico, October 17, 2023, source.
  17. In Australian Army doctrine, the combination of these three is called “fighting power.”
  18. Peter Singer and Emerson Brooking, LikeWar: The Weaponization of Social Media (Houghton Mifflin Harcourt, 2018), 273.
  19. Mick Ryan and Marcus Thompson, “Social Media in the Military: Opportunities, Perils and a Safe Middle Path,” Grounded Curiosity, August 21, 2016, source.
  20. “Digital 2025 Global Overview Report,” We Are Social, source.
  21. And they can do it quickly. A study by Massachusetts Institute of Technology researchers found that social media has enabled false news to travel faster and penetrate further than true stories. Analyzing major news stories over a 10-year period, including 126,000 stories and 3 million tweets, the study found that false information outperforms true information. See Soroush Vosoughi, Deb Roy, and Sinan Aral, “The Spread of True and False News Online,” Science, March 9, 2018, source.
  22. U.S. preemption activities, before the war, are described in Jessica Brandt, “Preempting Putin: Washington’s Campaign of Intelligence Disclosures is Complicating Moscow’s Plans for Ukraine,” Brookings, February 18, 2022, source. Post-invasion preemption with intelligence is explored in Shannon K. Crawford, “Preemptive, Public U.S. Strikes Winning Intelligence War with Russia: Analysis,” ABC News, April 15, 2022, source; and Douglas London, “How Intelligence Is Helping to Win the Unthinkable War with Russia,” The Hill, March 30, 2022, source.
  23. Jennifer Kavanagh, “The Ukraine War Shows How the Nature of Power Is Changing,” Carnegie Endowment for International Peace, June 16, 2022, source.
  24. Alan Suderman and Ali Swenson, “Right-Wing Influencers Were Duped to Work for Covert Russian Operation, U.S. Says,” Associated Press, September 6, 2024, source.
  25. Andrés Monroy-Hernández, et al., “The New War Correspondents: The Rise of Civic Media Curation in Urban Warfare,” Proceedings of the 2013 Conference on Computer-Supported Cooperative Work: 1443–1452, source.
  26. This combination of new and old ways of reporting on wars is explored in Gillian Vernick, “Visual Forensics Merge with Traditional War Reporting in ‘First Social Media War,’” Reporters Committee for Freedom of the Press, March 7, 2022, source.
  27. Christopher Warren, “Trending in the Trenches: Social Media is Giving Birth to a New Kind of War Journalism,” Crikey, March 7, 2022, source.
  28. See Daniel Byman, Daniel Linna, and V. S. Subrahmanian, “Should Democratic Governments Use Deep Fakes?,” Lawfare, May 9, 2024, source; and Josh A. Goldstein and Andrew Lohn, Deepfakes, Elections, and Shrinking the Liar’s Dividend (Brennan Center for Justice, January 23, 2024), source.
  29. Christoph Deppe and Gary S. Schaal, “Cognitive Warfare: A Conceptual Analysis of the NATO ACT Cognitive Warfare Exploratory Concept,” Frontiers 7 (2024), source.
  30. Robert “Jake” Bebber, Cognitive Competition, Conflict, and War: An Ontological Approach (Hudson Institute, May 2024), 9, source.
  31. Deppe and Schaal, “Cognitive Warfare: A Conceptual Analysis of the NATO ACT Cognitive Warfare Exploratory Concept,” source.
  32. U.S. Department of Defense, Military and Security Developments Involving the People’s Republic of China, 2024, 37, source.
  33. U.S. Department of Defense, Military and Security Developments Involving the People’s Republic of China, 2024, 27, source.
  34. Dima Adamsky, The Russian Way of Deterrence (Stanford University Press, 2024), 40–57.
  35. “Cognitive Warfare: Strengthening and Defending the Mind,” NATO Allied Command Transformation, April 5, 2023, source.
  36. Monroy-Hernández et al., “The New War Correspondents: The Rise of Civic Media Curation in Urban Warfare,” source.
  37. Tom Simonite, “A Zelensky Deepfake Was Quickly Defeated. The Next One Might Not Be,” Wired, March 17, 2022, source.
  38. Peter Suciu, “Putin’s Deepfake Doppelganger Highlights the Danger of the Technology,” Forbes, December 15, 2023, source.
  39. Ryan, War Transformed, 83–4.
  40. Max Hunder, “Ukraine Collects Vast War Data Trove to Train AI Models,” Reuters, December 21, 2024, source; David Kirichenko, “The Rush for AI-Enabled Drones on Ukrainian Battlefields,” Lawfare, December 5, 2024, source.
  41. Mick Ryan, “What I Learnt About the Future of War in Ukraine This Week,” Australian Financial Review, March 14, 2025, source.
  42. Elizabeth Dwoskin, “Israel Built an ‘AI Factory’ for War. It Unleashed It in Gaza,” Washington Post, December 29, 2024, source.
  43. The literature on this part of the war remains immature. However, some good sources include Kavanagh, “The Ukraine War Shows How the Nature of Power Is Changing,” source; Michaela Dodge, Russia’s War in Ukraine and Implications for Its Influence Operations in the West (National Institute for Public Policy, June 7, 2022), source; and Suzanne Smalley, “Russian Information Operations Focus on Dividing Western Coalition Supporting Ukraine,” CyberScoop, July 7, 2022, source.
  44. Francis G. Hoffman, Mars Adapting: Military Change During War (U.S. Naval Institute Press, 2021).
  45. Mick Ryan, “Dispatch from Ukraine: The Adaptation Battle Intensifies,” Interpreter (blog), Lowy Institute, March 17, 2025, source.
  46. Scott Gerwehr and Russell W. Glenn, Unweaving the Web: Deception and Adaptation in Future Urban Operations (RAND Corporation, 2003), 55.
  47. Author interview with senior officer in Ukrainian Ground Forces, in Kyiv, March 10, 2025. Name withheld by request.