The Pentagon is an illustrious palace with neon Futura-fonted signs that beckon visitors to try the jet-shaped chocolates available in the commercial center, which lies somewhere beyond the rows of desktops loaded with Windows 1.0 and the secret room where the military brass authorize drone strikes.
At least, this was my perception of the Pentagon after having attended the 2017 Future of War conference. “The defense department, broadly speaking, is run in many ways like a 1980s corporation,” said Eric Schmidt, who, among other titles, is the executive chairman of Google-parent company Alphabet, Inc. and chairs the Pentagon’s Defense Innovation Advisory Board, a sharp cohort of tech industry superstars. “There’s not very much software being used, and it feels that way.”
Yet despite the image of the Pentagon—and by extension, aspects of US military intelligence—being of another era, it will certainly be key to the future. The annual conference, now in its third year, brought together defense leaders, both military and civilian, to tackle innovation-related military challenges. By focusing on military innovation, specifically with respect to the US military, the diverse line-up of panelists and participants offered a glimpse of an alternative narrative of history, one that places technology as the driver of culture, politics, and war, rather than the other way around.
In his discussion with Schmidt, Tom Ricks, Senior Advisor on National Security at New America’s International Security Program, said in response to Schmidt’s statements on the ability of technology to make the human more efficient, “It sounds to me like you’re describing a terrific war of attrition in which victory goes to the side with the ability to sustain itself economically and demographically.”
While Ricks didn’t belabor the point, he was certainly onto something. Perhaps in future wars, victory will also lie with the side that has a stronger understanding of how the world’s poor, mostly brown, smartphone-gripping populations engage with technology. Military readiness will likely depend on the ability of the services to anticipate alternative uses of technology, rather than merely react to them. And it’s worth noting that those rules of military victory would make recent revelations about secret Facebook groups where Marines (and possibly other branches) sexually humiliate their female colleagues especially damning, in terms of demographic sustenance and the value of human capital.
So in other words, when we’re talking about what the future of war will look like, it might look like the social media that’s already all around us. And it may come down to how, and for what purposes, it’s weaponized. This was the subject of a panel moderated by Peter Singer, a strategist and senior fellow at New America. Conceptually, social media is not as monolithic as the panel might indicate. Perhaps understanding how other regions and demographics engage with communications technology, and further, how foreign governments have conducted information warfare in the past, would help experts predict how foreign militant groups might weaponize communications technology to conduct information warfare in the future.
Indeed, tracking the development of technology is a tricky but necessary component of creating the proper conditions for innovation, which experts at the conference argued the military desperately needs. It's even trickier trying to determine how an individual, or a military, might use new (or old) technologies strategically—or against you. There are many variables at play: geopolitics, sociology, demographics, and of course, history.
To complicate matters further, technological innovation is nonlinear, so there’s also an argument to be made that it’d be imprudent to focus on innovation at the expense of implementing and imagining uses for old technologies. Or put differently, as much as we want to make observations about how ISIS is using social media today, or will use it tomorrow, we also need to look back to other innovations in communications technology. According to historian David Edgerton in his book Shock of the Old, our analysis ought to begin with how we implement a new technology, rather than the invention itself, because often the realization of a technology’s potential comes decades after the initial development. For example, in terms of drone use, countries like the United States, Iran, and Israel have been experimenting with unmanned aircraft technology since the late 20th century, but the United States wouldn’t conduct its first drone strike until 2002. Even today, only nine countries have used armed drones in combat (depending on whom you ask), and norms of use are still being established.
Schmidt also suggested that the military is lacking radical innovation, which may be true, but probably won’t help the United States win more wars. If anything, it might just extend them further.
To that effect, the panel on “Moral Injury” was perhaps the most sobering panel of the event for speaking more frankly on the subject of death and trauma in and around perpetual war. Those who believe that drone strikes can be precise view this execution technology as a progressive step in how we conduct wars. But a look at lethal injection—one of the most recent innovations in execution technology, introduced in the United States and adopted around the world shortly after—or even the history of how we slaughter swine, reveals that the future of war is far grimmer than a conference can allow. The truth is, innovation in war does not typically lead to a reduction in the rate of killing. It just displaces it, moving it farther onto the fringes.

Though it’s impossible to predict the future of war, ruminating on the subject with the industry’s leading scholars certainly isn’t futile. But perhaps we are beyond looking at innovation in war in terms of technologies and should focus more on how populations are using them. After all, the next game changer in war could be something that’s already in the military’s arsenal, waiting to be implemented in a new way.