Jan. 25, 2017
“Where’s this thing going anyway?” I asked myself after a particularly frustrating day at work.
It was late 2005, and I had just spent the first few months of my relatively new career in the Department of Defense and the Intelligence Community as an intelligence analyst focusing on Information Assurance and Computer Network Operations. No one called it cybersecurity then. In fact, no one really cared about anything that wasn’t counterterrorism.
Having seen the smoke from the Pentagon on 9/11 from my college campus, I decided to join the national security workforce right after I graduated from college. My decision was not the typical one for a young, newly minted female graduate with a liberal arts degree. Most of my colleagues were military and mostly male. I spent my days with a small team analyzing the information warfare programs of nation states and other actors looking to threaten the DoD or US Government.
Most of my colleagues were engineers, computer scientists, or Information Technology types. But a few others, like me, were international relations, history, and political science majors. This meant that we infused a different, human-centered perspective into our work: we knew that behind every attack was a human (or group of humans), and understanding their motivations and level of sophistication was as critical to defense and policy prioritization as determining the vulnerabilities in our systems. Our reports were geared towards defense and policy-makers who needed to make strategic and operational decisions: decisions that just couldn’t be made by reviewing ones and zeros.
But our perspective was unique: most of our direct customers came from that traditional IT space, a place where it mattered less who was behind the bad activity, and more that it had happened. “We see it, we stop it,” one Marine captain told me, summing up their strategy, after I had given a briefing on threat actors to his team.
There were many days I questioned whether it was worth it to stick with the field. Every report, every briefing, every exercise in which we probed the human motivations of those threat actors was dismissed by network defenders and their superiors as irrelevant to the mission of defending our nation’s cybersecurity. But I stuck with it, and I saw the field evolve into a new profession with a workforce and leadership that recognized the need for a comprehensive, contextual understanding of the “who, what, where, and why” of threats.
Those were the perspectives I looked for when, as my career advanced, I had the opportunity to hire. Interestingly, some of the best threat analysts I recruited were former lawyers, teachers, and linguists — almost none with a cybersecurity or computer science degree. In fact, the candidates who did have a technical degree often lacked some of the broader skills, such as the ability to communicate complex concepts to lay people, needed to make them successful. That seems to hold true across the industry: a recent study by CSIS and Intel Security found that only 23% of the companies surveyed think educational programs are preparing students to enter the industry.
The reality of cybersecurity in today’s world is that it is more multidisciplinary than ever. Effective cyber operations require coordination and integration between disciplines like vulnerability management, threat intelligence, security operations, forensics, malware analysis, and incident response. And beyond that, executives need to make informed business decisions — whether legal, policy, or investment — based on the cyber risks to their enterprises.
It’s increasingly apparent that cybersecurity is a new economy skill for the business executive, technical whiz, and policy wonk alike. I never thought I would see the day when a single cybersecurity-related story, let alone so many, would regularly grace the covers of major news outlets. Whether it’s criminal ransomware attacking a healthcare provider, the Russian hack into the DNC, or the biggest breach in history at Yahoo, the impact of such events further highlights the need for people with skills across the entire spectrum of how an organization functions.
In fact, that’s what makes the field so exciting now. It’s changing fast — and it desperately needs more smart people to guide and shape its future. A study from BurningGlass, CompTIA, and the National Initiative for Cybersecurity Education (NICE) found that there are currently 128,000 openings for Information Security Analysts in the U.S., but only 88,000 workers currently employed in those positions. It takes all kinds of backgrounds to successfully build a cybersecurity workforce, from the human-centric analyst to the reverse engineer. So, what are you waiting for?