Table of Contents
- Key Points
- Summary and Introduction
- Section One: What are Cybersecurity Jobs?
- Section Two: How Do We Teach Cybersecurity?
- Section Three: How Do We Measure and Communicate Competence?
- Section Four: What is the Role of Government in Cyber Workforce Development?
- Conclusions
- Appendix: Unanswered Questions in Cybersecurity Workforce Empirics
Key Points
What Are Cybersecurity Jobs?
Cybersecurity jobs are heterogeneous
The pool of cybersecurity jobs encompasses a broad range of work, which often overlaps with other fields. Acknowledging and delineating the variations in cybersecurity jobs allows policymakers and stakeholders to develop a greater variety of solutions tailored to specific needs. Some frameworks exist to categorize this work, but further refinement is needed.
Degree and experience requirements
Reflexive requirements for degrees and work experience artificially narrow the pool of potential employees. Bachelor’s degrees should not be the only entry point into the field, particularly one that prizes on-the-job experience. For jobs where a bachelor’s degree is necessary or enables long-term career growth, conventional degrees can be improved by incorporating work-based learning.
Security clearances and jobs in U.S. government
The intelligence community and military offer some of the few structured entry points into cybersecurity careers, but dependence on these pathways means that problems with the security clearance process have an outsized impact on the workforce.
The future of cybersecurity jobs
Technological advancements will not fix the workforce gap. Rather, emerging technologies will change the nature of cybersecurity work and requirements for the workforce.
How Do We Teach Cybersecurity?
Organizing higher education around an interdisciplinary field
To develop cybersecurity expertise not just in computer science but also in areas like law, policy, healthcare, and finance, academic decision-makers and the policymakers who define incentives in higher education should treat cybersecurity not as a single, monolithic discipline, but rather as a field that cuts across many disciplines and looks very different in each.
Mandates for higher education: teaching, research, and sustainability
Cybersecurity can be difficult to teach in a classroom, which exacerbates tensions between competing priorities in higher education. Administrators must balance the university’s mandates to facilitate research and prepare students for their future jobs with the need to ensure the institution’s financial sustainability. Policymakers who set incentives for higher education must reward decisions that lead to a stronger cybersecurity workforce.
Learning cybersecurity outside of higher education
A diverse array of alternatives to conventional education could enable different kinds of learners to transition successfully into cybersecurity jobs. These alternatives vary in maturity, and smart policies could guide their development toward the best outcomes for students and employers.
Apprenticeships in cybersecurity
Policies to support the growth of apprenticeship as a model for cybersecurity education could have a profound, positive impact on connecting talented individuals with open jobs.
How Do We Measure and Communicate Competence?
Industry certifications
Certifications provide a theoretical framework that could be used to create an entry point into the field, but hiring patterns suggest that certifications serve as a proxy for work experience more than as an indicator of competence. Coupled with the expense of training and testing, this limits the effectiveness of certifications in creating additional entry points into cybersecurity jobs.
Other ways to demonstrate cybersecurity competence
Venues like online platforms and cybersecurity competitions could add to the number and variety of mechanisms for demonstrating competence, but they are not yet effectively scalable. Carefully designed hiring policies and funding mechanisms could incentivize their growth and maximize their utility.
What is the Role of Government in Cyber Workforce Development?
Is the U.S. government responsible for growing the nation’s cybersecurity workforce?
The U.S. government has a unique responsibility to enable and incentivize growth in the cybersecurity workforce because an inadequate workforce exposes the nation to serious consequences for economic and national security.
What are the policy options available for building a stronger cybersecurity workforce in the United States?
Policymakers at all levels of government can fund research and program development, set their own spending priorities to support particular pathways, facilitate and incentivize opportunities for collaboration among stakeholders, lead by example, and more.
What can the United States learn from cybersecurity workforce development abroad?
Policymakers can look to other governments for examples of varying solutions to the cybersecurity workforce challenge, but they must recognize that many of these solutions cannot function properly in a U.S. context without significant adaptations.
The role of other stakeholders
Businesses and other actors across the cybersecurity community can and should recognize that improving the overall state of the workforce serves their own long-term success, though it is unreasonable to expect them to act out of altruism alone to better align cybersecurity education with jobs. Here too, policymakers can incentivize and educate to reduce friction in implementing novel solutions.