Section Two: How Do We Teach Cybersecurity?

The default system for education in cybersecurity—a bachelor’s degree—has much to recommend it; it is already the cultural norm throughout much of the upwardly-mobile American workforce. But the structure of a typical bachelor’s degree program is not always ideal for teaching cybersecurity. Options outside of higher education—e.g. bootcamp-style training programs or the military—are also not universally appropriate. This conundrum offers a unique opportunity to pursue work-based learning options that would be remarkably innovative in the cybersecurity community.

This report does not delve into education at the kindergarten through twelfth grade (K12) level, but that is not to imply that it is irrelevant here. Research indicates that students’ attitudes towards STEM tend to decline dramatically around the ages of 10-14.1 Accordingly, a central challenge in long-term cybersecurity workforce development is engaging students in STEM fields before and during that time frame, but the topic does not end there. In order to benefit from emerging interest in STEM fields, schools must have the teachers and resources to provide instruction in these areas. Only 40 percent of K12 principals say that their school offers a computer science class that teaches a coding language, but 90 percent of parents say that opportunities to learn computer science are a good use of school resources.2 Clearly there is room to expand.

An interesting feature of discussions on K12 cybersecurity education is that the policy levers to shape priorities and incentives exist predominantly among state, local, territorial, and tribal governments. Meanwhile, an active community of stakeholders is working to develop mechanisms to encourage and promote education that could lead to long-term development of the cybersecurity workforce. This complex ecosystem overlaps with the discussion in this report. However, because the policy solutions are quite different at the K12 level, this report simply acknowledges the important influence that these years have on later education and opportunities, and begins exploring the educational system at the level of higher education.

Organizing Higher Education around an Interdisciplinary Field

To develop cybersecurity expertise not just in computer science but also in areas like law, policy, healthcare, and finance, academic decision-makers and the policymakers who define incentives in higher education should consider cybersecurity not as a single, monolithic discipline within higher education, but rather as a field that cuts across—and looks very different in—many disciplines.


Among the characteristics that make cybersecurity difficult to teach in a degree program is its nature as a “multidisciplinary problem, touching on policy issues, economic incentives, and public and business awareness and education, along with new technical challenges.”3 Programs that teach cybersecurity effectively must therefore incorporate elements from many fields, which could encourage innovative approaches from the outset. In theory, this basis in outside-the-box thinking could provide an excellent starting point for further innovation. In practice, however, it tends to be very difficult to implement in higher education systems historically segmented into separate departments and colleges within a single university.

Where they are taught, cybersecurity programs are often folded into another field within computer science or engineering departments. This is appropriate insofar as cybersecurity exists within the context of technical systems—computers and networks—but it can also limit the growth of cybersecurity as an educational priority outside these departments or as an independent field. Organizations like ABET (Accreditation Board for Engineering and Technology) and the National Security Agency and Department of Homeland Security’s Centers of Academic Excellence programs provide useful frameworks around the content to be taught, but do not answer these more fundamental questions about cybersecurity as a field of study.4

Computer science and engineering disciplines have historically been the focus of cybersecurity curriculum development. One particularly well-developed example of such an effort is the Joint Task Force on Cybersecurity Education, a “collaboration between major international computing societies.”5 The curriculum is meticulously developed and reviewed by academic experts. While the task force very clearly acknowledges cybersecurity’s interdisciplinary nature, the curriculum is targeted at an understanding of the field that “advances cybersecurity as a new computing discipline and positions the cybersecurity curricular guidance within the context of the current set of defined computing disciplines,” which include computer engineering, computer science, information systems, information technology, and software engineering.6 Quite sensibly, the Joint Task Force has housed this effort squarely in computing disciplines.

The curriculum development effort should not end here. Other academic disciplines could emulate this work. While the Joint Task Force has incorporated legal, economic, and political considerations into its curriculum guidance,7 that incorporation does not replace curriculum development efforts for cybersecurity in each of those disciplines. The draft curriculum proposes teaching a reasonable array of policy principles for a student who expects to do research in a computer science department, but certainly not for a student interested in working in a legislator’s office informing data privacy policy. Educating the future policy wonk requires a very different cybersecurity curriculum8 and maybe even a different pedagogic framework.9 Given that cybersecurity jobs are increasingly crossing into other domains like finance, healthcare, and law, the same could be said for any number of other examples.

Although some colleges and universities have taken on the challenge of developing such interdisciplinary programs, practical considerations like interdepartmental cost sharing, program equities, and enduring assumptions about where cybersecurity coursework should be anchored often slow the development of cybersecurity as a cross-cutting field (or meta-field, as scholars have termed it10) applicable and accessible to many disciplines. The work that academics in computer science and engineering are doing to identify best practices in cybersecurity curricula is invaluable. However, such efforts should be part of a larger ecosystem of offerings that teach the aspects of cybersecurity most relevant to industries ranging from law to hospitality to medicine to policy and much more.

Mandates for Higher Education: Teaching, Research, and Sustainability

Cybersecurity can be difficult to teach in a classroom, which exacerbates tensions between competing priorities in higher education. Administrators must forge a path between the university’s mandates to facilitate research and to prepare students for their future jobs, while also ensuring the institution’s financial sustainability. Policymakers who set incentives for higher education must reward decisions that lead to a stronger cybersecurity workforce.


Apart from the theoretical challenge of finding a home for cybersecurity programs in a university setting, the discipline also creates practical challenges. Cybersecurity changes quickly. As New York Times reporter and Harvard University adjunct lecturer David Sanger puts it, “The hardest thing about teaching anything about cybersecurity is the same thing that’s the hard part about writing and reporting about cybersecurity, which is, it’s moving so fast.”11 This makes it difficult to keep conventional classroom education up to date, especially when curricula can take months to develop and approve. Automated cybersecurity attacks “are happening in microseconds… so today all we can do is patch and pray,” according to Dr. Arati Prabhakar, formerly the head of the Defense Advanced Research Projects Agency (DARPA) and of the National Institute of Standards and Technology (NIST). She adds, “we are looking for a fundamentally different way to get faster than the pace of the growth of the threat.”12 In an already rapidly developing industry, cutting-edge technologies give way to newer tools in the span of weeks and months, a pace that is prohibitively difficult to match in syllabi developed over much longer timelines.

As difficult as maintaining a current syllabus can be, finding teachers with the experience to teach the most current techniques and tools is equally challenging. Applied courses are often taught by instructors and adjunct professors, who are expensive to hire given the competition for experts with these skill sets. Tenured faculty are generally focused on foundational research within a narrow specialty, not the newest bit of technology.

Maintaining a focus on foundational education and research allows faculty to cultivate and attract top-tier graduate students to aid in that research, which fosters a fertile environment for the research and development that keeps cybersecurity on the cutting edge. Educators are also charged with the mandate that “students must be encouraged to think and learn, with the understanding that specific content isn’t as important as it would be in training scenarios.”13 These functions are critically important to the university, to the general health of the cybersecurity research community, and to the workforce writ large, but do not answer the question of where students will learn the tools and skills that will be required to enter a career in industry.

This tension between a university’s teaching and research mandates is part of a much larger conversation on the role of higher education in society. Should universities (and research universities in particular) exist to train the workers who will build the future economy, or should their purpose be to cultivate the hotbeds of innovation and deep research that fuel growth and stand as a hallmark of the United States’ comparative economic advantage? This question is not a central focus of this report, but understanding the role of the university is important context for considering the potential impact of higher education on the cybersecurity workforce.

Highlighting the crux of this uncertain role for institutions of higher education, Arizona State University (ASU) President Michael Crow and ASU Senior Research Fellow William B. Dabars emphasize that “the inherent limitations of the present model [of research universities] attenuate the potential of this set of institutions to educate citizens in sufficient numbers and address the host of challenges that beset the world.”14 They write about universities’ limitations generally, but in the cybersecurity context, this line of thinking leads to real questions about whether universities can adapt to create capable workers in addition to highly trained researchers. In an industry that desperately requires both types of experts—and many types in between—developing a spectrum of educational offerings is a particularly valuable strategy. Much as the medical field has different educational paths for surgeons, pharmaceutical researchers, technicians, and nurses, a thriving cybersecurity community will require a breadth of educational paths.

The role of the university does not only oscillate between research and teaching mandates; economic considerations also factor into any university’s operations. Given the demand from recent baccalaureate graduates and mid-career job changers for opportunities to break into a lucrative cybersecurity career, developing a professional master’s in cybersecurity may seem like a sound investment for any university administration. However, higher education occasionally finds itself walking an uncomfortable tightrope when it comes to this type of professional graduate degree program.

Such programs are known for their profitability. Adjusted for inflation, tuition for an average graduate degree program cost $6,603 in 1989 and $14,398 in 2010, more than doubling in real terms.15 This steep rise in cost reflects greater demand for such degrees, which has created a very tempting revenue stream for administrators at often funding-starved schools.16 The resulting incentive structure can encourage universities to provide expensive professional graduate degrees designed for profit17 rather than for beneficial student outcomes. With so little data available on what kind of education or training yields the best long-term student outcomes in a cybersecurity career—not to mention lukewarm industry attitudes towards skills learned in the classroom rather than on the job—universities offering a professional master’s in cybersecurity must carefully weigh financial priorities against social responsibility.

Learning Cybersecurity Outside of Higher Education

A diverse array of alternatives to conventional education could enable learners to successfully transition into cybersecurity jobs. These alternatives exist in varying degrees of maturity. Smart policies could guide their development toward the best outcomes for students and employers.


Anecdotally, many cybersecurity employees in industry are trained in either the military or the intelligence community, rather than passing through academia. Statistics describing the exact scale of this pattern are hard to come by. There are good reasons why military and intelligence agencies would be reluctant to publish personnel statistics, but consequently it is very difficult to know what proportion of the cybersecurity workforce passes through government service. The Global Information Security Workforce Study says 16 percent of hiring managers prefer to recruit among former and active military, and 30 percent of the workforce comes from a non-technical background, which can include “business, marketing, finance, accounting, or military and defense.”18 Especially useful questions to answer would be what percentage of the workforce was trained in the military or intelligence community, and what the long-term work roles and outcomes for those individuals are.

Bootcamps and skills-based short courses provide other potential pathways into the workforce. They have a long history of teaching workplace-relevant skills from stenography to coding.19 Despite this long history, not all examples are positive. Some bootcamps face criticism for over-promising and under-delivering,20 a trend that warrants a note of caution among proponents of skills-based courses.

Cybersecurity bootcamps do not yet have the numbers seen among their coding bootcamp cousins, but they are already gaining a profile as a viable alternative training option.21 It is uncertain whether they will remain on this promising trajectory or struggle with the challenges that have beset many coding bootcamps. Among coding bootcamps, surveys indicate that 60 percent of completers already had a bachelor’s degree, and graduates averaged 6.8 years of work experience,22 which does not seem to be the entry-level path to success one might hope for. The same study indicated that 43 percent of coding bootcamp students were women,23 suggesting that this pathway may help break down longstanding gender imbalances in information technology (IT) disciplines. In the previous year’s survey, 79 percent of students had at least a bachelor’s degree and averaged 7.6 years of work experience,24 so the year-over-year decline perhaps suggests a trend towards a more viable entry-level pathway. What this means for cybersecurity bootcamps is unclear.

While skills-based short courses may remain problematic for early-career job seekers, they do offer some promise to workers transitioning from other less-engaging or less-lucrative careers and for employers seeking to upskill their current workforce. For example, they could allow IT support staff to specialize in network security or other cybersecurity disciplines.

In the United Kingdom, industry association CompTIA has already invested in this market with support from the U.K. government. Its Cyber Ready retraining program targets a wide range of applicants (e.g. parents, IT hobbyists, graduates) to provide them with the skills needed to enter cybersecurity careers.25 Such programs could expand in the United States to offer a means for non-cybersecurity professionals to make their way into the industry.

Digital options in education and training also offer upskilling and retraining pathways outside the classroom. While experts debate whether massive open online courses (MOOCs) will replace conventional college degrees more generally,26 providers remain optimistic about online education’s promise for upskilling and retraining.27 In cybersecurity, the popularity of these programs has driven rapid growth for specialized providers (e.g. Maryland-based Cybrary).28 Data-driven insight into the long-term outcomes of students and workers trained through such programs would be an excellent area for future research.

Apprenticeships in Cybersecurity

Policies to support the growth of apprenticeship as a model for cybersecurity education could have a profoundly positive impact on connecting talented individuals with open jobs.


Given the challenges of teaching cybersecurity in a conventional higher education setting, and recalling that survey data suggests industry experts do not feel that students are graduating with the skills needed to be successful in their new roles,29 what is the preferred training option? Information on what is missing from classroom education is largely anecdotal, and suffers from a community-wide lack of metrics on what exactly employers find useful in the workplace, but the emphasis on practical experience as a part of the learning process is a frequent refrain across the industry.30

Applied skills are often omitted in classrooms in favor of the theoretical principles that undergird the rapidly-changing tools used in workplaces. Teaching those evolving tools raises concerns that they may be obsolete by the time a student graduates. This approach is entirely appropriate when training future academics—a key component of any university’s mandate—but when training future industry workers (or public sector employees), this strategy ultimately relies on employers teaching new graduates the skills needed to be productive in the workplace. Employers, in turn, opt not to hire new graduates because those graduates have no training on the tools and techniques most immediately relevant to the work.

To bridge this gap, some educators and policymakers are turning to apprenticeships. Individual apprenticeship programs vary drastically from case to case, but all include four criteria:

  • Paid, structured, on-the-job training combined with related classroom instruction;
  • Clearly defined wage structure with increases commensurate with skill gains or credential attainment;
  • Third-party evaluation of program content, apprenticeship structure, mentorship components, and quality standards; and
  • Ongoing assessment of skills development culminating in an industry-recognized credential.31

The pool of existing registered programs in cybersecurity is still small, likely consisting of fewer than a few dozen programs with active, paid apprentices focused specifically on cybersecurity. A joint project at New America is tracking the emergence and development of these programs. Data on the programs of which we are aware is available at https://www.newamerica.org/cybersecurity-initiative/reports/cybersecurity-apprenticeships-tracker/. Early proponents of the model advocate for its ability to tailor learning to precisely fit workforce needs,32 as well as its ability to adapt to a rapidly-changing environment.33

Degree programs in higher education are not obsolete or without utility. Indeed, many innovative and forward-thinking programs have emerged out of universities and community colleges. Moving away from or dismissing incumbent educational systems would miss a critical opportunity to harness good work already being done. One argument in defense of conventional degree programs is their long-standing popularity in the United States: for cybersecurity workforce solutions to be effective, they need to scale to the order of tens and hundreds of thousands of people, and higher education has the capacity to reach that magnitude. However, existing systems can be augmented to better suit workplace needs, thus creating more pathways to a career in cybersecurity.

Scholars are already considering ways that otherwise conventional higher education pathways could incorporate hands-on training in workforce development more generally by melding registered apprenticeship programs with bachelor’s degrees.34 Instead of choosing between degrees and apprenticeships, Mary Alice McCarthy, director of the Center on Education and Skills at New America, asks, “How about both?”35 In cybersecurity, augmenting the higher education system with work-based learning mechanisms would generate not just greater numbers of available workers, but also the options students need to be successful in the long term and the diversity of experience and education that industry badly needs. Incorporating on-the-job training would make not just a larger cybersecurity workforce, but a better one.

These adaptations to current educational and training systems are not the responsibility of the education community alone. Industry has an equal interest and mandate to prepare new talent for their future roles. Involvement in education allows employers to indicate the skills they need in their future employees. Employers must also necessarily be involved in any apprenticeship or other work-based learning system because the learning will take place in their workplaces. Accordingly, employer buy-in on two fronts—(1) meaningful and ongoing communication with educators and (2) the professional development and incentive structures to accommodate learners and reward mentorship—will be critical to success in developing the cybersecurity workforce.

Citations
  1. “The Case for Early Education about STEM Careers,” ASPIRES, King’s College London, source.
  2. “Trends in the State of Computer Science in U.S. K-12 Schools,” Gallup in partnership with Google, 2016, source.
  3. Interdisciplinary Pathways towards a More Secure Internet, National Science Foundation, Report on the Cybersecurity Ideas Lab, Arlington, Virginia, February 10-12, 2014, 9, source.
  4. Rajendra K. Raj and Allen Parrish, “Toward Standards in Undergraduate Cybersecurity Education in 2018,” Computer 51, no. 2 (February 2018): 72-75, source.
  5. The computing societies are: Association for Computing Machinery (ACM), IEEE Computer Society (IEEE CS), Association for Information Systems Special Interest Group on Security (AIS SIGSEC), and International Federation for Information Processing Technical Committee on Information Security Education (IFIP WG 11.8). For more, see source.
  6. Cybersecurity Curricula 2017 Version 0.95 Report, Joint Task Force on Cybersecurity Education, November 13, 2017, 14, source.
  7. And, indeed, others across higher education have also emphasized and developed mechanisms for incorporating other disciplines in a computer science-based cybersecurity curriculum. For example, see the University of Nevada Reno and Truckee Meadows Community College’s Interdisciplinary Cybersecurity Modules: source.
  8. For more on this, particularly with an eye to incorporating cybersecurity into policy education, see Jessica Beyer and Sara Curran, Cybersecurity Workforce Preparedness: The Need for More Policy-Focused Education, Wilson Center Digital Futures Project, November 22, 2017, source.
  9. Peter Swire, “A Pedagogic Cybersecurity Framework,” Communications of the ACM 61 (October 2018): 23-26, source.
  10. Allen Parrish, John Impagliazzo, Rajendra K. Raj, Henrique Santos, Muhammad Rizwan Asghar, Audun Jøsang, Teresa Pereira, and Eliana Stavrou, “Global Perspectives on Cybersecurity Education for 2030: A Case for a Meta-Discipline,” Proceedings of 23rd Annual ACM Conference on Innovation and Technology in Computer Science Education (ITiCSE’18), July 2-4, 2018, source.
  11. Kirk Carapezza, “With more than 200,000 unfilled jobs, colleges push cybersecurity,” PBS Newshour, January 22, 2015, source.
  12. Arati Prabhakar, “Cybersecurity Summit” (panel discussion, Washington Post Live, October 1, 2014), source.
  13. Rajendra K. Raj and Allen Parrish, “Toward Standards in Undergraduate Cybersecurity Education in 2018,” Computer 51, no. 2 (February 2018): 72-75, source.
  14. Michael M. Crow and William B. Dabars, The New American University, Baltimore: Johns Hopkins University Press, 2015, 7-8.
  15. “Table 348. Average graduate and first-professional tuition and required fees in degree-granting institutions, by first-professional field of study and control of institution: 1988-89 through 2009-10,” National Center for Education Statistics, October 2010, source.
  16. Jon Marcus, “Graduate programs have become a cash cow for struggling colleges. What does that mean for students?,” PBS Newshour, September 18, 2017, source.
  17. Kevin Carey, “Those Master’s-Degree Programs at Elite U.? They’re For-Profit,” The Chronicle of Higher Education, April 21, 2014, source.
  18. 2017 Global Information Security Workforce Study: Benchmarking Workforce Capacity and Response to Cyber Risk, Center for Cyber Safety and Education, (ISC)2, Booz Allen Hamilton, Alta Associates, and Frost & Sullivan, 2017, source.
  19. Jessie Brown and Martin Kurzweil, The Complex Universe of Alternative Postsecondary Credentials and Pathways (Cambridge, Mass.: American Academy of Arts & Sciences, 2017), source.
  20. Elizabeth Catte, “In Appalachia, Coding Bootcamps That Aim To Retrain Coal Miners Increasingly Show Themselves To Be ‘New Collar’ Grifters,” BELT Magazine, January 11, 2018, source.
  21. Jaikumar Vijayan, “Can cybersecurity boot camps fill the workforce gap?,” The Christian Science Monitor Passcode, January 20, 2017, source.
  22. Liz Eggleston, 2016 Coding Bootcamp Outcomes and Demographics Study, Course Report, September 14, 2016, source.
  23. Ibid.
  24. Liz Eggleston, 2015 Coding Bootcamp Alumni & Demographics Study, Course Report, October 25, 2015, source.
  25. CompTIA, “CompTIA Pledges to Get the UK Cyber Ready,” news release, June 27, 2018, accessed September 12, 2018, source.
  26. Kevin Carey, “Here’s What Will Truly Change Higher Education: Online Degrees That Are Seen as Official,” New York Times, March 5, 2015, source.
  27. Michael Bernick, “Coursera’s Bet On The Upskilling of American Workers,” Forbes, February 21, 2018, source.
  28. Tajha Chappellet-Lanier, “Cybersecurity MOOC Cybrary hits 1 million registered users,” Technical.ly/DC, May 11, 2017, source.
  29. Hacking the Skills Shortage: A Study of the International Shortage in Cybersecurity Skills, McAfee and Center for Strategic and International Studies, July 27, 2016, 13, source.
  30. For a sampling, see responses to NIST/NICE Executive Order on Cybersecurity Workforce Request for Information. For an example, see ISACA’s response at source.
  31. Definition and Principles for Expanding Quality Apprenticeship in the U.S., Apprenticeship Forward Collaborative, Forthcoming, source.
  32. Marian Merritt, “Cybersecurity Apprenticeships Enhance Cybersecurity Infrastructure,” United States Department of Commerce, January 31, 2018, source.
  33. Michael Prebil, “Teach Cybersecurity with Apprenticeship Instead,” New America, April 14, 2017, source.
  34. Mary Alice McCarthy, Iris Palmer, and Michael Prebil, Connecting Apprenticeship and Higher Education: Eight Recommendations, Washington, D.C.: New America, December 6, 2017, source.
  35. Mary Alice McCarthy (@McCarthyEdWork), “Apprenticeships or College? How About Both?,” Twitter, December 8, 2017, 3:06 p.m., source.