
Section Two: How Do We Teach Cybersecurity?

The default system for education in cybersecurity, a bachelor's degree, has much to recommend it; it is already the cultural norm throughout much of the upwardly mobile American workforce. But the structure of a typical bachelor's degree program is not always ideal for teaching cybersecurity. Options outside of higher education, such as bootcamp-style training programs or the military, are also not universally appropriate. This conundrum offers a unique opportunity to pursue work-based learning options that would be remarkably innovative in the cybersecurity community.

This report does not delve into education at the kindergarten through twelfth grade (K12) level, but that is not to imply that it is irrelevant here. Research indicates that students' attitudes towards STEM tend to decline dramatically around the ages of 10-14.1 Accordingly, a central challenge in long-term cybersecurity workforce development is engaging students in STEM fields prior to and during that time frame, but the topic does not end there. In order to benefit from emerging interest in STEM fields, schools must have the teachers and resources to provide instruction in these areas. Only 40 percent of K12 principals say that their school offers a computer science class that teaches a coding language, but 90 percent of parents say that opportunities to learn computer science are a good use of school resources.2 Clearly there is room to expand.


An interesting feature of discussions on K12 cybersecurity education is that the policy levers to shape priorities and incentives exist predominantly among state, local, territorial, and tribal governments. Meanwhile, an active community of stakeholders is working to develop mechanisms to encourage and promote education that could lead to long-term development of the cybersecurity workforce. This complex ecosystem overlaps with the discussion in this report. However, because the policy solutions are quite different at the K12 level, this report acknowledges the important influence that these years have on later education and opportunities, and begins exploring the educational system at the level of higher education.

Organizing Higher Education around an Interdisciplinary Field

To develop cybersecurity expertise not just in computer science but also in areas like law, policy, healthcare, and finance, academic decision-makers and the policymakers who define incentives in higher education should consider cybersecurity not as a single, monolithic discipline within higher education, but rather as a field that cuts across, and looks very different in, many disciplines.


Among the characteristics that make cybersecurity difficult to teach in a degree program is its nature as a "multidisciplinary problem, touching on policy issues, economic incentives, and public and business awareness and education, along with new technical challenges."3 Programs that teach cybersecurity effectively must incorporate elements from many fields, which could encourage innovative approaches from the outset. In theory, this outside-the-box foundation could provide an excellent starting point for further innovation. In practice, it tends to be very difficult to implement in higher education systems historically segmented into separate departments and colleges within a single university.

Where cybersecurity programs are taught, they are often folded into computer science or engineering departments. This is appropriate insofar as cybersecurity exists within the context of technical systems (computers and networks), but it can also limit the growth of cybersecurity as an educational priority outside these departments or as an independent field. Organizations like ABET (the Accreditation Board for Engineering and Technology) and the National Security Agency and Department of Homeland Security's Centers of Academic Excellence programs provide useful frameworks around the content to be taught, but do not answer these more fundamental questions about cybersecurity as a field of study.4

Computer science and engineering disciplines have historically been the focus of cybersecurity curriculum development. One particularly well-developed example of such an effort is the Joint Task Force on Cybersecurity Education, a "collaboration between major international computing societies."5 The curriculum is meticulously developed and reviewed by academic experts. While the task force clearly acknowledges cybersecurity's interdisciplinary nature, the curriculum is targeted at an understanding of the field that "advances cybersecurity as a new computing discipline and positions the cybersecurity curricular guidance within the context of the current set of defined computing disciplines," which include computer engineering, computer science, information systems, information technology, and software engineering.6 Quite sensibly, the Joint Task Force has housed this effort squarely in computing disciplines.

The curriculum development effort should not end here. Other academic disciplines could emulate this work. While the Joint Task Force has incorporated legal, economic, and political considerations into its curriculum guidance,7 that incorporation does not replace curriculum development efforts for cybersecurity in each of those disciplines. The draft curriculum proposes teaching a reasonable array of policy principles for a student who expects to do research in a computer science department, but certainly not for a student interested in working in a legislator's office informing data privacy policy. Educating the future policy wonk requires a very different cybersecurity curriculum8 and maybe even a different pedagogic framework.9 Given that cybersecurity jobs are increasingly crossing into other domains like finance, healthcare, and law, the same could be said for any number of other examples.

Although some colleges and universities have taken on the challenge of developing such interdisciplinary programs, practical considerations like interdepartmental cost sharing, program equities, and enduring assumptions about where cybersecurity coursework should be anchored often slow down the development of cybersecurity as a cross-cutting field (or meta-field, as scholars have termed it10) applicable and accessible to many disciplines. The work that academics in computer science and engineering are doing to identify best practices in cybersecurity curricula is invaluable. However, such efforts should be part of a larger ecosystem of offerings that teach the aspects of cybersecurity most relevant to industries ranging from law to hospitality to medicine to policy and much more.

Mandates for Higher Education: Teaching, Research, and Sustainability

Cybersecurity can be difficult to teach in a classroom, which exacerbates tensions between competing priorities in higher education. Administrators must balance the university's mandates to facilitate research and prepare students for their future jobs with the need to ensure the institution's financial sustainability. Policymakers who set incentives for higher education must reward decisions that lead to a stronger cybersecurity workforce.


Apart from the theoretical challenge of finding a home for cybersecurity programs in a university setting, the discipline also creates practical challenges. Cybersecurity changes quickly. As New York Times reporter and Harvard University adjunct lecturer David Sanger puts it, "The hardest thing about teaching anything about cybersecurity is the same thing that's the hard part about writing and reporting about cybersecurity, which is, it's moving so fast."11 This makes it difficult to keep conventional classroom education up-to-date, especially when curricula can take weeks and months to develop and approve. Automated cybersecurity attacks "are happening in microseconds… so today all we can do is patch and pray," according to Dr. Arati Prabhakar, formerly the head of the Defense Advanced Research Projects Agency (DARPA) and of the National Institute of Standards and Technology (NIST). She adds, "we are looking for a fundamentally different way to get faster than the pace of the growth of the threat."12 In an already rapidly developing industry, cutting-edge technologies give way to newer tools in the span of weeks and months, a pace prohibitively difficult to match in syllabi developed over much longer timelines.

As difficult as maintaining a current syllabus can be, finding teachers with the experience to teach the most current techniques and tools is equally challenging. Applied courses are often taught by instructors and adjunct professors, who are expensive to hire given the competition for experts with these skill sets. Tenured faculty are generally focused on foundational research within a narrow specialty, not the newest bit of technology.


Maintaining a focus on foundational education and research allows faculty to cultivate and attract top-tier graduate students to aid in that research, which fosters a fertile environment for the research and development that keeps cybersecurity on the cutting edge. Educators are also charged with the mandate that "students must be encouraged to think and learn, with the understanding that specific content isn't as important as it would be in training scenarios."13 These functions are critically important to the university, to the general health of the cybersecurity research community, and to the workforce writ large, but do not answer the question of where students will learn the tools and skills that will be required to enter a career in industry.

This tension between a university's teaching and research mandates is part of a much larger conversation on the role of higher education in society. Should universities (and research universities in particular) exist to train the workers that will build the future economy, or should their purpose be to cultivate the hotbeds of innovation and deep research that fuel growth and stand as a hallmark of the U.S.'s comparative economic advantage? This question is not a central focus of this report, but understanding the role of the university is important context in considering the potential impact of higher education on the cybersecurity workforce.

Highlighting the crux of this uncertain role for institutions of higher education, Arizona State University (ASU) President Michael Crow and ASU Senior Research Fellow William B. Dabars emphasize that "the inherent limitations of the present model [of research universities] attenuate the potential of this set of institutions to educate citizens in sufficient numbers and address the host of challenges that beset the world."14 They write about universities' limitations generally, but in the cybersecurity context, this line of thinking leads to real questions about whether universities can adapt to create capable workers in addition to highly trained researchers. In an industry that desperately requires both types of experts, and many types in between, developing a spectrum of educational offerings is a particularly valuable strategy. Much as the medical field has different educational paths for surgeons, pharmaceutical researchers, technicians, and nurses, a thriving cybersecurity community will require a breadth of educational paths.

The role of the university does not oscillate only between research and teaching mandates; economic considerations also factor into any university's operations. Given the demand from recent baccalaureate graduates and mid-career job changers for opportunities to break into a lucrative cybersecurity career, developing a professional master's in cybersecurity may seem like a sound investment for any university administration. However, higher education occasionally finds itself walking an uncomfortable tightrope when it comes to this type of professional graduate degree program.

Such programs are known for their profitability. Adjusted for inflation, tuition for an average graduate degree program cost $6,603 in 1989; by 2010, it cost $14,398.15 This steep rise in cost reflects greater demand for such degrees, which has created a very tempting revenue stream for administrators at often funding-starved schools.16 The resulting incentive structure can encourage universities to provide expensive professional graduate degrees designed for profit17 rather than for beneficial student outcomes. With so little data available on what kind of education or training yields the best long-term student outcomes in a cybersecurity career, not to mention lukewarm industry attitudes towards skills learned in the classroom rather than on the job, universities offering a professional master's in cybersecurity must carefully weigh financial priorities against social responsibility.

Learning Cybersecurity Outside of Higher Education

A diverse array of alternatives to conventional education could enable learners to successfully transition into cybersecurity jobs. These alternatives exist in varying degrees of maturity. Smart policies could guide their development towards the best outcomes for students and employers.


Anecdotally, many cybersecurity employees in industry are trained in either the military or the intelligence community, rather than passing through academia. Statistics describing the exact scale of this pattern are hard to come by. There are good reasons why military and intelligence agencies would be reluctant to publish personnel statistics, but consequently it is very difficult to know what proportion of the cybersecurity workforce passes through government service. The Global Information Security Workforce Study says 16 percent of hiring managers prefer to recruit among former and active military, and 30 percent of the workforce comes from a non-technical background, which can include "business, marketing, finance, accounting, or military and defense."18 An especially useful question to answer would be what percentage of the workforce was trained in the military or intelligence community, and what the long-term work roles and outcomes are for those individuals.

Bootcamps and skills-based short courses provide other potential pathways into the workforce. They have a long history of teaching workplace-relevant skills from stenography to coding.19 Despite this long history, not all examples are positive. Some bootcamps face criticism for over-promising and under-delivering,20 a trend that warrants a note of caution among proponents of skills-based courses.

Cybersecurity bootcamps have not yet reached the scale of their coding bootcamp cousins, but they are already gaining a profile as a viable alternative training option.21 It is uncertain whether they will remain on this promising trajectory or struggle with the challenges that have beset many coding bootcamps. Among coding bootcamps, surveys indicate that 60 percent of completers already had a bachelor's degree, and graduates averaged 6.8 years of work experience,22 which does not seem like the entry-level path to success one might hope for. The same study indicated that 43 percent of coding bootcamp students were women,23 suggesting that this pathway may help break down longstanding gender imbalances in information technology (IT) disciplines. The previous year, the same study indicated that 79 percent of students had at least a bachelor's degree and averaged 7.6 years of work experience,24 perhaps suggesting a trend towards a more viable entry-level pathway. What this means for cybersecurity bootcamps is unclear.

While skills-based short courses may remain problematic for early-career job seekers, they do offer some promise to workers transitioning from other less-engaging or less-lucrative careers and for employers seeking to upskill their current workforce. For example, they could allow IT support staff to specialize in network security or other cybersecurity disciplines.

In the United Kingdom, industry association CompTIA has already invested in this market with support from the U.K. government. Their Cyber Ready retraining program targets a wide range of applicants (e.g. parents, IT hobbyists, graduates) to provide them with the skills needed to enter cybersecurity careers.25 Such programs could expand in the United States to offer a means for non-cybersecurity professionals to make their way into the industry.

Digitized education and training also offer upskilling and retraining options outside the classroom. While experts debate whether massive open online courses (MOOCs) will replace conventional college degrees more generally,26 providers remain optimistic about online education's promise for upskilling and retraining.27 In cybersecurity, the popularity of these programs has driven rapid growth for specialized providers (e.g. Maryland-based Cybrary).28 Data-driven insight into what this means for the long-term outcomes of students and workers trained through such programs would be an excellent area for future research.

Apprenticeships in Cybersecurity

Policies to support the growth of apprenticeship as a model for cybersecurity education could have a profoundly positive impact on connecting talented individuals with open jobs.


Given the challenges of teaching cybersecurity in a conventional higher education setting, and recalling that survey data suggests industry experts do not feel that students are graduating with the skills needed to be successful in their new roles,29 what is the preferred training option? Information on what is missing from classroom education is largely anecdotal, and suffers from a community-wide lack of metrics on what exactly employers find useful in the workplace, but the emphasis on practical experience as a part of the learning process is a frequent refrain across the industry.30

Applied skills are often omitted in classrooms in favor of the theoretical principles that undergird the rapidly-changing tools used in workplaces. Teaching those evolving tools raises concerns that they may be obsolete by the time a student graduates. This approach is entirely appropriate when training future academics, a key component of any university's mandate, but when training future industry workers (or public sector employees), this strategy ultimately relies on employers teaching new graduates the skills needed to be productive in the workplace. Employers, in turn, opt not to hire new graduates because they have no training on the tools and techniques most immediately relevant to their work.

To bridge this gap, some educators and policymakers are turning to apprenticeships. Individual apprenticeship programs can vary drastically on a case-by-case basis, but they share four criteria:

  • Paid, structured, on-the-job training combined with related classroom instruction;
  • Clearly defined wage structure with increases commensurate with skill gains or credential attainment;
  • Third-party evaluation of program content, apprenticeship structure, mentorship components, and quality standards; and
  • Ongoing assessment of skills development culminating in an industry-recognized credential.31

The pool of existing registered programs in cybersecurity is still small, likely consisting of fewer than a few dozen programs with active, paid apprentices focused specifically on cybersecurity. A joint project at 国产视频 is tracking the emergence and development of these programs. Data on the programs of which we are aware is available at /cybersecurity-initiative/reports/cybersecurity-apprenticeships-tracker/. Early proponents of the model advocate for its ability to tailor learning to precisely fit workforce needs,32 as well as its ability to adapt to a rapidly-changing environment.33


Degree programs in higher education are not obsolete or without utility. Indeed, many innovative and forward-thinking programs have emerged from universities and community colleges. Moving away from or dismissing incumbent educational systems misses a critical opportunity to harness good work already being done. One argument in defense of conventional degree programs is their long-standing popularity in the United States: for cybersecurity workforce solutions to be effective, they need to be scalable to the order of tens and hundreds of thousands of people, and higher education has the capacity to reach that magnitude. However, existing systems can be augmented to better suit workplace needs, thus creating more pathways to a career in cybersecurity.

Scholars are already considering ways that otherwise conventional higher education pathways could be augmented to incorporate hands-on training in workforce development more generally by melding registered apprenticeship programs with bachelor's degrees.34 Instead of choosing between degrees and apprenticeships, Mary Alice McCarthy, director of the Center on Education and Skills at 国产视频, asks, "How about both?"35 In cybersecurity, augmenting the higher education system with work-based learning mechanisms would generate not just greater numbers of available workers, but also the options students need to be successful in the long term and the diversity of experience and education that industry badly needs. Incorporating on-the-job training would make not just a larger cybersecurity workforce, but a better one.

These adaptations to current educational and training systems are not the responsibility of the education community alone. Industry has an equal interest and mandate to prepare new talent for their future roles. Involvement in education allows employers to indicate the skills they need in their future employees. Employers must also necessarily be involved in any apprenticeship or other work-based learning system because the learning will take place in their workplaces. Accordingly, employer buy-in on two fronts, (1) meaningful and ongoing communication with educators and (2) the professional development and incentive structures to accommodate learners and reward mentorship, will be critical to success in developing the cybersecurity workforce.

Citations
  1. "The Case for Early Education about STEM Careers," ASPIRES, King's College London.
  2. "Trends in the State of Computer Science in U.S. K-12 Schools," Gallup in partnership with Google, 2016.
  3. Interdisciplinary Pathways towards a More Secure Internet, National Science Foundation, Report on the Cybersecurity Ideas Lab, Arlington, Virginia, February 10-12, 2014, 9.
  4. R.K. Raj and A. Parrish, "Toward Standards in Undergraduate Cybersecurity Education in 2018," Computer 51, issue 2 (February 2018): 72-75.
  5. The computing societies are: Association for Computing Machinery (ACM), IEEE Computer Society (IEEE CS), Association for Information Systems Special Interest Group on Security (AIS SIGSEC), and International Federation for Information Processing Technical Committee on Information Security Education (IFIP WG 11.8).
  6. Cybersecurity Curricula 2017, Version 0.95 Report, Joint Task Force on Cybersecurity Education, November 13, 2017, 14.
  7. And, indeed, others across higher education have also emphasized and developed mechanisms for incorporating other disciplines in a computer science-based cybersecurity curriculum. For example, see the University of Nevada, Reno and Truckee Meadows Community College's Interdisciplinary Cybersecurity Modules.
  8. For more on this, particularly with an eye to incorporating cybersecurity into policy education, see Jessica Beyer and Sara Curran, Cybersecurity Workforce Preparedness: The Need for More Policy-Focused Education, Wilson Center Digital Futures Project, November 22, 2017.
  9. Peter Swire, "A Pedagogic Cybersecurity Framework," Communications of the ACM 61 (October 2018): 23-26.
  10. Allen Parrish, John Impagliazzo, Rajendra K. Raj, Henrique Santos, Muhammad Rizwan Asghar, Audun Jøsang, Teresa Pereira, and Eliana Stavrou, "Global Perspectives on Cybersecurity Education for 2030: A Case for a Meta-Discipline," Proceedings of the 23rd Annual ACM Conference on Innovation and Technology in Computer Science Education (ITiCSE '18), July 2-4, 2018.
  11. Kirk Carapezza, "With more than 200,000 unfilled jobs, colleges push cybersecurity," PBS NewsHour, January 22, 2015.
  12. Arati Prabhakar, "Cybersecurity Summit" (panel discussion, Washington Post Live, October 1, 2014).
  13. R.K. Raj and A. Parrish, "Toward Standards in Undergraduate Cybersecurity Education in 2018," Computer 51, issue 2 (February 2018): 72-75.
  14. Michael M. Crow and William B. Dabars, Designing the New American University (Baltimore: Johns Hopkins University Press, 2015), 7-8.
  15. "Table 348. Average graduate and first-professional tuition and required fees in degree-granting institutions, by first-professional field of study and control of institution: 1988-89 through 2009-10," National Center for Education Statistics, October 2010.
  16. Jon Marcus, "Graduate programs have become a cash cow for struggling colleges. What does that mean for students?," PBS NewsHour, September 18, 2017.
  17. Kevin Carey, "Those Master's-Degree Programs at Elite U.? They're For-Profit," The Chronicle of Higher Education, April 21, 2014.
  18. 2017 Global Information Security Workforce Study: Benchmarking Workforce Capacity and Response to Cyber Risk, Center for Cyber Safety and Education, (ISC)2, Booz Allen Hamilton, Alta Associates, and Frost and Sullivan, 2017.
  19. Jessie Brown and Martin Kurzweil, The Complex Universe of Alternative Postsecondary Credentials and Pathways (Cambridge, Mass.: American Academy of Arts & Sciences, 2017).
  20. Elizabeth Catte, "In Appalachia, Coding Bootcamps That Aim To Retrain Coal Miners Increasingly Show Themselves To Be 'New Collar' Grifters," BELT Magazine, January 11, 2018.
  21. Jaikumar Vijayan, "Can cybersecurity boot camps fill the workforce gap?," The Christian Science Monitor Passcode, January 20, 2017.
  22. Liz Eggleston, 2016 Coding Bootcamp Outcomes and Demographics Study, Course Report, September 14, 2016.
  23. Ibid.
  24. Liz Eggleston, 2015 Coding Bootcamp Alumni & Demographics Study, Course Report, October 25, 2015.
  25. CompTIA, "CompTIA Pledges to Get the UK Cyber Ready," news release, June 27, 2018, accessed September 12, 2018.
  26. Kevin Carey, "Here's What Will Truly Change Higher Education: Online Degrees That Are Seen as Official," New York Times, March 5, 2015.
  27. Michael Bernick, "Coursera's Bet On The Upskilling of American Workers," Forbes, February 21, 2018.
  28. Tajha Chappellet-Lanier, "Cybersecurity MOOC Cybrary hits 1 million registered users," Technical.ly/DC, May 11, 2017.
  29. "Hacking the Skills Shortage: A Study of the international shortage in cybersecurity skills," McAfee and Center for Strategic and International Studies, July 27, 2016, 13.
  30. For a sampling, see responses to the NIST/NICE Executive Order on Cybersecurity Workforce Request for Information; for an example, see ISACA's response.
  31. Definition and Principles for Expanding Quality Apprenticeship in the U.S., Apprenticeship Forward Collaborative, forthcoming.
  32. Marian Merritt, "Cybersecurity Apprenticeships Enhance Cybersecurity Infrastructure," United States Department of Commerce, January 31, 2018.
  33. Michael Prebil, "Teach Cybersecurity with Apprenticeship Instead," 国产视频, April 14, 2017.
  34. Mary Alice McCarthy, Iris Palmer, and Michael Prebil, Connecting Apprenticeship and Higher Education: Eight Recommendations (Washington, D.C.: 国产视频, December 6, 2017).
  35. Mary Alice McCarthy (McCarthyEdWork), "Apprenticeships or College? How About Both?," December 8, 2017, 3:06 p.m.