Colleges, It's Time for a General Technology Class

Why cyber education isn't just for STEM students.
Blog Post
June 5, 2018

“I wish I could code.”

“That’s so cool to be technical.”

“Yeah, I kinda suck at the computer stuff.”

These are phrases I hear regularly from my college peers, whether in the middle of my rant on Cambridge Analytica or after a glance at my evening homework assignment. Because one of my majors is computer science, I am one of those “techie” individuals unafraid to read cyber policy articles or learn about the latest cryptocurrency. The knowledge and skills that come with studying computer science are clearly beneficial, from hunting for jobs to analyzing the global trends covered in articles and papers.

Lately, I’ve begun to realize that my peers’ statements are symptomatic of a larger problem: while colleges and universities around the country are increasingly offering “Computer Science 101” classes that teach the basics of Python or simplified languages like Alice, very few colleges offer (let alone require) general technology classes that prepare students for the digital world.

As most of us are well aware, technology is impacting every sector on the planet. Social media is enabling both activist organization and terrorist radicalization. Machine learning is revolutionizing medicine just as it threatens to displace hundreds of millions of workers. The Internet of Things is empowering energy efficiency in the same breath as it enables pervasive surveillance and leaves entire infrastructures vulnerable to hacking.

Given all this, how many students understand how the Internet works? How many know what encryption means? How many could understand Bitcoin’s adverse impact on the environment or confidently read an article on the GDPR, the EU’s new data protection law?

Just as college students take introductory courses in mathematics, literature, or political science, American universities should offer “Technology 101” classes that introduce students to the social, legal, political, cultural, economic, and ethical impacts of digitalization.

To be sure, some institutions already have similar offerings: MIT OpenCourseWare lists an “Introduction to Technology and Policy” class that covers privacy, globalization, and the history of U.S. tech policy. Indiana University Bloomington offers a course covering “the social and lifestyle effects of information technology.” Carnegie Mellon University created a Technology and Policy minor with courses such as “Introduction to Engineering and Public Policy.” Even several community colleges boast options that teach core technology concepts with relevant framing.

Despite these strides, we need to do better. Most colleges don’t offer classes like those I just described, and when they do, the classes aren’t standardized in anything like the way “Spanish 101” or an introductory statistics course is. This can leave students unprepared to solve the problems of the present and future.

“Students aren’t ready to confront the Pandora’s Box that the digital world has waiting for them,” agrees Brian Fonseca, Director of the Public Policy Institute at Florida International University. “In other words, they’ve been given power, but they don’t fully understand it.”

Fonseca, who is also a Cybersecurity Policy Fellow at New America, proposed such a course two years ago. “We need to treat cyber like English, math, and other ‘lower-division’ courses,” he says. “Cyber education is mostly constrained to the hard sciences. However, employers are increasingly signaling how important it is that all graduating students, from political science and hospitality majors to law and business students, enter the workforce with a strong understanding of how to secure information in the digital age.”

It is for this reason, Fonseca argues, that “we need to develop a ‘cyber competency’ across all disciplines.”

Jeffrey Ritter, who teaches information governance at the University of Oxford’s Department of Computer Science, believes even greater change is needed. “The need is equivalent to university-level deficiencies incoming students exhibit in mathematical and English literacy,” he told me. “We have to ask ourselves if today’s students are prepared to enter the digital world with essential skills to survive. The answer is, unequivocally, no.”

For this reason, Ritter argues we need to start fostering digital literacy as early as possible — building a “technological foundation” to empower later, higher-ed learning.

With this base, we can begin preparing the next generation for what is to come. “At the end of the day,” Ritter says, “those who will compete in the Digital Age will be those best prepared to act with the velocity that digital competency enables.”

Our nation needs digitally competent leaders: In 2016, U.S. lawmakers tried to pass an encryption bill that New America’s Kevin Bankston then called “the most ludicrous, dangerous, technically illiterate proposal I’ve ever seen.” Congressional performance at the Zuckerberg hearing was dismal. Many American businesses still fail to take basic cybersecurity measures. The Pew Research Center, in its latest survey on digital readiness gaps, listed only 17% of Americans as “digitally ready.”

To change this reality, we need to equip students of all disciplines with the skills to learn, analyze, and innovate in the world of digital technology. While mandating these types of courses might be infeasible in the short run, colleges should use the optional courses at Carnegie Mellon, Indiana University Bloomington, and elsewhere as models.

Colleges, we can’t risk our nation falling even further behind. It’s your turn now.