The Landscape of Existing Frameworks
From a global perspective, measuring and defining digital skills has long been a policy and academic priority, and a number of useful frameworks, landscape assessments, and guides have been created. There are two basic approaches to measuring and standardizing digital skills, and each informs the other.
First, digital skills frameworks create a theoretical map that defines competency levels for relevant skills. Assessments, on the other hand, measure the actual competency levels of a population and provide a practical means of tracking progress. The two inform each other: the goals described by frameworks are at least in part shaped by the results of assessments, while assessments are designed around the overarching categories a relevant framework sets out. Both are necessary to a comprehensive digital skills strategy.
Major Digital Skills Frameworks
There are a number of creative and influential digital skills frameworks operating on a global scale, along with several publications dedicated to cataloging and examining them. While listing every digital skills framework is outside the scope of this report, a few relevant examples are described below.
- The Digital Competence Framework for Citizens, or DigComp, is one of the most widely cited and commonly referenced digital skills frameworks.1 Introduced in 2013, DigComp is the European Union’s standard framework that is used as a collective tool to “improve citizens’ digital competence, help policymakers formulate policies that support digital competence building, and plan education and training initiatives to improve the digital competence of specific target groups.”2 DigComp separates digital skills into five content areas: (1) information and data literacy, (2) communication and collaboration, (3) digital content creation, (4) safety, and (5) problem solving.3 Each area is broken down into individual competences that emphasize knowledge, skills, and attitudes that enable mastery of the area.4 For example, the information and data literacy area includes competences related to evaluating and managing digital content, information, and data. Individual proficiency levels are also outlined for each competence.5
- United Nations Educational, Scientific and Cultural Organization (UNESCO) created a Digital Literacy Global Framework, based on a global consultation, that expands on DigComp.6 In addition to the original five digital competence areas listed above, UNESCO adds two areas: devices and software operations, and career-related competences. Both comprise specific skills that foster effective use of technologies and career-oriented skill sets.
- The DQ Framework approaches digital skills as a multifaceted set of “technical, cognitive, meta-cognitive, and socioemotional competencies that are grounded in universal moral values and that enable individuals to face the challenges… of digital life.” In other words, it attempts to create a “digital intelligence” quotient (DQ).7 It combines digital competency frameworks across the world to create one comprehensive standardization framework for various digital literacies, industry skills, and educational requirements. DQ separates digital life into eight areas (identity, use, safety, security, emotional intelligence, literacy, communication, and rights). It further defines four levels at which each area can be achieved—connectivity, citizenship, creativity, and competitiveness—each focusing on a different end goal.8 Ultimately, it defines 32 competencies that make up the total framework and, as in DigComp, each competency can be broken down into knowledge, skills, attitudes, and values. It is accompanied by a free digital assessment tool and is widely used across the world, including by some large U.S. employers that are part of the Digital US Employer Network.9 It was approved by the IEEE Standards Board as a global digital literacy standard in 2020.10
- The International Society for Technology in Education (ISTE) sets standards for technology usage in education for students, educators, and coaches.11 It outlines a number of desired outcomes for each group—such as, for students, being an Empowered Learner, Creative Communicator, and Knowledge Constructor—and lists the practical components that lead to achievement of each standard. ISTE has also published The Profile of a Lifelong Learner, which provides guidance on the skills and mindsets needed for adult education in the context of workforce training.12 Based on ISTE’s Standards for Students combined with other existing adult skills frameworks and literature, The Profile emphasizes five categories of learners: lifelong learner, empowered worker, digital citizen, solution seeker, and mindful colleague.13 In both frameworks, ISTE emphasizes the relational aspect of digital living—how to effectively coexist in certain contexts such as a classroom or workplace—and takes an outcome-based approach to its standards-setting.
The United States also has a number of digital skills frameworks that are applicable within specific contexts or geographic locations.
- Northstar Digital Literacy is a project of Literacy Minnesota that is used by over 3,000 adult education programs, businesses, and colleges.14 It offers online assessments and lessons premised on standards for concrete “skills and knowledge a learner requires to engage in tasks using digital technologies.”15 It defines practical skills within three categories: (1) essential computer skills, (2) essential software skills, and (3) using technology in daily life.16 Each category contains basic skills such as using Microsoft Office applications, email, and social media, and each individual skill is accompanied by the concrete steps required to master it.
- In the creation of its own digital skills framework, Virginia Tech uses a definition of digital literacy as “the knowledge, skills, and attitudes that help [individuals] deal with [the complexity of the digital world] and participate in our digital society.”17 Virginia Tech’s framework takes a multilayered approach, with the student or learner at its center.18 Core competencies include identity and wellbeing, discovery, evaluation, ethics, creation and scholarship, communication and collaboration, and curation.19 These competencies are surrounded by key values such as curiosity, reflection, equity and social justice, creativity, and participation. All of this is housed within multiple overlapping literacies, including data, information, media, and invention.20 Learners can engage with these competencies, values, and literacies in various combinations.
Common Themes
Though the differences among even this handful of frameworks reflect the diversity of priorities in the digital skills landscape, there are some common threads. Most of the frameworks describe what it means to participate fully and safely in digital society. They recommend a mix of proactive activities (such as content creation) and defensive ones (such as measures to protect one’s digital identity). The competencies on which frameworks are built are also generally relational: they describe how to be a good student, teacher, employee, or citizen. This follows naturally from the fact that information and communication technologies (ICTs) are, at their core, about communication, and using them effectively means engaging with others.
Many of the frameworks also advise a mix of practical competencies and skill sets as well as values and attitudes. In fact, the National Governors Association playbook characterizes the various digital skills definitions as “[ranging] from focusing on the discrete skills needed to navigate a computer or online environment to skills that are harder to measure, such as confidence and capacity to become lifelong learners of new technologies.”21 There seems to be consensus that mindset, in addition to practical abilities, is an important component of digital success—likely in part because the technical skills associated with digital competence are regularly changing as the technologies do.
Though some frameworks offer specific examples of success, most can be engaged with in a variety of ways. A number of them, such as Virginia Tech’s project and the ISTE standards, are outcomes-based rather than prescriptive, which leaves room for flexibility in implementation across the different contexts in which they are applied. Similarly, many of the frameworks can and do evolve to reflect changes in technology and usage; DigComp, for example, has been updated several times, as has the DQ Framework.
Furthermore, as digital skills become a growing determinant of digital inclusion, access remains a necessary precursor and is finding its place in skills-oriented frameworks. DQ, for example, updated its framework by adding “connectivity” as a baseline level at which its various competencies can be achieved. “Connectivity,” which precedes the more advanced levels of development (citizenship, creativity, and competitiveness), emphasizes the basic need to connect all individuals to digital technology, thereby enabling them to develop digital skills at those higher levels. In doing so, DQ explicitly acknowledges basic digital inclusion policies as a pathway to more advanced digital skills. Similarly, UNESCO’s framework expands on DigComp by adding practical baseline components such as the ability to operate devices.
Much of this evinces a broader theme of maintaining flexibility within frameworks. The digital skills landscape is a subset of the digital inclusion landscape, or at least intimately connected to it, and it is not a new idea that effective digital inclusion projects demand flexibility in implementation. Approaches that meet people where they are, tailored to the specific context and needs of participants, are the gold standard of digital inclusion. Digital skills goals in particular, which are highly subject to change as technologies and societal standards for their use advance, need to be left open enough to remain relevant as the landscape evolves. At the same time, frameworks need enough specificity to provide direction and promote the use of comparable benchmarks. Incorporating into frameworks a combination of hard and soft skills, as well as both task-based and outcome-based objectives, is one way to achieve that balance.
Assessments
On the other end of the scale, digital skills assessments measure the practical landscape of digital skills in a population and the extent and nature of digital skills gaps. Because assessments make choices about which skills to measure and how to measure them, they are often implicitly, or explicitly, based on standards set by digital skills frameworks. Assessments vary widely in content and administration. For example, the International Telecommunication Union’s Digital Skills Assessment Guidebook, which provides a landscape scan of digital skills frameworks to help countries embark on their own upskilling efforts, categorizes assessments as self-administered, knowledge-based, or performance-based.22 Similarly, Digital Resilience in the American Workforce (DRAW), a federal initiative working to improve the adult education field by providing resources on digital skills, separates assessments into standardized tests, performance-based assessments, and self-assessments.23 There is, of course, further variation within these categories as to what exactly is measured and which populations are assessed.
Perhaps the most widely cited practical skills assessment is the Programme for the International Assessment of Adult Competencies (PIAAC) by the Organization for Economic Co-operation and Development (OECD). Its “Survey of Adult Skills” includes an assessment of participants’ problem solving in technology-rich environments across OECD countries.24 The PIAAC assesses digital skills alongside other adult competencies by giving participants specific tasks on various online platforms and rating their performance. In the United States, the Department of Education administers the National Assessment of Educational Progress, whose Technology and Engineering Literacy (TEL) assessment “uses interactive scenario-based tasks” to assess how well eighth-grade students can apply technological skills to different situations.25 In the test, questions are intended to assess mastery of one of three TEL practices: “understanding technological principles,” “developing solutions and achieving goals,” and “communicating and collaborating,” all within the broader TEL content areas of “technology and society,” “design and systems,” and “information and communication technology.”26 Like the PIAAC, TEL measures respondents’ problem-solving skills, but it does so in the context of an educational system.
In contrast to practical assessments, knowledge-based and theoretical testing can gauge participants’ understanding without a hands-on component. The U.S.-based Pew Research Center, for example, regularly publishes survey results on trends and themes among the U.S. population. In 2019, it surveyed U.S. adults’ knowledge of digital topics like cybersecurity, privacy, and the business side of major digital companies.27
Surveys that require respondents to self-assess still rely on an objective framework of necessary skills and competence levels, but they put the onus on respondents to rate their own proficiency. Because of this added element of subjectivity, they can be especially suited to measuring discrete populations, such as members of a particular group or residents of a specific geographic location, or to showing the relationships among various factors. For example, an EveryoneOn survey during the COVID-19 pandemic asked a sample of low-income U.S. households to self-assess their proficiency at a number of everyday online tasks, such as finding information or applying for a job online.28 The results created a snapshot of that population’s digital skill levels and the relationships between those proficiencies and other trends, like the perceived usefulness of a broadband subscription.
In short, there is a wide landscape of available digital skills literature, frameworks, and data sources, and a number of academic, policy, and public-interest groups have created resources dedicated to taxonomizing and analyzing them. The United States has no shortage of materials at hand, should it decide to make use of them.
Citations
- “Digital Competence Framework for Citizens,” EU Science Hub, European Commission, accessed July 2024, source.
- Riina Vuorikari, Stefano Kluzer, and Yves Punie, DigComp 2.2: The Digital Competence Framework for Citizens (Luxembourg, Luxembourg: Publications Office of the European Union, 2022), 2, source.
- “Digital Competence Framework,” European Commission, source.
- “Digital Competence Framework,” European Commission, source.
- “Digital Competence Framework,” European Commission, source.
- United Nations Educational, Scientific, and Cultural Organization (UNESCO), A Global Framework of Reference on Digital Literacy Skills for Indicator 4.4.2 (Montreal, Quebec: UNESCO Institute of Statistics, 2018), source.
- “What is the DQ Framework? Global Standards for Digital Literacy, Skills, and Readiness,” DQ Institute, accessed July 2024, source.
- “What is the DQ Framework?,” DQ Institute, source.
- Jobs for the Future and World Education, Assessing and Validating Digital Skills: DRAW Detailed Findings and Discussion (United States: Jobs for the Future and World Education), 5, source.
- “IEEE Standard for Digital Intelligence (DQ)—Framework for Digital Literacy, Skills, and Readiness (IEEE 3527.1-2020),” IEEE Standards Association, January 15, 2021, source.
- “ISTE Standards,” International Society for Technology in Education, accessed August 2024, source.
- Skillrise, The Profile of a Lifelong Learner (Arlington, VA: International Society for Technology in Education, 2020), source.
- Skillrise, Lifelong Learner, source.
- “About Northstar,” Northstar Digital Literacy, accessed July 2024, source.
- “Northstar Features,” Northstar Digital Literacy, accessed July 2024, source.
- “Northstar Features,” Northstar Digital Literacy, accessed July 2024, source.
- “Digital Literacy,” Virginia Tech, accessed July 2024, source; Julia Feerrar, “Development of a Framework for Digital Literacy,” Reference Services Review (2019), source.
- Julia Feerrar and Kelsey Hammer, Digital Literacy Framework Toolkit (Blacksburg, VA: Virginia Tech University Libraries, 2020), source.
- Feerrar and Hammer, Framework Toolkit, 3, source.
- Feerrar and Hammer, Framework Toolkit, 3, source.
- Ash et al., Using Data to Advance Digital Skills, 2, source.
- International Telecommunication Union, Digital Skills Assessment Guidebook (Geneva: ITU, 2020), 8, source.
- Jobs for the Future and World Education, DRAW Detailed Findings, 3, source.
- “Survey of Adult Skills (PIAAC),” Organization for Economic Cooperation and Development, accessed July 2024, source.
- “Technology and Engineering Literacy,” National Assessment of Educational Progress (NAEP), National Center for Education Statistics, accessed August 2024, source.
- International Telecommunication Union, Digital Skills Assessment Guidebook, 73, source.
- Emily A. Vogels and Monica Anderson, Americans and Digital Knowledge in 2019 (Washington, DC: Pew Research Center, 2019), source.
- EveryoneOn and Horrigan, Digital Skills and Trust, source.