Next Steps
With the IIJA programs ongoing and broad consensus on the importance of closing the digital divide, now is the optimal time to organize U.S. upskilling around consistently defined goals and priorities. The best way to do this is to adopt, and universally abide by, some type of digital skills framework that codifies methods of measuring current digital skills and fosters agreement on shared goals. There are two high-level options available to us if we decide to go down this path.
1. Make Use of Available Frameworks
One option would be to make use of one of the available frameworks or assessments already in circulation. There are a number of frameworks already used by various U.S. entities. Going this route would involve simply expanding or encouraging use of a framework already in use around the country or by relevant institutions.
For example, the Organisation for Economic Co-operation and Development’s “Survey of Adult Skills” results are commonly cited due to their breadth and the authoritative nature of the study. In fact, both the National Skills Coalition and the Department of Education use takeaways from its previous round of surveys—which took place between 2011 and 2018—to estimate the size of digital skills gaps in the United States.1 Results from the next cycle of surveys will begin to be released in late 2024.2
While the OECD’s survey provides a helpful view into the digital proficiency of its participants, it is limited in scale and released only periodically. Further, the test’s practical orientation may not provide insight into all aspects of digital competence (if, for example, specific necessary skills do not emerge in the test). At the same time, by assessing overall problem-solving rather than completion of specific tasks, it effectively showcases many of the attitudes and softer skills that are mapped onto some digital skills frameworks.
In the United States in particular, Northstar’s programs are already widely utilized by institutions across the country. Though they emphasize simpler capacities and discrete, task-based skills, they could provide the underpinnings for practical standards for basic internet use. They offer a helpful combination of standards, curricula, easily administered assessments, and lessons that could be expanded to encompass broader skill sets or a wider range of approaches.
In addition, a number of states are making admirable efforts, many of which may be scalable. Hawaii’s digital literacy survey, for example, groups respondents based on their overall approach to technology and digital readiness.3 State digital skill surveys in particular may both provide insight into the state population and be instructive in the creation of a national framework.
Plenty of available landscape scans and guides contain advice on which types of assessments and digital upskilling frameworks will most effectively achieve certain types of digital skills goals. Digital Resilience in the American Workforce (DRAW), for example, provides considerations for users and a checklist to work through when selecting an assessment model.4 The International Telecommunication Union’s Guidebook provides explicit and detailed instructions on choosing (or creating) a national digital skills approach.5 Even when resources are intended for specific stakeholders, like adult education institutions, the contours of the discussion remain similar.
There are also a number of broader digital inclusion and digital navigator frameworks in circulation. While they can overlap to varying degrees, skills frameworks are differentiated by their narrower focus on the content of the material to be taught or assessed. A digital skills framework is the tool that digital navigators and similar services use to help upskill a population. Existence of one doesn’t negate the need for the other, but inclusion frameworks can serve as an additional, valuable resource to inform the choice of a skills framework.
While adopting or expanding an existing framework would conserve the resources that would otherwise go into creating one, any chosen framework would still most likely need to be curated and adjusted. Moreover, it might never meet the needs of the U.S. population as directly as would a custom framework created specifically to meet those needs. Policymakers choosing to adapt an existing framework should keep those trade-offs in mind.
2. Create an Original Framework
Rather than adopting an existing framework, the United States could also officially create its own. This approach would lead to a tailored framework that could directly align with U.S. needs and fit within the existing policy landscape. For example, digital skills can include the ability to find and sign up for broadband affordability programs as necessary and the ability to conform to national privacy standards if they exist, both of which may vary based on political context. Particular digital skills may be emphasized if they align with a country’s educational context or workforce-related needs.
The downside of this approach, of course, is the opportunity cost of the resources and time that would go into crafting an original framework. Presenting a novel, untested framework rather than adopting an existing, vetted one could also result in less buy-in by communities, the private sector, and local governments—though it could also result in more because it would be specifically designed for the communities it serves (and, ideally, would have taken their input into consideration).
If the United States did decide to go this route, the work wouldn’t require starting from scratch. An existing trove of available data sources could be harnessed or expanded to provide the necessary data on digital skills and digital skills gaps. The National Telecommunications and Information Administration’s Internet Use survey is regularly administered through the Current Population Survey and collects expansive, authoritative data on the reasons people don’t adopt a broadband connection. Some of the responses available to survey-takers capture a lack of interest in a broadband connection, or similar hesitations, that can indicate a lack of digital skills (and therefore provide data on the extent of those gaps).
And the Digital Equity plans that every U.S. state and territory has submitted under the first Digital Equity Act (DEA) program provide a useful—and current—taxonomy of states’ digital inclusion resources, gaps, and their populations’ digital affinity.6 They assess broadband adoption rates in the context of “meaningful use” and put forth “measurable objectives for documenting and promoting” digital literacy among covered populations.7 The upcoming Digital Equity Competitive Grant Program, also funded under the DEA, will additionally fund various digital inclusion projects, many of which may directly address digital skills promotion and all of which will be accompanied (per the program’s guidelines) by a measurement component.8 Indeed, the Communications Equity and Diversity Council (CEDC) report suggests aggregating best practices from states’ Digital Equity Plans into a national digital skills strategy.9
At the local level, organizations focused on advancing digital skills, which range from social service agencies to organizations entirely dedicated to providing digital training to specific populations, regularly collect data to inform their own business models. Massachusetts-based nonprofit Tech Goes Home, for example, administers entry and exit surveys to learners to inform its digital literacy lesson planning and assess its own efficacy.10 The National Digital Inclusion Alliance has similar materials available online—as part of its collectively created “Digital Navigator Model”—to help communities and digital skills institutions conduct their own skills assessments.11 Collecting and aggregating data that already exists could help inform a national standard based on our existing priorities. It would also take the necessary step of incorporating direct community feedback and stated priorities into the formation of any resulting framework. Continued emphasis on community-based data and an open-source model framework that facilitated ongoing community input and engagement would be key to the project’s success.
The landscape of institutional avenues for work on digital upskilling and data collection is equally rich. As mentioned above, the Federal Communications Commission’s CEDC previously included a digital upskilling workstream that advised the United States to adopt a formalized national digital skills strategy and establish metrics for success.12 The working group also emphasized the importance of measuring existing digital skills and program outcomes, recommending that the United States increase data collection and establish best practices and data standardization protocols across organizations that receive funding to promote digital skills. This could easily be baked into a broader framework that underscores particular standards. The Council’s current charter includes two separate workstreams—(1) digital empowerment and inclusion and (2) diversity and equity—that both relate to the broader need for digital skills and could provide an avenue for continued research into the area.13 Elsewhere in the government, DRAW has been funded by the U.S. Department of Education’s Office of Career, Technical, and Adult Education to improve adult educational outcomes by creating resources for digital upskilling, including a landscape scan of digital skills literature and deep dives into various areas of interest.14 It provides authoritative research and resources on digital skills in a national context.
Citations
1. Bergson-Shilcock, The New Landscape of Digital Literacy, source; Saida Mamedova and Emily Pawlowski, A Description of U.S. Adults Who Are Not Digitally Literate (Washington, DC: U.S. Department of Education, 2018), source.
2. “Survey of Adult Skills (PIAAC),” OECD, source.
3. State of Hawai‘i Department of Labor, Hawai‘i Digital Literacy and Readiness Study (Honolulu: State of Hawai‘i Department of Labor and Industrial Relations Workforce Development, Omnitrak, 2021), 17, source.
4. Rachel McDonnell and Shakari Fraser (Digital Resilience in the American Workforce), “Digital Digest: Selecting an Assessment for Digital Literacy,” Jobs for the Future, June 9, 2022, source.
5. International Telecommunication Union, Digital Skills Assessment Guidebook, 18–40, source.
6. “Public Notice Posting of State and Territory BEAD and Digital Equity Plan, Initial Proposals, and Challenge Process Portals,” BroadbandUSA, National Telecommunications and Information Administration, accessed July 2024, source.
7. “While assessing the current landscape of broadband adoption, States should understand the population of high-speed internet users who engage in meaningful use, referring to how an individual uses their digital literacy skills to enhance educational and employment opportunities.” National Telecommunications and Information Administration, Internet For All: Digital Equity Plan Guidance (Washington, DC: NTIA, 2022), 10, 17, source.
8. National Telecommunications and Information Administration, Digital Equity Competitive Grant Program, 20, source.
9. Communications Equity and Diversity Council, America’s Digital Transformation, 5, source.
10. “Our Programs,” Tech Goes Home, accessed July 2024, source; Mary-Clare Bietila, Mei Ngo, and Ladonna Norris, “Digital Literacy: The Key to Getting Americans Online,” moderated by Jessica Dine, Information Technology and Innovation Foundation, January 11, 2024, source.
11. “The Digital Navigator Model,” National Digital Inclusion Alliance, accessed August 2024, source.
12. Communications Equity and Diversity Council, America’s Digital Transformation, source.
13. “Communications Equity and Diversity Council,” FCC, source.
14. “Digital Resilience in the American Workforce (DRAW),” The Literacy Information and Communication System (LINCS), September 2021, source.