Next Steps

With the IIJA programs ongoing and broad consensus on the importance of closing the digital divide, it is the optimal time to organize U.S. upskilling around consistently defined goals and priorities. The best way to do this is to adopt, and universally abide by, some type of digital skills framework that codifies methods of measuring current digital skills and fosters agreement on shared goals. There are two high-level options available to us if we decide to go down this path.

1. Make Use of Available Frameworks

One option would be to make use of one of the available frameworks or assessments already in circulation. There are a number of frameworks already used by various U.S. entities. Going this route would involve simply expanding or encouraging use of a framework already in use around the country or by relevant institutions.

For example, the Organization for Economic Co-operation and Development's "Survey of Adult Skills" results are commonly cited due to their breadth and the authoritative nature of the study. In fact, both the National Skills Coalition and the Department of Education use takeaways from its previous round of surveys, conducted between 2011 and 2018, to estimate the size of digital skills gaps in the United States.1 Results from the next cycle of surveys will begin to be released in late 2024.2

While the OECD's survey provides a helpful view into the digital proficiency of its participants, it is limited by its scale and is released only periodically. Further, the practical nature of the test may not capture every aspect of digital competence (if, for example, specific necessary skills do not emerge in the test), though by assessing overall problem-solving rather than specific task completion it effectively showcases many of the attitudes and softer skills that are mapped onto some digital skill frameworks.

In the United States in particular, Northstar's programs are already widely utilized by institutions across the country. Though they emphasize simpler capacities and discrete, task-based skills, they could provide the underpinnings for practical standards for basic internet use. They offer a helpful combination of standards, curricula, easily administered assessments, and lessons that could be expanded to encompass broader skill sets or a wider range of approaches.

In addition, a number of states are making admirable efforts, many of which may be scalable. Hawaii's digital literacy survey, for example, groups respondents based on their overall approach to technology and digital readiness.3 State digital skill surveys in particular may both provide insight into the state population and be instructive in the creation of a national framework.

Plenty of available landscape scans and guides contain advice on what types of assessments and digital upskilling frameworks will most effectively achieve certain types of digital skills goals. Digital Resilience in the American Workforce (DRAW), for example, provides considerations for users and a checklist to work through when selecting an assessment model.4 The International Telecommunication Union's Guidebook provides explicit and detailed instructions on choosing (or creating) a national digital skills approach.5 Even when resources are intended for specific stakeholders, like adult education institutions, the contours of the discussion remain similar.

There are also a number of broader digital inclusion and digital navigator frameworks in circulation. While they can overlap to varying degrees, skills frameworks are differentiated by their narrower focus on the content of the material to be taught or assessed. A digital skills framework is the tool that digital navigators and similar services use to help upskill a population. The existence of one does not negate the need for the other, and inclusion frameworks can serve as an additional, valuable resource to inform the choice of a skills framework.

While adopting or expanding an existing framework would conserve the resources that would otherwise go into creating one, any chosen framework would still most likely need to be curated and adjusted. Moreover, it might never meet the needs of the U.S. population as directly as would a custom framework created specifically to meet those needs. Policymakers choosing to adapt an existing framework should keep those trade-offs in mind.

2. Create an Original Framework

Rather than adopting an existing framework, the United States could also officially create its own. This approach would yield a tailored framework that directly aligns with U.S. needs and fits within the existing policy landscape. For example, digital skills can include the ability to find and sign up for broadband affordability programs as necessary and the ability to conform to national privacy standards if they exist, both of which may vary based on political context. Particular digital skills may be emphasized if they align with a country's educational context or workforce-related needs.

The downside of this approach, of course, is the opportunity cost of the resources and time that would go into crafting an original framework. Presenting a novel, untested framework rather than adopting an existing, vetted one could also result in less buy-in from communities, the private sector, and local governments, though it could also result in more, because the framework would be specifically designed for the communities it serves (and, ideally, would have taken their input into consideration).

If the United States did decide to go this route, the work wouldn't require starting from scratch. An existing trove of available data sources could be harnessed or expanded to provide the necessary data on digital skills and digital skills gaps. The National Telecommunications and Information Administration's Internet Use Survey, regularly administered as a supplement to the Current Population Survey, collects expansive, authoritative data on the reasons people don't adopt a broadband connection. Some of the responses available to survey-takers get at a lack of interest in a broadband connection, or similar hesitations, that can indicate a lack of digital skills (and therefore provide data on the extent of those gaps).

And the Digital Equity plans that every U.S. state and territory has submitted under the first Digital Equity Act (DEA) program provide a useful, current taxonomy of states' digital inclusion resources, gaps, and their populations' digital affinity.6 They assess broadband adoption rates in the context of "meaningful use" and put forth "measurable objectives for documenting and promoting" digital literacy among covered populations.7 The upcoming Digital Equity Competitive Grant Program funded under the DEA will additionally fund various digital inclusion projects, many of which may directly address digital skills promotion and all of which will be accompanied (per the program's guidelines) by a measurement component.8 Indeed, the Communications Equity and Diversity Council (CEDC) report suggests aggregating best practices from states' Digital Equity Plans into a national digital skills strategy.9

On the local end, organizations focused on advancing digital skills, which range from social service agencies to organizations entirely dedicated to providing digital training to specific populations, regularly collect data to inform their own business models. Massachusetts-based nonprofit Tech Goes Home, for example, administers entry and exit surveys to learners to inform its digital literacy lesson planning and assess its own efficacy.10 The National Digital Inclusion Alliance has similar materials available online, as part of its collectively created "Digital Navigator Model," to help communities and digital skills institutions conduct their own skills assessments.11 Collecting and aggregating data that already exists could help inform a national standard based on our existing priorities. It would also take the necessary step of incorporating direct community feedback and stated priorities into the formation of any resulting framework. Continued emphasis on community-based data and an open-source model framework that facilitated ongoing community input and engagement would be key to the project's success.

The landscape of institutional avenues for work on digital upskilling and data collection is equally rich. As mentioned above, the Federal Communications Commission's CEDC previously included a digital upskilling workstream that advised the United States to adopt a formalized national digital skills strategy and establish metrics for success.12 The working group also emphasized the importance of measuring existing digital skills and program outcomes, recommending that the United States increase data collection and promote best practices and data standardization protocols across organizations that receive funding to promote digital skills. This could easily be baked into a broader framework that underscores particular standards. The Council's current charter includes two separate workstreams, (1) digital empowerment and inclusion and (2) diversity and equity, that both relate to the broader need for digital skills and could provide an avenue for continued research into the area.13 Elsewhere in the government, DRAW has been funded by the U.S. Department of Education's Office of Career, Technical, and Adult Education to improve adult educational outcomes by creating resources for digital upskilling, including a landscape scan of digital skills literature and deep dives into various areas of interest.14 It provides authoritative research and resources on digital skills in a national context.

Citations
  1. Bergson-Shilcock, The New Landscape of Digital Literacy; Saida Mamedova and Emily Pawlowski, A Description of U.S. Adults Who Are Not Digitally Literate (Washington, DC: U.S. Department of Education, 2018).
  2. "Survey of Adult Skills (PIAAC)," OECD.
  3. State of Hawaiʻi Department of Labor, Hawaiʻi Digital Literacy and Readiness Study (Honolulu: State of Hawaiʻi Department of Labor and Industrial Relations Workforce Development, Omnitrak, 2021), 17.
  4. Rachel McDonnell and Shakari Fraser (Digital Resilience in the American Workforce), "Digital Digest: Selecting an Assessment for Digital Literacy," Jobs for the Future, June 9, 2022.
  5. International Telecommunication Union, Digital Skills Assessment Guidebook, 18–40.
  6. "Public Notice Posting of State and Territory BEAD and Digital Equity Plan, Initial Proposals, and Challenge Process Portals," BroadbandUSA, National Telecommunications and Information Administration, accessed July 2024.
  7. "While assessing the current landscape of broadband adoption, States should understand the population of high-speed internet users who engage in meaningful use, referring to how an individual uses their digital literacy skills to enhance educational and employment opportunities." National Telecommunications and Information Administration, Internet For All: Digital Equity Plan Guidance (Washington, DC: NTIA, 2022), 10, 17.
  8. National Telecommunications and Information Administration, Digital Equity Competitive Grant Program, 20.
  9. Communications Equity and Diversity Council, America's Digital Transformation, 5.
  10. "Our Programs," Tech Goes Home, accessed July 2024; Mary-Clare Bietila, Mei Ngo, and Ladonna Norris, "Digital Literacy: The Key to Getting Americans Online," moderated by Jessica Dine, Information Technology and Innovation Foundation, January 11, 2024.
  11. "The Digital Navigator Model," National Digital Inclusion Alliance, accessed August 2024.
  12. Communications Equity and Diversity Council, America's Digital Transformation.
  13. "Communications Equity and Diversity Council," FCC.
  14. "Digital Resilience in the American Workforce (DRAW)," The Literacy Information and Communication System (LINCS), September 2021.