OTI and EveryoneOn Release Adoption Metrics Rubric and Instruments

Blog Post
Aug. 28, 2015

Download the evaluation instruments.

At least a fifth of US households do not currently have Internet service, and inequalities between those who have access and those who do not are stark. For example, the most recent data from the Current Population Survey administered by the US Census Bureau (released in 2014) shows that among the wealthiest households ($100,000 or more), 97% have computers and 96% have Internet access at home. Among low-income households ($25,000 or less), computer use stands at 57%, with Internet use at 49%. Internet use is only 39% among those without a high school diploma, 61% among African-Americans, 63% among Latinos, and 63% among rural residents.

EveryoneOn is an independent nonprofit working with public and private funders, partners, and agencies “to help all Americans access technology through free digital literacy training, discounted high-speed Internet, and low-cost and refurbished computers.” EveryoneOn has engaged the Open Technology Institute (OTI) to develop a framework and tools for evaluation of EveryoneOn’s efforts to eliminate the “digital divide” in the United States by making high-speed, low-cost Internet service available to the one in five Americans who do not use the Internet.

In order to build a holistic understanding of the effectiveness of EveryoneOn’s programs, OTI has examined the factors that influence participants’ choices to engage (or not engage) with Internet service offers. Drawing on our experience evaluating efforts such as the Broadband Technology Opportunities Program; on scholarly literature and instruments from the US Census Bureau’s American Community Survey (ACS), the NTIA, and the Pew Research Center’s Internet and American Life Project; on state-level efforts such as the California Emerging Technology Fund (CETF); and on the Internet Use Survey at the University of Chicago, OTI has compiled a rubric of standard indicators pertaining to the choices people make about whether to use, and/or subscribe to, digital technologies and Internet services.

We have tailored this rubric to emphasize outcomes-oriented indicators related to “meaningful broadband adoption,” a framework developed by OTI that takes into account the contextual and historical social factors shaping digital choices. Metrics designed to understand meaningful broadband adoption thus measure not only progress toward broader subscription rates among traditionally underserved and demographically likely non-adopters, but also capture comfort with digital tools and the availability, effectiveness, and impact of support and training resources. Meaningful metrics can also be leveraged to develop a holistic picture of overall community health, including broader social support networks, and the bearing these factors have on adoption outcomes. They do not simply measure the uptake of digital tools among individuals; they also gauge outcomes for families, civic participation, and goals as defined by the communities themselves. Finally, meaningful metrics can be adapted and tailored qualitatively and cooperatively through engagement with community partners.

While many stakeholders will likely suggest measuring anticipated outcomes of increased broadband adoption, such as gains in employment or educational attainment, it is often hard to gauge the impact of digital access on these outcomes without longitudinal, multi-site data collection followed by complex statistical analysis. That kind of analysis may become feasible after a sustained period of data collection across digital inclusion programs. For discrete, shorter-term impact assessments over the next three to five years, however, we suggest collecting qualitative data on these outcomes via focus groups to inform an eventual meta-analysis.

Because digital access is a vital issue affecting economic viability and well-being across the population, researchers, practitioners, and policymakers need adequate information for analyzing the success and impact of different digital inclusion measures. With transparency and data openness from leading existing public-private partnership efforts like EveryoneOn, experts will gain much-needed information to improve and design more appropriate measures over time. Currently EveryoneOn hosts a growing set of data on its portal, collected via its own campaigns, from its enrollment partners, and from some of its Internet Service Provider (ISP) partners. The current evaluation planning process represents an important opportunity to create a vital open data resource that will inform our understanding of digital access, relevance, and adoption throughout the field.

We have also provided guidelines for best practices with regard to ethical data collection, management, and protection, as well as users’ informed consent to participate in evaluation and research.

Meaningful Metrics Rubric

Degree of comfort

  • Factors:
    • Perception of and experience with ISP companies, community organizations, or other partners
    • Skill/difficulty with hardware or software
    • Comfort with employing digital tools for a range of uses
  • Sample metrics:
    • Qualitative (assess via analysis of survey, focus group, and interview data)
    • Aggregated, anonymized usage data
    • # and type of interactions with provider/platform website and IVR systems (see the sketch following this list)
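
One way to operationalize the usage metrics above is to aggregate interaction logs by type after stripping or pseudonymizing personal identifiers. The following is a minimal sketch, assuming a hypothetical CSV export of platform interaction events with user_id, channel, and event_type columns; the file name and schema are illustrative assumptions, not EveryoneOn’s actual data format.

```python
import hashlib

import pandas as pd

# Hypothetical export of platform interaction events.
# Assumed columns: user_id, channel ("web" or "ivr"), event_type, timestamp
events = pd.read_csv("interaction_events.csv")

# Pseudonymize the user identifier before any analysis or sharing.
# (In practice a salted hash or keyed HMAC is stronger than a bare hash.)
events["user_hash"] = events["user_id"].astype(str).apply(
    lambda uid: hashlib.sha256(uid.encode("utf-8")).hexdigest()
)
events = events.drop(columns=["user_id"])

# Count interactions by channel and event type (e.g., page views,
# IVR menu selections), aggregated across all users.
interaction_counts = (
    events.groupby(["channel", "event_type"])
    .size()
    .rename("count")
    .reset_index()
)

# Distinct users per channel gives a rough measure of reach without
# exposing individual-level usage.
reach = events.groupby("channel")["user_hash"].nunique().rename("unique_users")

print(interaction_counts)
print(reach)
```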

Availability of support

  • Factors:
    • Availability of training
    • Presence of ongoing support resources
    • Presence of social support network in community
  • Sample metrics:
    • # hours of training offered (by various partners)
    • # and type of partners offering training
    • Assessment of different training programs’ impact on participant engagement
    • # support requests
    • Qualitative data from training exit surveys

Modality

  • Factors:
    • Types of devices available or already owned
    • Availability and choice of means of access (e.g., fixed vs. mobile; home, work, or school; satellite or cable)
    • How type of device and means of access inform kinds of online activity
  • Sample metrics:
    • # of devices distributed via programs
    • # and type of devices logging on to household networks
    • Qualitative (assess via analysis of survey, focus group, and interview data)

Cost/relevance

  • Factors:
    • The price point and quality of service at which people feel the benefits of subscription are worth the cost
  • Sample metrics:
    • # of subscribers (via different partners/with different ISPs, signing up under different conditions, etc.)
    • Duration of subscriptions (# months active; see the sketch following this list)
    • Suspensions due to non-payment
    • Amount of data and upload/download speeds of offers
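
As an illustration of how the subscription-duration and suspension metrics above might be computed, the sketch below assumes a hypothetical table of subscription records with start dates, optional end dates, a partner name, and a status field. The schema is an assumption made for illustration, not the actual format used by EveryoneOn or its ISP partners.

```python
import pandas as pd

# Hypothetical subscription records. Assumed columns:
# subscriber_id, partner, start_date, end_date (blank if still active), status
subs = pd.read_csv("subscriptions.csv", parse_dates=["start_date", "end_date"])

# Treat open-ended subscriptions as active through today.
today = pd.Timestamp.today().normalize()
end = subs["end_date"].fillna(today)

# Approximate number of months each subscription has been active.
subs["months_active"] = (end - subs["start_date"]).dt.days / 30.44

# Subscriber counts and median duration, broken out by partner.
duration_by_partner = subs.groupby("partner")["months_active"].agg(
    ["count", "median"]
)

# Share of subscriptions suspended for non-payment, by partner.
suspension_rate = (
    subs.assign(suspended=subs["status"].eq("suspended_nonpayment"))
    .groupby("partner")["suspended"]
    .mean()
)

print(duration_by_partner)
print(suspension_rate)
```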

External/contextual factors

  • Factors:
    • Awareness of offers
    • Subscription eligibility requirements
    • Demographic factors and local conditions
  • Sample metrics:
    • # ad impressions
    • # people entering program after hearing about it from ads
    • Census and other demographic data
    • Comparative eligibility data from different providers

Impact/outcome (change over time from baseline)

  • Factors:
    • Improved employment status
    • Skills attainment
    • Economic and civic participation
    • Access to services and benefits
    • Access to social support and connectivity
  • Sample metrics:
    • # jobs applied for and attained following training and/or uptake of services
    • # and types of digital skills attained by participants
    • Evidence of participants contributing to economic or civic activity via digital means
    • # of participants accessing government services via digital means
    • Qualitative data on perception of opportunities, degree of comfort with digital tools, or willingness to participate in economic and civic opportunity
    • Evidence of a change in economic and social conditions that can be tied to broadband uptake, for example via statistical modeling (sketched below)
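
As one illustration of the statistical modeling mentioned in the last item, the sketch below fits a simple linear regression of the change in an outcome measure on an indicator of broadband uptake, with demographic and site controls. The data file, variable names, and model are hypothetical assumptions; a real analysis would also need to address selection effects, clustering by site, and missing data.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical participant-level data: one row per participant with a
# baseline and follow-up outcome measure (e.g., monthly employment hours),
# an indicator for broadband uptake, and controls collected at enrollment.
df = pd.read_csv("participant_panel.csv")

# Change from baseline is the outcome of interest.
df["outcome_change"] = df["outcome_followup"] - df["outcome_baseline"]

# Ordinary least squares with an income control and site fixed effects;
# C(...) treats the site identifier as categorical. Clustered standard
# errors by site would be advisable with multi-site data.
model = smf.ols(
    "outcome_change ~ adopted_broadband + household_income + C(site)",
    data=df,
).fit()

print(model.summary())
```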