Introduction

The Internet of Things (IoT) is rapidly expanding, and more and more manufacturers of consumer products like kitchen appliances, televisions, and security systems are connecting those devices to the internet. Last year, New America’s Open Technology Institute (OTI) undertook a project to educate people about the Digital Standard, a new framework for evaluating the privacy and security of internet-connected consumer products and software. The Standard was developed by a group of organizations, including Ranking Digital Rights, in collaboration with Consumer Reports, Aspiration, the Cyber Independent Testing Lab, Disconnect, and other partners. For companies, the Digital Standard is a useful tool for driving privacy- and security-focused development, and a checklist to add to quality assurance processes. For the press, it is a benchmark by which to measure products when reporting on information security and data privacy. Civil society organizations can use the Digital Standard in policy research, or as an example when advocating for companies to implement best practices for privacy and security.

The most common complaint we heard while discussing the Standard with product designers, developers, and manufacturers was that it didn’t provide enough guidance about how to perform the tests that it describes. The Standard contains dozens of “indicators” (individual items detailing the elements of behavior that make up a test) that need to be evaluated when testing a given device. They take the form of a broad statement to be evaluated, such as “users can control how their information is used to target advertising” or “the software does not make use of unsafe functions or libraries.” They describe what the desired outcome should be, but by and large the indicators do not spell out how to arrive at an answer. A more in-depth, step-by-step instruction manual is usually called a “methodology”; however, there is not yet a Digital Standard methodology available to the public.

OTI is setting out to fill that gap by selecting a few representative products and apps and putting them through the Digital Standard wringer. Throughout the process, we’ll take detailed notes on exactly how we judged each indicator: what information we needed to collect in order to measure whether the indicator is met, where we looked for that information and where we found it, and how we interpreted the inevitable vagueness and edge cases. We will publish our notes openly as a resource as we go along, and we’ll update the methodology in real time to keep up with our findings. We hope this enables more people to use the Digital Standard.
