Introducing: Evaluating the Digital Standard
The Internet of Things (IoT) is rapidly expanding, and more and more manufacturers of consumer products like kitchen appliances, televisions, and security systems are connecting those devices to the internet. Last year, New America’s Open Technology Institute (OTI) undertook a project to educate people about the Digital Standard, a new framework for evaluating the privacy and security of internet-connected consumer products and software. The Standard was developed by a group of organizations including Ranking Digital Rights, in collaboration with the magazine Consumer Reports. For companies, the Digital Standard is a useful tool to help drive privacy- and security-focused development, and a checklist to add to quality assurance processes. For the press, it is a benchmark by which to measure products while reporting on information security and data privacy. Civil society organizations can use it in policy research, or as an example as they advocate for companies to implement best practices for privacy and security.
The most common complaint we heard while discussing the Standard with product designers, developers, and manufacturers was that it doesn’t provide enough guidance on how to perform the tests it describes. The Standard contains dozens of “indicators” (individual items detailing the elements of behavior that make up a test) that need to be evaluated when testing a given device. They take the form of a broad statement to be evaluated, such as “users can control how their information is used to target advertising” or “the software does not make use of unsafe functions or libraries.” They describe what the desired outcome should be, but by and large the indicators do not spell out how to arrive at an answer. A more in-depth, step-by-step instruction manual is usually called a “methodology”; however, there is not yet a Digital Standard methodology available to the public.
Starting now, OTI is setting out to fill that gap. Over the next several months we will pick a few representative products and apps and put them through the Digital Standard wringer. Throughout the process, we’ll be taking detailed notes on exactly how we judged each indicator, including what information we needed to collect in order to measure whether the indicator is met, where we looked for that information and where we found it, and how we interpreted the inevitable vagueness and edge cases. We will publish our notes openly as a resource for anyone who wants to use the Digital Standard.
Our first product is a smart lock (we don’t plan to publicly disclose the precise make and model we’re analyzing, since the point here is to produce the methodology, not a review of any individual product). In the coming weeks, we'll begin publishing pieces of our methodology on a recurring basis. Once we finish with the smart lock, we’ll turn to our next product and revise and update our methodology as we come across new challenges and learn how to make the Digital Standard accessible to those who want to conduct their own testing.
Keep an eye on the landing page and our Twitter feed to catch our updates, and feel free to give us feedback and advice throughout the process.