Interoperability Is Fundamental to the Internet
Interoperability is a foundational principle of the internet. In fact, it could reasonably be described as what makes the internet what it is. As Mozilla puts it in a recent working paper, “[I]nteroperability is the internet’s secret sauce.”1 Interoperability is a presumption built into everything from the Transmission Control Protocol (TCP),2 which describes how devices using the internet should exchange information on a millisecond-to-millisecond basis, to the Hypertext Transfer Protocol (HTTP),3 which details how web servers respond to requests and send documents.
Technological interoperability was around long before the advent of the internet. Telephone, telegraph, and radio communications all rely on interoperable protocols, such that someone using AT&T’s phone network in Atlanta can reliably place a call to someone else using Telefônica in Rio de Janeiro.4 Morse code is still Morse code whichever side of the telegraph cable you’re sitting on. The concept even predates electrical communications, and can be seen in the development of everything from railroad gauges to bullet calibers.5
On the internet, interoperability is everywhere. Email is one of the original interoperable services, dating back to the early 1970s and still working today. The World Wide Web is also interoperable on a few different levels. Any modern browser can load any web page out there, and any page can link to any other, creating a global system of interconnected pieces of data.
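Email’s interoperability rests on open standards: any conforming client can construct a message that any conforming server or client can parse. A minimal sketch using Python’s standard `email` library (the addresses are illustrative):

```python
from email.message import EmailMessage

# Build a standard internet mail message (the RFC 5322 format).
# Any standards-conforming mail software can parse this structure.
msg = EmailMessage()
msg["From"] = "alice@example.com"   # illustrative addresses
msg["To"] = "bob@example.org"
msg["Subject"] = "Interoperability in practice"
msg.set_content("This message format is defined by an open standard.")

# The serialized wire format defined by the standard: headers,
# a blank line, then the body.
raw = msg.as_string()
```

Because the format is a published standard rather than one vendor’s internal convention, the same `raw` text is meaningful to every mail system on the network.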
Other online services are less interoperable. Most messaging services don’t work with each other—a WhatsApp user can’t reach someone using iMessage—and social networks have also remained studiously separate from one another.
In the case of these networked computer services, interoperability is often divided into two types: that which is achieved through the use of standard protocols, and that which is achieved through Application Programming Interfaces (APIs).6 Both of these paradigms allow one computer system to interact with another, but they differ in some crucial respects that usually make the standard protocols approach better suited to solving the issues discussed in this paper.7
APIs are interfaces between a single computer system and the outside world. They are a set of well-defined ways to interact with a system to get the system to take some action, to get some response from the system, or often both. APIs are distinguishable from a normal web page in that API invocations and responses are conducted in machine-readable formats rather than through user interfaces that a person would gather information from. A social network system’s API might provide a function that takes in a user’s numerical identifier and returns that user’s name and profile information as long as the entity requesting the information is authenticated and has the relevant permissions. APIs will vary widely from system to system, and can completely change whenever the entity responsible for the system desires, placing the burden of remaining compatible on the third parties using the API.
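The profile-lookup interaction described above might look like the following sketch. The function name, token scheme, and data are all hypothetical, invented for illustration rather than drawn from any real service’s API:

```python
import json

# Hypothetical server-side handler for a call such as
# GET /api/users/{user_id} -- every name here is invented.
USERS = {42: {"name": "Ada Lovelace", "bio": "Mathematician"}}
PERMITTED_TOKENS = {"token-abc"}  # stand-in for real authentication

def get_user_profile(user_id: int, auth_token: str) -> str:
    """Return a user's profile as machine-readable JSON, but only
    to authenticated callers with the relevant permissions."""
    if auth_token not in PERMITTED_TOKENS:
        return json.dumps({"error": "unauthorized"})
    profile = USERS.get(user_id)
    if profile is None:
        return json.dumps({"error": "not found"})
    return json.dumps(profile)

response = get_user_profile(42, "token-abc")
```

Note that the response is structured data for a program to consume, not a rendered page for a person to read; and because the service alone defines this interface, it could rename the function or change the response format at any time.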
One example of a publicly available API is the National Oceanic and Atmospheric Administration’s (NOAA) weather service.8 Given a location, it will return the weather and forecast for that location in a format called JavaScript Object Notation (JSON).
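The machine-readable nature of such a response can be illustrated with a short sketch. The JSON below is an abridged, illustrative stand-in for a forecast response, not a verbatim capture of the real service’s output:

```python
import json

# Abridged, illustrative shape of a weather-forecast response
# (not verbatim output from the actual NOAA/NWS API).
sample_response = """
{
  "properties": {
    "periods": [
      {"name": "Tonight", "temperature": 58, "shortForecast": "Partly Cloudy"},
      {"name": "Tuesday", "temperature": 74, "shortForecast": "Sunny"}
    ]
  }
}
"""

# Any program in any language can parse this structured format
# and pull out exactly the fields it needs.
forecast = json.loads(sample_response)
first = forecast["properties"]["periods"][0]
summary = f'{first["name"]}: {first["shortForecast"]}, {first["temperature"]} F'
```

The point of JSON here is that a program, rather than a person reading a web page, can reliably extract the temperature or forecast text.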
“Standardized protocols” (also known as “open protocols”), the other category of computer interoperability, differ from APIs largely in how they are developed and how they are intended to be used. These differences can have far-reaching effects. Because of these effects, standardized protocols encourage deeper interoperability and better reciprocity between services that implement them than we usually see from APIs.
Standardized protocols are referred to as such because the details of how they operate have been negotiated between many interested parties, agreed to by a cross section of people and organizations that are writing software to implement them, and published publicly for everyone to inspect. This process generally takes place within one of a few organizations responsible for developing these types of standards. The Internet Engineering Task Force (IETF) and the World Wide Web Consortium (W3C) are two of the most prominent, and each has its own scope of topics that it covers.9
The IETF is primarily responsible for the lower-level workings of the internet. They write protocols like TCP, which describes how computers split up pieces of information into small packets to send through the network and reassemble them on the other side. They also develop HTTP, which covers how clients request web pages from servers.
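Because HTTP is a published standard, any client can construct a request that any web server will understand. A minimal sketch of the request format, assembled as text here rather than actually sent over a network (the host and path are illustrative):

```python
# A minimal HTTP/1.1 GET request, built by hand to show the
# standardized wire format that any web server can parse.
host = "example.com"   # illustrative host
path = "/index.html"   # illustrative path

request = (
    f"GET {path} HTTP/1.1\r\n"   # request line: method, target, version
    f"Host: {host}\r\n"          # required header in HTTP/1.1
    "Connection: close\r\n"
    "\r\n"                       # blank line ends the header section
)
```

Every browser, crawler, and command-line tool produces a request of essentially this shape, which is precisely why any of them can talk to any server.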
The W3C focuses on a higher conceptual level, detailing how web browsers turn the code that makes up a web page into the result we see on the screen. They evaluate and agree on new web features, such as Web Bluetooth, which gives web pages the ability (with permission) to speak to Bluetooth devices, and Cascading Style Sheets (CSS), which browsers use to lay out a web page and apply styles ranging from bold fonts to dotted outlines.
These organizations are generally open and work transparently. The IETF welcomes anyone to join its mailing lists and attend its meetings.10 The W3C has a membership process and charges fees to enable its governance, but the public is able to observe and participate via mailing lists and other avenues.11 While it is outside the scope of this paper, it is worth noting that both the IETF and the W3C face continued criticism for lack of accessibility to the public and non-technical audiences, a lack of formal accountability, and for tending to favor business outcomes over other concerns.12
This process—open and transparent dialog between different people and organizations looking to implement a given standard—is, despite some arguable flaws, one of the things that makes standardized protocols a better choice than APIs for implementing interoperability.13 Unlike APIs, which are largely designed by one company for use with its own service without regard to other companies’ services, standard protocols represent a consensus among many different parties on what would be best for the internet. This has a number of effects that serve to incentivize interoperability: (1) if many different platforms are using the standard protocol, new entrants can advertise compatibility and access to those services to potential users; (2) because more developers are working with the protocol, there will be mature and reliable open source software libraries implementing its functionality that anyone can adapt to rapidly get a new service into people’s hands; and (3) the shared ownership of the protocol can help level the playing field so that other services are not subject to the whim of one large platform.
Open protocols are intended to be used and implemented by anyone with the desire to do so, and copyright (and often patent) rights associated with the protocols are therefore affirmatively released by the publishing body.14 This is important in the wake of a recent decision by the U.S. Court of Appeals for the Federal Circuit in Oracle v. Google, which found that APIs can, at least in some circumstances, be entitled to copyright protection (the U.S. Supreme Court has granted certiorari in the case, and it is awaiting oral argument at the time of publication).15 The blanket disclaimers offered by these organizations mean that others can implement the protocol without fear of legal retribution.
Standardized protocols do have some downsides. Primarily, the inclusive and consensus-based method by which they’re developed means that they take a while to progress from idea to final specification. Relatedly, the consensus process means it can be difficult to revise standardized protocols over time, and later revisions to capture new features or technologies can lag behind experiments “in the wild.”16 Getting involved in the definition of specifications can also be intimidating for anyone who hasn’t done it before. Despite these drawbacks, the collaborative and open process involved in developing protocols makes them a better match for overall user freedom, and in particular the aims of interoperability.
Citations
- Chris Riley, A framework for forward-looking tech competition policy (Mozilla, 2019): 21, source
- Internet Engineering Task Force, Transmission Control Protocol, (RFC 793), source.
- Internet Engineering Task Force, HyperText Transfer Protocol (RFC 2068), source.
- See generally, United States Congress Office of Technology Assessment, Critical Connections: Communications for the Future Volume 2 (Washington, DC, 1990): Ch. 11.
- For example, the British Standards Institution, founded in 1901 in response to the need for common bolt thread spacing and steel sections for tramways, “Our History,” The British Standards Institute, source
- Ross Schulman, A Tech Intro to Data Portability (Washington, DC: New America, 2018) source
- Chris Riley, A framework for forward-looking tech competition policy (Mozilla, 2019): 18, source
- “API Web Service,” National Weather Service, source
- See IETF, source; W3C, source
- “Participate in the IETF,” IETF, source
- “Participate,” W3C, source
- See, for example, “The Tao of IETF,” IETF, source , roughly 70 pages of introductory material regarding participating in the IETF.
- Chris Riley, A framework for forward-looking tech competition policy (Mozilla, 2019): 18, source
- “Copyright Policy and Trust Legal Provisions (TLP) Frequently Asked Questions,” IETF, June 22, 2010, source
- Oracle America, Inc v Google, LLC, 886 F.3d 1179 (Fed. Cir. 2018).
- Moxie Marlinspike, “Reflection: The Ecosystem is Moving,” Signal, May 10, 2016, source