Introduction and Executive Summary

In 2016, New America’s Open Technology Institute and Harvard University’s Berkman Klein Center for Internet & Society released the Transparency Reporting Toolkit, a joint project aimed at making it easier for companies to create and improve their transparency reports around government demands for user data, and at making transparency reporting more consistent, easier to understand, and more effective.

As the internet has become an increasingly important tool for free expression for individuals around the world, so too have the platforms and networks that carry that expression become speech gatekeepers, often removing or blocking users’ content for various legal or policy reasons. As questions and controversies around these internet companies’ content takedown practices have multiplied, many have expanded their transparency reports to include data on content takedowns and restrictions as well as network shutdowns and service interruptions. However, as with reporting on government demands for user data, there is a great deal of variance in the approaches, styles, and scope of data that companies are using in their reporting, making it hard to combine or compare data in a meaningful way.

Transparency reporting on content takedowns is critically important because it helps hold companies accountable in their role as gatekeepers of our online speech, and helps the public identify where they think the companies are doing too much, or not enough, to address content issues on their platforms and networks. Transparency reporting also offers a number of benefits to the companies themselves, including helping to build trust with their users and policymakers. For companies that are consistently under pressure to act on problematic speech, it is an opportunity to highlight to users and policymakers the responses they have implemented, to communicate the size and complexity of the problems they are addressing, and to demonstrate the impact they have had thus far. Reporting can also help reveal where particular governments or laws are driving a disproportionate amount of content takedowns, thus helping hold public authorities to account as well. For all these reasons, tools for promoting, improving, and standardizing content takedown reporting are much needed, hence this toolkit.

For this edition of the Transparency Reporting Toolkit, focused on content takedown reporting, we’ve relied not only on extensive consultations and convenings with a wide range of companies and civil society experts but also on the rigorous work of the Ranking Digital Rights project at New America, which has developed a broad set of indicators for judging how well companies perform when it comes to protecting human rights. We’ve also applied the lessons we’ve learned from our recent Getting Companies to Do The Right Thing project, which included an extensive history of the practice of transparency reporting and how it has evolved over the years.

We surveyed 24 international and 46 domestic internet and telecommunications companies that issue transparency reports of some kind, and found that 35 of them have reported on content-related demands and takedowns, mostly falling into six categories:

  • Government and other legal content demands
  • Copyright requests
  • Trademark requests
  • Network shutdowns and service interruptions
  • Right to be Forgotten delisting requests
  • Community Guidelines-based content takedowns

Charts 1a and 1b reflect which companies are issuing reports on each of these six categories.

Based on this survey, we identified 11 general best practices for content takedown reporting that are applicable across these different types of content reports:

  • Issuing regular reports on clearly and consistently delineated reporting periods: Companies should consistently publish reports at regular intervals covering clearly defined reporting periods.
  • Issuing reports specific to the type of demand: Companies should report separately on different types of content demands.
  • Reporting on types of demands using specific numbers: Companies should report separately on the number of demands they have received for each category.
  • Breaking down demands by country: Companies should break down the demands they have received for each category by country.
  • Reporting on categories of objectionable content targeted by demands: Companies should break down demands received by the different types of objectionable content (e.g., nonconsensual pornography, extremist content) that the demands target.
  • Reporting on products targeted by demands: Companies should break down demands by which of their products or services the demands target.
  • Reporting on specific government agencies/parties that submitted demands: Companies should break down demands received by which government agencies or parties submitted them.
  • Specifying which laws pertain to specific demands: Companies should specify which laws are associated with reported demands.
  • Reporting on the number of accounts and items specified in demands: Companies should break down how many accounts and pieces of content were specified in demands they received.
  • Reporting on the number of accounts and items impacted by demands: Companies should break down how many accounts and pieces of content were impacted (i.e., taken down or otherwise restricted) in response to demands.
  • Reporting on how the company responded to demands: Companies should break down their different responses to demands (e.g., complied, partially complied, rejected, etc.).

Charts 2a through 6 in the appendix reflect, for the first five of the six different categories of reports, which companies satisfy these best practices. We did not include a chart for Community Guidelines-based content takedowns because that practice is still relatively rare and highly variable in how it is reported on.

Most of the above best practices focus on quantitative transparency: counting incidents and items. Our survey also revealed several additional best practices that companies should follow, mostly aimed either at enhancing the qualitative transparency around takedowns by offering greater context and explanation, or at making the reported data more useful. These include:

  • Defining terms clearly
  • Providing meaningful explanations of internal policies
  • Offering case studies to illustrate the company’s practices and the issues it faces
  • Reporting on specific notices where reasonable and permitted by law
  • Providing meaningful numbers that reflect how many pieces of content or accounts were taken down, blocked or otherwise restricted based on automated flagging or review
  • Linking relevant reports to one another
  • Publishing reports at static and functioning URLs
  • Publishing data in a structured data format (a brief illustrative sketch follows this list)
  • Publishing reports using a non-restrictive Creative Commons license
  • Offering a Frequently Asked Questions section for the report
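
To illustrate what publishing takedown data in a structured format could look like, here is a minimal, purely hypothetical sketch in Python: the field names and example values are our own illustration and are not drawn from any company’s actual report. The record mirrors the quantitative best practices listed earlier (reporting period, country, demand type, content category, product, requesting party, legal basis, items specified and actioned, and company response).

    import json
    from dataclasses import dataclass, asdict

    # Hypothetical structure for a single content-takedown entry. The field
    # names are illustrative only; they mirror the quantitative best
    # practices described above.
    @dataclass
    class TakedownRecord:
        reporting_period: str   # e.g., "2018-H1"
        country: str            # country of the requesting party
        demand_type: str        # e.g., "government_legal", "copyright"
        content_category: str   # e.g., "defamation", "extremist_content"
        product: str            # product or service targeted by the demand
        requesting_party: str   # agency or party that submitted the demand
        legal_basis: str        # law cited in the demand, if any
        items_specified: int    # pieces of content named in the demand
        items_actioned: int     # pieces of content removed or restricted
        response: str           # "complied", "partially_complied", "rejected"

    # An invented example entry, for illustration only.
    record = TakedownRecord(
        reporting_period="2018-H1",
        country="DE",
        demand_type="government_legal",
        content_category="hate_speech",
        product="video_hosting",
        requesting_party="national_media_regulator",
        legal_basis="national_content_law",
        items_specified=12,
        items_actioned=9,
        response="partially_complied",
    )

    # Serializing entries as JSON (or CSV) makes a report machine-readable
    # and easy to aggregate across companies and reporting periods.
    print(json.dumps(asdict(record), indent=2))

Publishing entries like this as JSON or CSV alongside the narrative report would make it far easier for researchers to combine and compare data across companies and reporting periods, which is precisely where today’s varied reporting formats fall short.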

Finally, our survey identified a handful of additional best practices specific to reporting on certain categories of takedowns:

  • Reporting on copyright requests should include
    • The number and impact of counter-notices to DMCA-based requests
    • The different format or medium of content (audio, video, photograph, text, etc.) targeted by copyright requests
    • Data on who is submitting copyright requests
  • Reporting on network shutdowns and service interruptions should include
    • The date or date range of the network shutdown or service interruption
    • The duration of the network shutdown or service interruption
  • Right to be Forgotten delisting requests should include
    • The categories of content requested to be delisted (e.g., personal information, professional information, information about crime and professional wrongdoing, etc.)
    • The categories of websites targeted (e.g., news, social network, directory, etc.)

These recommendations, which are not legal advice but simply our survey of the features of current reports, reflect our understanding of current best practices in this space. Our survey was based on transparency reports published as of August 2018. (Notably, South Korean internet company Daum Kakao has since issued a new transparency report that is unfortunately much less detailed than its previous reports; the data and examples regarding Daum Kakao in this toolkit are based on its prior report, for which we link to archived versions at Archive.org.)

As companies continue to expand their transparency reporting on these issues, we hope to revisit and update these resources. If you believe we have missed anything, please feel free to reach out and let us know. We hope this edition of the Toolkit will help facilitate discussions around best practices for content takedown reporting, as well as foster future discussions on best practices for other types of transparency reporting.
