General Best Practices for Content Takedown Reporting

Through our survey of internet and telecommunications companies and their approaches to reporting on content demands, we have identified a set of general best practices for such reporting. Some of these best practices are applicable to all forms of content- and network-related reporting, regardless of the issue area they fall under. These include:

  • Issuing regular reports on clearly and consistently delineated reporting periods
  • Issuing reports specific to the type of demand
  • Reporting on types of demands using specific numbers
  • Breaking down demands by country
  • Reporting on categories of objectionable content targeted by demands
  • Reporting on products targeted by demands
  • Reporting on specific government agencies/parties that submitted demands
  • Specifying which laws pertain to specific demands
  • Reporting on the number of accounts and items specified in demands
  • Reporting on the number of accounts and items impacted by demands
  • Reporting on how the company responded to demands

Below are brief overviews of what these 11 best practices include and why they are important, along with examples of companies that we believe have demonstrated them in an outstanding manner.

1. Issuing regular reports on clearly and consistently delineated reporting periods

The first and most fundamental best practice in transparency reporting of any kind is to regularly issue such reports on a consistent timeline, and clearly and consistently delineate the reporting period for each issued report.

Currently, there is no industry-wide standard for how often companies should publish transparency reports, and reports typically range from quarterly to annual publication schedules. Because reports issued more often and covering shorter periods can offer more granular information, the best practice is to publish quarterly, if practical. Regardless of the publication schedule a company adopts, it should clearly define the time period covered by each report, and subsequent reports should cover the same length of time so that reports may be easily compared. If a company fails to publish a transparency report for an expected period, it should explain why.

As shown in Charts 1a and 1b, a number of internet and telecommunications platforms have been inconsistent in publishing their reports. Below are some examples of companies that have issued regular reports, using the three most common time periods: quarterly, biannually, and annually.

CREDO Mobile: CREDO Mobile publishes a transparency report quarterly. Each report covers three months.

Verizon: Verizon publishes a transparency report biannually, covering the periods of January-June and July-December, respectively.

Reddit: Reddit publishes a transparency report annually. The report covers the period of January 1 – December 31 of each year.

2. Issuing reports specific to the type of demand

By reporting separately on different types of demands or takedowns—breaking down numbers between, e.g., government and other legal demands, copyright requests, trademark requests, Right To Be Forgotten delisting requests, and community guidelines violations, rather than lumping them all together—a company is able to highlight the breadth of demands it has received and the volume and impact of each of these takedown categories. Some examples of companies that exhibit this best practice in their transparency reports are:

Automattic: In its latest transparency report, which covers the first half of 2018, Automattic separately reports that it received 9,166 total copyright notices and 318 trademark notices.

Microsoft: In its latest transparency report, Microsoft separately reports on government requests for content removal, copyright removal requests, and Right to be Forgotten requests.

Snap Inc.: In its latest transparency report, which covers the second half of 2017, Snap Inc. separately reports on the number of government-issued content removal requests and copyright requests it received.

Some companies report on the number of requests per issue area in unique ways and do not always explain the rationale behind such approaches. Telefonica is one example of this. In some cases, Telefonica separately reports the number of requests received in a country for the blocking and filtering of contents and for geographical or temporary suspensions of the service. However, in some countries, such as Argentina, it lumps these figures together with the number of requests it received for access to metadata. Ideally, companies will report consistently, or at least explain when they do not.

3. Reporting on types of demands using specific numbers

By reporting separately on the number of demands a company receives over a given time period for each different type of takedown, companies can highlight which types of demands are most common. This best practice requires the publication of specific numbers; percentages alone are not sufficient (though they are a helpful supplement to specific numbers), nor are numeric ranges. Some examples of companies that exhibit this best practice in their transparency reports are:

Apple: Apple specifies that between July and December 2017 it received 7 requests for account restriction/account deletion globally.

Oath: In its latest transparency report, which covers the second half of 2017, Oath reports that it received 77 total government-issued removal requests from around the world.

Pinterest: In its latest transparency report, which covers the period of January – March 2018, Pinterest reports that it received 26 government-issued content removal requests.

4. Breaking down demands by country

In order to demonstrate the geographic scope of demands a company is receiving, and to highlight which countries’ governments or laws are most actively restricting online free expression, companies should specify how many requests originate from each specific country. The most effective way to provide this information is to create a list or map of all countries relevant to a company’s operations and indicate the number of demands received from each. Some examples of companies that exhibit this best practice are:

Mapbox: Mapbox reports on the number of DMCA takedown notices and the number of government requests to withhold content it has received using a map of the world that shows a specific number for each country. (So far, Mapbox has not received any such requests, so all of the figures on its map are 0.)

Microsoft: In its latest transparency report, which covers the period July-December 2017, Microsoft breaks down data on Right to be Forgotten delisting requests by country. Some of the countries included in this breakdown are Austria, Finland, Germany, Romania, Russia and the United Kingdom.

Telenor: In its March 2017 Authority Requests Disclosure Report, Telenor breaks down the number of requests it has received by the countries it operates in. The countries covered in this report are Norway, Sweden, Denmark, Hungary, Serbia, Montenegro, Bulgaria, Pakistan, India, Bangladesh, Myanmar, Thailand and Malaysia.

5. Reporting on categories of objectionable content targeted by demands

By reporting on the categories of objectionable content targeted by different types of content demands, a company can highlight the varying reasons that parties are asking for content to come down, and also indirectly highlight the relative prevalence of different types of problematic content on their services. This data can be particularly revealing when combined with the prior best practice of breaking down demands by country, allowing readers to spot (e.g.) which countries are most aggressively seeking to censor which types of content. Some examples of categories of objectionable content include extremist content, obscenity and child pornography, content that violates privacy, and defamatory content.

In order to receive credit for this best practice, companies need to provide specific numbers breaking down how many requests pertained to what kinds of objectionable content.

In the context of government and other legal content demands, satisfying this best practice means highlighting the types of objectionable content specified in requests.

Google: In its latest transparency report on government requests to remove content, which covers the second half of 2017, Google enables users to filter the number of requests received and the number of items specified within those requests based on the reasons behind them. Users can also select individual countries and view quantitative breakdowns on the number of requests received for each category of objectionable content relevant to that country. Common reasons cited in government requests to remove content include national security, defamation, privacy and security, hate speech, drug abuse, religious offense, impersonation, obscenity and nudity. Google also has a separate dedicated section that explains the most common reasons cited for content removal.

In the context of copyright requests, because the category of objectionable content—content that allegedly infringes on a copyright holder’s rights—is self-evident, satisfying this best practice means reporting on the format or medium of content being targeted by requests.

Tumblr: In its latest transparency report on copyright and trademark requests, which covers the second half of 2017, Tumblr highlights the kinds of content DMCA requests targeted, including content formats (some Tumblr-specific) such as images, text, audio, video, links, asks, quotes and chats. This approach is also applicable to trademark requests; however, currently no companies report on the format or medium of content being targeted by trademark requests.

In the context of network shutdowns and service interruptions, satisfying this best practice means reporting on the reasons behind those shutdowns and interruptions. This does not always require specific numbers, and can be qualitative instead.

Millicom: In its latest Law Enforcement Disclosure Report, which covers the year 2017, Millicom reports that since 2014, authorities in El Salvador and Honduras have required telecommunications companies to shut down services or reduce signal capacity in and around prisons in order to prevent criminal gangs from using smuggled cell phones.

In the context of Right to be Forgotten delisting requests, satisfying this best practice means reporting on the categories of content targeted by delisting requests.

Google: In its latest transparency report on search removals under European privacy law, Google reports on the “categories of content requested for delisting”. The categories of content include insufficient information, personal information, professional information, self-authored, crime and professional wrongdoing.

6. Reporting on products targeted by demands

Some companies maintain and support multiple products. For example, Facebook offers its core product, the Facebook social network service, as well as Instagram, WhatsApp, and its Facebook Messenger app/service. By specifying which of a company’s products are being targeted by which demands, a transparency report can better reflect how those demands are impacting the range of its offerings, highlight differences in impact between its services, and better enable comparison of that impact with other companies’ comparable services. Some examples of companies that exhibit this best practice are:

Google: In its latest transparency report on government requests to remove content, which covers the second half of 2017, Google enables users to filter the number of requests received and the number of content items specified within these requests based on the product that was targeted by requests, broken down into four categories: Web Search, YouTube, Google AdWords, and All Others. Users can also select individual countries and explore which of Google’s products have been targeted there, and by how many content removal requests. Additionally, Google’s report has a separate section that breaks down, in percentages, the top three products and services cited in government requests during the selected reporting period (currently YouTube, Web Search, and Blogger).

Wikimedia Foundation: In its latest transparency report, which covers the first half of 2018, the Wikimedia Foundation outlines which specific Wikimedia projects were targeted by requests for content alteration and takedown. Examples of Wikimedia projects that have been targeted by such requests are English Wikipedia, Wikimedia Commons, German Wikipedia and Wikispecies.

Ideally, this best practice would be implemented by a company across all of the types of takedowns it reports on. Some companies do this, but some do not. Facebook, for example, only breaks down its reporting by product in its copyright and trademark requests report. These reports provide separate data for both the Facebook and Instagram platforms. Facebook does not, however, provide similar separately reported data for Instagram in its reports on other issue areas such as government and other legal content demands, nor does it provide any specific data about content takedowns or blocking on other products and services such as WhatsApp and Facebook Messenger.

This best practice is generally not applicable to telecommunications companies that only offer connectivity services, as they typically do not have more than one product or service. However, it is applicable to telecommunications companies that offer multiple services, such as email.

In the context of Right to be Forgotten delisting requests, satisfying this best practice means reporting on the categories of websites that the delisting requests target. This is because Right to be Forgotten delisting requests currently target web search products only, and so a breakdown by product for this category of demands is not applicable.

Google: In its latest transparency report on search removals under European privacy law, Google reports on the “categories of websites hosting content requested for delisting”. The categories reported on include news, directory and social media sites.

7. Reporting on specific government agencies/parties that submitted demands

By reporting on which specific government agencies or entities submit content-related demands, a company can highlight which elements of government in which countries are the most active in seeking to police online content. This can in turn help identify misuse or overuse of authority, actions outside a particular government body’s jurisdiction, and overall trends in what content which parts of government are targeting. Some examples of companies that exhibit this best practice are:

Daum Kakao: In the last Daum Kakao report that we surveyed, which covers the first half of 2018, Daum Kakao reports on the number of requests it received from specific government agencies. In this reporting, Daum Kakao breaks down requests received from South Korean government agencies such as the Korea Communications Standards Commission, the Ministry of Food and Drug Safety, the Korea Internet and Security Agency, the National Police Agency and the Military Manpower Administration. These requests are also broken down by requests received by Daum and requests received by Kakao (Daum Kakao is the company formed by the merger of Daum Communications and Kakao).

Google: In its latest transparency report on government requests to remove content, which covers the second half of 2017, Google enables users to filter the number of requests received and the number of items specified within these requests based on the government branch that submitted the requests. In most countries, Google divides up requests based on whether they were received from the judicial or executive branches.

Reporting specific information about the private parties that submitted content demands may raise privacy and safety issues that government requests do not, and so we are not urging that companies publish the identity of every private party that makes a content request of any kind. However, it is worth highlighting that we have seen exceptional transparency from a few companies in regard to those who submit copyright requests:

Google: In its latest transparency report on removals of URLs from its search results due to copyright, Google highlights the identities of the six copyright-holding companies and the six reporting organizations that submitted the most requests, and goes on to list the more than 15,000 other copyright holders and the more than 13,000 reporting organizations that have submitted DMCA takedown requests.

Twitter: In its latest transparency report on copyright notices, which covers the second half of 2017, Twitter highlights the top five copyright reporters on the platform within each reporting period. This reporting includes information on the number of copyright takedown notices they have filed, the percentage of total takedown notices their notices comprise and the number of materials that were withheld as a result of their copyright requests. Notably, the five top requesters during the last reporting period were responsible for over half of the requests.

8. Specifying which laws pertain to specific demands

Because most major internet and telecommunications companies operate in multiple countries, it is important to understand which laws and legal frameworks govern user speech and communications. Some examples of companies that do a good job in their transparency reports of explaining which laws are responsible for which types of takedowns include:

Daum Kakao: In the last Daum Kakao report that we surveyed, which covers the first half of 2018, Daum Kakao provides an overview of the different types of requests users and governments can submit. These include requests for content removal or moderation based on copyright infringement, trademark and portrait rights infringement, the leaking of personal information, defamation, and restrictions on obscene materials and materials that purport to sell drugs and medical supplies. For each of these reasons, Daum Kakao provides the applicable law that governs these requests, including the Act on Promotion of Information and Communications Network Utilization and Information Protection, etc. and the Pharmaceutical Affairs Act.

Telefonica: In its latest transparency report, which covers the year 2016, Telefonica specifies the relevant legal framework in each country for each category of demands (if there is one). For example, in El Salvador, Telefonica specifies that geographical or temporary suspensions of service can be requested under the Special Law Against the Crime of Extortion (Art. 13 and 14). Similarly, in Spain, Telefonica specifies that the blocking and filtering of certain contents is possible under three legal frameworks: Royal Decree 1889/2011 of 30 December, regulating the functioning of the Intellectual Property Commission (articles 22 and 23); the Revised Text of the Intellectual Property Law, approved by Royal Legislative Decree 1/1996 of 12 April (article 138); and Law 34/2002 of 11 July on services of the information society and electronic commerce (article 8).

Trade Me: Government and other legal content demands on Trade Me center around New Zealand’s Harmful Digital Communications Act. In its 2017 transparency report, Trade Me explains the Act and how it works, and reports on the number of relevant demands received under it.

9. Reporting on the number of accounts and items specified in demands

In addition to reporting on the number of notices or demands a company has received for each relevant content category (highlighted in best practice #2), companies should also report on the number of accounts and items specified in demands, as this enables a better understanding of the full breadth of those demands. By reporting on the number of users/accounts and items/pieces of content that are specified in content demands, companies can highlight the scope and extent of requests and how many users and items within their networks these requests could potentially impact, and allow comparison with the number of users or pieces of content actually impacted (best practice #10).

Currently, companies typically report on either the number of accounts specified in demands or the number of items specified in demands. Examples of these companies are offered below. At this point we believe the best practice should be to publish data about both specified accounts and specified content. However, because no companies currently do so, we gave credit in our charts to companies reporting on one or the other.

Dropbox: In its latest transparency report on government removal requests, which covers the second half of 2017, Dropbox highlights the number of accounts specified in requests it received in each relevant country. For example, Dropbox reports that in the Netherlands, 88 accounts were targeted by 45 requests.

Microsoft: In its latest transparency reporting on Right to Be Forgotten delisting requests, which covers the second half of 2017, Microsoft specifies the total number of URLs that were requested for delisting in each relevant country. For example, Microsoft reports that in Croatia, 16 requests specified 20 URLs for delisting.

Twitter: In its latest transparency report on legal requests for content removal, which covers the second half of 2017, Twitter discloses the number of accounts specified by requests received in relevant countries. For example, in India, the company received 2 court order removal requests and 142 government removal requests (called “other legal demands” in the report). These requests specified 800 accounts.

10. Reporting on the number of accounts and items impacted by demands

In addition to reporting on the number of accounts and items specified in demands (highlighted in best practice #9), companies should also report on the number of accounts and items impacted by those demands. Such reporting offers the most direct measure of how many speakers and how much free expression is being silenced as a result of such demands (and how many/how much is being effectively defended by the company). It also enables a comparison of the requested impact versus the actual impact, which in turn offers a greater understanding of both the quality and legality of the requests being made and the company’s rates of compliance with those requests.

Currently, and similarly to best practice #9, companies typically report on either the number of accounts impacted by demands or the number of items impacted by demands. Although companies received credit for this best practice in the charts if they did either, the best practice would be to publish data on both, and here—unlike with best practice #9—there are companies that actually satisfy this best practice. Some examples of companies that report on both the number of accounts and the number of items of content impacted include:

Tumblr: In its latest transparency report on copyright and trademark requests, which covers the second half of 2017, Tumblr separately reports on the number of accounts, Tumblr posts, and individual pieces of content affected by DMCA notices on a monthly basis. For trademark complaints, also on a monthly basis, Tumblr specifies the number of URLs affected by requests, the percentage of blog content that was removed compared to the amount of blog content that was complained about, and the percentage of URLs that were found to be misleading and were required to be changed, compared to the number of URLs complained of.

Twitter: In its latest transparency report on legal requests for content removal, which covers the second half of 2017, Twitter specifies the number of accounts and tweets “withheld” or restricted on a country-by-country basis as a result of legal requests.

11. Reporting on how the company responded to demands

Reporting on how a company responds to requests across different issue areas is vital for understanding how companies comply with legal frameworks, government demands and user requests. In addition, reporting on how a company responds to requests also highlights the role companies play in protecting or censoring speech. Currently, companies demonstrate this best practice in a variety of ways, including reporting on the number or percentage of requests complied with, or removal percentages associated with demands for each category of content. Companies that disclosed this data received credit for satisfying this best practice in the charts. However, the preferred method of satisfying this best practice is for companies to clearly break down responses to demands using categories that highlight the range of possible responses a company can employ when it receives a request. These categories could highlight, for example, the number or percentage of requests complied with in whole, complied with in part, or rejected. This method of reporting goes beyond simply reporting on the number of impacted users or pieces of content as specified in best practice #10, or stating a compliance rate that does not adequately summarize how a company could have responded to demands. Some examples of companies that exhibit this best practice in their transparency reports are given below, followed by a brief sketch of what such a breakdown can look like.

Automattic: In its latest transparency report on copyright requests, which covers the first half of 2018, Automattic breaks down its responses to requests using the following categories: “Percentage of notices where some or all content was removed”, “Percentage of notices rejected as incomplete”, and “Percentage of notices rejected as abusive”.

Dropbox: In its latest transparency report on government removal requests, which covers the second half of 2017, Dropbox sorts its responses to requests into three categories: content blocked in response to a request, content blocked pursuant to its Acceptable Use Policy, and no action taken.

Kickstarter: In its latest transparency report on copyright and trademark requests, which covers the year 2015, Kickstarter reports on the number of claims it rejected, as well as the possible reasons behind such rejections. For example, in its reporting on copyright claims, Kickstarter states “we reject claims when they are incomplete, when they involve material that can’t be protected under copyright, or when they target fair use.” In addition to reporting on the number of projects that were “hidden” (taken down) as a result of DMCA or trademark claims, Kickstarter also reports on the number of projects that were targeted by copyright and trademark requests but that it was able to avoid hiding, whether by “helping creators make project modifications to address a copyright claim” or “by encouraging claimants to resolve the dispute directly with the project creator or by helping the creators make a modification to their projects to address the claim”. For copyright and trademark claims, Kickstarter also reports on the number of projects that have been returned to public view and that remain hidden.
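
To make the preferred category-based breakdown concrete, below is a minimal, hypothetical sketch in Python of how raw response records might be tallied into “complied in whole”, “complied in part”, and “rejected” counts and percentages. The record structure and figures are our own illustrative assumptions, not any company’s actual reporting pipeline.

    from collections import Counter

    # Hypothetical records: one entry per demand, noting how it was resolved.
    demands = [
        {"id": 1, "response": "complied in whole"},
        {"id": 2, "response": "complied in part"},
        {"id": 3, "response": "rejected"},
        {"id": 4, "response": "complied in whole"},
    ]

    counts = Counter(d["response"] for d in demands)
    total = len(demands)

    # Publish specific numbers alongside percentages, per best practice #3.
    for category in ("complied in whole", "complied in part", "rejected"):
        n = counts[category]
        print(f"{category}: {n} ({n / total:.0%})")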

12. Additional best practices to consider

In addition to the 11 general best practices for content takedown reporting outlined above, companies should also aim to follow these additional best practices, which focus on enhancing a report’s usability and supplementing its quantitative data with qualitative data and additional context:

Defining terms clearly: By clearly defining terms such as “content takedowns”, “content restrictions”, “network shutdowns” and “Community Guidelines-based takedowns”, a company can clarify what specific data points are being reported on in its transparency reports, and how reported data points differ from one another. Clearly defined terms also make cross-company comparisons easier. Companies that demonstrate this best practice often do so in the form of a glossary. Some examples of companies that exhibit this best practice in their transparency reports are:

Dropbox: In its latest transparency report, which covers the second half of 2017, Dropbox provides a definition and overview of the scope of information reported on for each request type. For example, regarding government removal requests, Dropbox highlights that its reporting on this area includes “court orders and written requests from law enforcement and government agencies seeking the removal of content based on the local laws of their respective jurisdictions”.

Telefonica: In its latest transparency report, which covers the year 2016, Telefonica has a specific glossary-like section dedicated to defining the different types of indicators and categories of requests they report on. For example, Telefonica defines “geographical and temporary suspensions of the service” as “A request from the competent authorities to temporarily or geographically limit the provision of a service. These requests are usually connected with circumstances involving situations of force majeure, such as natural catastrophes, acts of terrorism, etc.”

Telenor: In its March 2017 Authority Requests Disclosure Report, Telenor has a specific glossary-like section dedicated to defining the different types of authority requests they receive and engage with. In this section, it clearly defines “network shutdowns” as “requiring shut down of the operators network in part or in full” and “content restrictions” as “impos[ing] restriction on electronic content distributed through its network, such as blocking of URLs.”

Providing meaningful explanations of internal policies: By explaining the internal policies and guidelines companies use to process and act upon requests in a meaningful manner, a company can clearly communicate to users what types of requests they process, what factors are considered when processing requests, and what types of content requests fall out of scope. Some examples of companies that exhibit this best practice in their transparency reports are:

Daum Kakao: In the last Daum Kakao report that we surveyed, which covers the first half of 2018, Daum Kakao uses a detailed flow-chart style infographic to explain to users the process it uses for “handling rights of infringement reports”.

Facebook: In the first half of 2018, Facebook published a comprehensive overview of its Community Standards, which delineate the kinds of content that it removes and the kinds that it does not. For example, under the section “Violence and Criminal Behavior”, Facebook specifies that it will “remove content, disable accounts, and work with law enforcement” when it assesses a “genuine risk of physical harm or direct threats to public safety.”

Google: In its latest YouTube Community Guidelines enforcement report, which covers the second quarter of 2018, Google has a video called “The Life of a Flag” that explains how flagged content is assessed and, if needed, moderated.

Tumblr: In its latest transparency report on copyright and trademark requests, which covers the second half of 2017, Tumblr uses a detailed flow-chart style infographic to explain to users how they “handle copyright infringement notifications under the Digital Millennium Copyright Act (DMCA)”.

Vodafone: Although Vodafone does not publish quantitative information on content demands, and is therefore not otherwise included in this report, it does provide a significant amount of qualitative information on how it responds to legal and government demands for content blocking and network shutdowns, and on the legal frameworks in each country where it operates that enable such demands. This includes outlining a set of “Vodafone Freedom of Expression Principles” which delineate what Vodafone does in response to requests, what it does not do and what it believes governments should do.

Offering case studies to illustrate the company's practices and the issues it faces: By publishing case study examples, a company can provide greater qualitative information on how it has processed past cases. This builds greater understanding of its policies and processes, as well as of the types of cases it typically receives and how it responds to them. Some examples of companies that exhibit this best practice in their transparency reports are:

Google: In its traffic and disruptions transparency report, Google provides links to external news sources that have reported on traffic and disruption events the company has experienced. In addition, in its reporting on Right to be Forgotten delisting requests (under “Search removals under European privacy law”), Google provides examples of requests it has received. Each example provides details on the country in which the request was received, the details of the request, and details on how Google responded.

Millicom: In its latest Law Enforcement Disclosure Report, which covers the year 2017, Millicom categorizes shutdown and disruption events as “major events”. For some of these cases, Millicom provides a contextual case study. For example, they explain that since 2014 there has been an “ongoing shutdown of services in prisons in Central America” and provide relevant contextual information.

Reporting on specific notices where reasonable and permitted by law: When possible and legal, and consistent with privacy and safety considerations, companies should report on the specific legal demands and content removal requests they have received. This enables greater understanding of the prevalence and scope of content takedowns online. One way companies can do this is by reporting specific requests to the Lumen Database, which is a project of the Berkman Klein Center for Internet & Society at Harvard University. Google, for example, does this when it receives and complies with a DMCA notice to remove content from its web search product. Rather than seeing the copyrighted content, a user is directed to the Lumen Database, where they can see the DMCA takedown notice. Similarly, GitHub hosts its own repository with the text of DMCA takedown notices it has received.

Providing meaningful numbers that reflect how many pieces of content or accounts were blocked or otherwise restricted based on automated flagging or review: Companies use automated tools to identify content that is illegal or that violates their Community Guidelines. Because such automated flags and even automated takedowns are prevalent, and raise unique due process and accuracy concerns, specific reporting on how often they are used—and how often they make mistakes—is desperately needed. Currently, Facebook, Google, and Etsy report on the use of automated tools in relation to their Community Guidelines-based takedowns. This best practice should spread not only to other companies but also to other categories of reporting, because automated tools are increasingly being used to seek out and block a variety of different types of illegal and infringing content, beyond content that merely violates Community Guidelines.

Linking relevant reports to one another: All of a company’s various transparency reports ideally should be available through a single convenient portal. Companies should have a central webpage, list or drop-down menu where all of their current and past transparency reports can be easily accessed. Parent companies that own subsidiaries or products that publish independent transparency reports should similarly collect all of their available transparency reports in one central location.

Publishing reports at static and functioning URLs: Transparency reports are useless if people can’t find them. Maintaining static and functioning URLs is especially important for older versions of a company’s transparency report. If a company is acquired or re-branded and the links to its transparency reports subsequently change, they should clearly note this and provide directions to updated links.

Publishing data in a structured data format: Companies should make all report data available in a CSV (comma-separated values) format, rather than, or in addition to, a flat PDF file. This format is most helpful to researchers, journalists, and others who want to make use of the report data, as it simplifies the data extraction process and makes reports more accessible.
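
As a purely illustrative sketch, report data of the kind described above could be exported with a few lines of Python’s standard csv module; the column names and figures below are our own assumptions (loosely echoing best practices #4, #9 and #10), not drawn from any company’s actual report.

    import csv

    # Hypothetical per-country figures: requests received, accounts specified
    # in those requests, and accounts actually impacted.
    rows = [
        {"country": "Netherlands", "requests": 45, "accounts_specified": 88, "accounts_impacted": 12},
        {"country": "India", "requests": 144, "accounts_specified": 800, "accounts_impacted": 100},
    ]

    with open("content-removal-requests.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=list(rows[0].keys()))
        writer.writeheader()  # a header row makes the file self-describing
        writer.writerows(rows)

A file like this can be loaded directly into a spreadsheet or analysis tool, with no scraping of a PDF required.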

Publishing reports using a non-restrictive Creative Commons license: Companies should use a non-restrictive Creative Commons license for their reports, such as the “ShareAlike” license, which “lets others remix, tweak, and build upon your work even for commercial purposes, as long as they credit you and license their new creations under the identical terms.” For more information on Creative Commons licensing, visit https://creativecommons.org/licenses/, and for information about choosing a Creative Commons license, visit https://creativecommons.org/choose/.

Offering a Frequently Asked Questions section: Although topics such as a company’s values, definitions of key terms and an explanation of why the company publishes a transparency report will likely be included in the narrative of the report, companies should also include a Frequently Asked Questions (FAQ) section. Such a section goes above and beyond the report narrative in informing readers about company practices and how content requests are handled.
