The Santa Clara Principles 2.0

Policy Paper
Dec. 8, 2021

In 2018, alongside the Content Moderation at Scale conferences in the United States, a group of human rights organizations, advocates, and academic experts developed and launched a set of three principles for how best to obtain meaningful transparency and accountability around Internet platforms’ increasingly aggressive moderation of user-generated content. These principles, named after the group's initial meeting place in Santa Clara, CA, represent recommendations for initial steps that companies engaged in content moderation should take to provide meaningful due process to impacted speakers and better ensure that the enforcement of their content guidelines is fair, unbiased, proportional, and respectful of users’ rights. This was the first iteration of the Santa Clara Principles.

Since 2018, twelve major companies—including Apple, Facebook (Meta), Google, Reddit, Twitter, and GitHub—have endorsed the Santa Clara Principles. The overall number of companies providing transparency and procedural safeguards has increased, as has the level of transparency and procedural safeguards that many of the largest companies provide.

At the same time, the role these companies play in society continues to grow in importance, giving them an ever-greater responsibility to provide sufficient transparency around the decisions they make in order to enable accountability.

For these reasons, a broad coalition of organizations, advocates, and academic experts worked together in 2020 and 2021 to develop this second iteration of the Santa Clara Principles. The Principles were developed following a broad consultation exercise involving more than 50 organizations and individuals, and a thorough process of drafting and review. By drawing on experience and expertise from all parts of the world, this second iteration of the Santa Clara Principles better reflects the expectations and needs of the global community.

This second iteration of the Santa Clara Principles is divided into Foundational and Operational Principles. Foundational Principles are overarching and cross-cutting principles that should be taken into account by all companies, of whatever business model, age, and size, when engaging in content moderation. Each Foundational Principle is set out alongside guidance on how to implement it. The Operational Principles set out more granular expectations for the largest or most mature companies with respect to specific stages and aspects of the content moderation process. Smaller, newer, and less resourced companies may also wish to use the Operational Principles for guidance and to inform future compliance. In contrast to the minimum standards set out in the first iteration, this second iteration provides greater specificity regarding precisely what information is needed to ensure meaningful transparency and accountability.

This second iteration of the Santa Clara Principles expands the scope of where transparency is required with respect to what is considered “content” and “action” taken by a company. The term “content” refers to all user-generated content, paid or unpaid, on a service, including advertising. The terms “action” and “actioned” refer to any form of enforcement action taken by a company with respect to a user’s content or account due to non-compliance with its rules and policies, including (but not limited to) the removal of content, the algorithmic downranking of content, and the suspension (whether temporary or permanent) of accounts.

This second iteration of the Santa Clara Principles has been developed to help companies meet their responsibility to respect human rights and enhance their accountability, and to assist human rights advocates in their work. The Principles are not designed to provide a template for regulation.

Authors

Access Now
ACLU Foundation of Northern California
ARTICLE 19
Brennan Center for Justice
Center for Democracy & Technology
Electronic Frontier Foundation
Global Partners Digital
InternetLab
National Coalition Against Censorship
New America’s Open Technology Institute
Ranking Digital Rights
Red en Defensa de los Derechos Digitales
WITNESS


Editorial disclosure: This report discusses policies by Apple, Facebook, Google (including YouTube), and Twitter, all of which are funders of work at New America but did not contribute funds directly to the research or writing of this piece. View our full list of donors at www.newamerica.org/our-funding.
