One Year After the Release of the Santa Clara Principles, OTI Continues to Push Tech Companies for Transparency and Accountability Around Content Moderation Practices

Press Release
May 7, 2019

One year ago, New America’s Open Technology Institute, as part of a coalition of organizations, advocates, and academic experts who support the right to free expression online, released the Santa Clara Principles on Transparency and Accountability Around Online Content Moderation. The Santa Clara Principles outline minimum standards tech platforms must meet in order to provide adequate transparency and accountability around their efforts to take down user-generated content or suspend accounts that violate their rules. The principles advocate for greater transparency and accountability by focusing on three key demands: comprehensive numbers detailing platforms’ content moderation activities, clear notice to affected users, and a robust appeals process.

On the first anniversary of the release of the Santa Clara Principles, we conducted an assessment of how three of the largest internet platforms—YouTube, Facebook, and Twitter—have implemented the recommendations outlined in the Principles. Our findings indicate that although all three platforms have demonstrated progress in implementing the recommendations related to “notice” and “appeals,” they still fall woefully short when it comes to implementing the recommendations put forth for the “numbers” category.

Although the three largest platforms have each now issued transparency reports focused on their content moderation practices, there is still a significant need for improvement. For example, YouTube’s report fails to provide adequate transparency around the role automated tools play in content takedowns. Facebook’s report lacks basic information, such as a single combined number indicating how many pieces of content were removed for violating the platform’s Community Standards. Twitter’s report provides information on the number of accounts that were flagged and subsequently acted upon, but not on the amount of content that was removed.

In the coming year, we hope these platforms will demonstrate a greater commitment to implementing the recommendations set forth in the Santa Clara Principles and the best practices for transparency reporting outlined in our Transparency Reporting Toolkit focused on content takedown reporting. This will not only enable the public to hold these companies accountable for their management of online speech and expression, but it will also encourage other industry players to adopt the principles and provide transparency around their content takedown practices, thus solidifying the principles as industry-wide best practices.

The following statement can be attributed to Spandana Singh, Policy Program Associate at New America’s Open Technology Institute:

“Internet platforms such as YouTube, Facebook, and Twitter wield an enormous amount of power over what speech is permissible online, and they are facing growing pressure from across the globe to exercise this power and regulate more content. Although these platforms have taken promising first steps towards providing transparency around their content takedown efforts, there is still much more work to be done. In order to hold these platforms accountable for how they exercise their power, they must provide meaningful transparency that allows users and the public to fully comprehend how, why, and to what extent their speech is being impacted, and how they can appeal these decisions.”
