OTI and Coalition Press Tech Platforms for More Transparency and Accountability Around Online Content Moderation

Press Release
May 7, 2018

Today, New America’s Open Technology Institute, as part of a coalition of organizations, advocates, and academic experts who support the right to free expression online, published a joint statement outlining the minimum standards the coalition believes tech platforms must meet in order to provide adequate transparency and accountability when they take down user-generated content or suspend accounts that violate their rules. The release of these new Santa Clara Principles on Transparency and Accountability Around Online Content Moderation coincides with the second Content Moderation and Removal at Scale conference, held today in Washington, DC, as a follow-up to the first such conference held in Santa Clara, CA, in February.

The Santa Clara Principles offer guidance to internet platforms on how to provide users with meaningful due process when their posts are taken down or their accounts are suspended, and on how to ensure that the enforcement of company content guidelines is fair, unbiased, and respectful of users’ free expression rights. The three principles urge companies to:

publish the numbers of posts removed and accounts permanently or temporarily suspended due to violations of their content guidelines;

provide clear notice to all users about what types of content are prohibited, and clear notice to each affected user about the reason for the removal of their content or the suspension of their account; and

enable users to engage in a meaningful and timely appeals process for any content removals or account suspensions.

The principles were created by a small group of organizations and advocates seeking to provide a basis for continued dialogue about best practices in content moderation, including at events like today’s conference. They build on positive steps toward greater transparency and accountability taken in just the past two weeks: Google’s YouTube published the industry’s first transparency report detailing content takedowns based on violations of its content rules, and Facebook expanded its takedown appeals process and published its detailed internal content guidelines to better explain how the company decides which posts to take down.

The following statement can be attributed to Kevin Bankston, Director of New America’s Open Technology Institute:

“As internet companies are under increasing pressure to more aggressively police the content on their platforms, and especially as companies are increasingly relying on AI and other automated tools to make content decisions, the need for more transparency and accountability around their content takedowns has also increased. The steps taken in recent weeks by companies like Facebook and Google to be more transparent about how much content they take down, and why, are a good start, but much more remains to be done. These companies are becoming the de facto arbiters of what content is and is not allowed on the internet, a dangerous power and an awesome responsibility that requires meaningful checks and balances. At the very least, as outlined in these new principles, users deserve to know exactly when, why, how, and how much of their content is taken down, and to have an opportunity to appeal those decisions.”
