Spandana Singh
Policy Analyst, Open Technology Institute
Last Thursday, Facebook released the latest edition of its Community Standards Enforcement Report, which provides data and insights on Facebook’s removal of user content for violations of its Community Standards. The latest version of the report demonstrates some progress toward increasing transparency and accountability around Facebook’s content takedown operations, particularly related to its appeals function. However, the report also shows that there is still room for improvement: the platform is still not reporting some key metrics that are essential for understanding the scale and scope of its efforts to manage user expression.
As with previous editions, this edition of the Community Standards Enforcement Report covers basic metrics within a range of content categories, such as how prevalent Community Standards violations were on the platform, how much content Facebook took action on, and how much violating content Facebook found before users reported it. It also introduces three new features. The first provides data for these metrics related to violations of the platform’s regulated goods policy (covering items such as drugs and firearms). We welcome this gradual expansion of reporting to additional categories of violations, although we hope to see many more content categories covered in future reports. The other two new features are metrics related to the platform’s appeals process: the first outlines how much of the content Facebook removed was appealed by users, and the second highlights how much content was restored, both in response to appeals and for other reasons.
Given that internet platforms are increasingly using automated tools to review content at scale, these metrics enable users and the public to assess how accurate these tools are and how they can negatively impact speech. The release of these metrics is also a positive step toward greater accountability around the platform’s content takedown decisions, as it effectively shows how often Facebook made mistakes when reviewing content. According to the report, there were relatively few appeals (ranging from the thousands to the hundreds of thousands) of content takedowns across most content categories. The only exceptions were takedowns under the adult nudity and sexual activity, spam, and hate speech policies, each of which drew appeals in the millions. (Facebook did not provide appeals data for content removed under its fake accounts policy; the platform does not permit fake accounts, and it does not allow appeals in this category.)
In addition, the report indicates that in many cases Facebook is reviewing and restoring content even where there has been no appeal. In the bullying and harassment, hate speech, and regulated goods categories, most restorations came in response to appeals. In five other categories, however, Facebook restored more content because its reviewers recognized that content had been removed erroneously, or because circumstances changed, than it did in response to appeals.
Although the appeals function may not have been widely used across all content policies, we see a high volume of appeals in controversial categories such as hate speech, and a correspondingly high volume of content restored as a result of appeals in sensitive categories such as bullying and harassment. This indicates that the appeals function is a useful mechanism for providing accountability around content takedown practices in certain content categories.
As highlighted in our recently published assessment of how YouTube, Facebook, and Twitter are stacking up against the Santa Clara Principles on Transparency and Accountability in Content Moderation, all three platforms have an established appeals process for users seeking further review of content takedown decisions. However, Facebook is the first of these platforms to issue concrete data about its appeals process and how it has impacted user expression. Nonetheless, Facebook needs to do more in order to fully align with the recommendations outlined in the Santa Clara Principles.
The release of this data is also particularly significant given that Facebook is currently working to establish an Oversight Board for Content Decisions that would serve as an independent reviewing body for particular appealed content cases. If Facebook plans to rely on such an oversight body, it must take steps to ensure that this body is sufficiently independent and can play a meaningful role in setting policy as well as in reviewing content cases. OTI has participated in both online and offline consultations that Facebook has held about the design and function of the Board (you can read OTI’s consultation comments on the Board here). We strongly recommend that Facebook make transparency and accountability focal points when designing the Board, and that, once the Board is operational, Facebook publish similar data points about the Board’s appeals-related decisions in future transparency reports.
Although this latest edition of Facebook’s report makes positive strides toward greater transparency and accountability around its appeals function, a number of core data points related to the scope and impact of content takedowns are still missing from its reporting.
We’ve highlighted the importance of these metrics for providing meaningful transparency around content takedown efforts in the Santa Clara Principles, which we drafted with a coalition of organizations, advocates, and academics who support the right to free expression, as well as in our Transparency Reporting Toolkit focused on Content Takedowns, which we released last fall. In addition, Ranking Digital Rights, an affiliate program at New America, has been vocal in its calls for greater transparency around online content takedowns. In its recently released 2019 Corporate Accountability Index, which ranks 24 of the world’s most powerful telecommunications, internet, and mobile companies on their commitments and policies affecting users’ freedom of expression and privacy, it emphasizes that although Facebook has made some improvements to transparency around its content takedown efforts, more must be done, particularly to improve its appeals mechanism.
This latest report demonstrates some progress toward implementing the recommendations outlined in these resources around appeals, and we hope similar strides will be made to strengthen transparency and accountability around content takedowns at large.