Reddit is a popular social media platform with approximately 330 million monthly active users around the world.1 Reddit is distinct from other major platforms in that it does not have a comprehensive top-down content moderation strategy. Rather, the platform operates using a high-level set of content guidelines that are enforced by a team of employee moderators (known as admins), alongside subreddit-specific content policies that are created and enforced by users who act as moderators of individual subreddits (known as mods).2 This localized approach to content moderation has permitted a number of niche communities and groups to flourish on the platform.3 However, this structure has also created conditions that can enable misinformation and disinformation to spread easily across the service.
In response, the company has begun promoting a number of resources containing authoritative information related to COVID-19, stating that unless a subreddit is focused on spreading misleading content, admins will prioritize educating and cooperating with its users. If these efforts fail, the platform will then restrict the subreddit through a process known as “quarantining.”4 When a community is quarantined, it does not appear in search results. Additionally, users who try to visit the quarantined community are notified that the subreddit may contain misleading content and must explicitly opt in to viewing it.5
In addition to these efforts, Reddit has announced that its site integrity team is investigating claims and evidence of coordinated attempts to spread misleading COVID-19 information across the platform. The company has stated that these efforts include detection experiments, which are being conducted in conjunction with other companies such as Microsoft and Google.6 Further, the company has been organizing “Ask Me Anything” (AMA) series in which users can ask scientific and medical experts, as well as public officials, questions about the virus, thereby enabling users to access verified, real-time information.7 The company is also using banners to highlight content that has been verified and deemed legitimate on the Reddit homepage and in search results.8
Reddit has also stated that it is working to equip both admins and mods with the necessary resources and guidance to remove misinformation. In a Reddit admin post on safety in late April, the company shared that it is striving to rapidly moderate content that contains claims encouraging violence (e.g. calls to vandalize phone towers or attack individuals of a specific nationality) or physical harm (e.g. suggesting that drinking bleach helps prevent or cure the virus).9 To this end, the company has compiled a set of resources outlining authoritative and verified information on COVID-19 for mods who are reviewing content for COVID-19-related misinformation. In its April post, Reddit also outlined that mods can use the AutoModerator tool (known as AutoMod) to identify and remove obvious forms of misinformation in their subreddits.10 AutoMod is a built-in, customizable bot that provides mods with basic algorithmic tools to proactively identify, filter, and remove objectionable content. AutoMod operates based on mod-chosen parameters, such as keywords, website links, or specific users that are not permitted in a particular subreddit.11 Mods who identify cases of misinformation that are spreading across the platform, or an account that is behaving suspiciously, can also report these instances to the platform.12 Reddit has stated that it will shortly give all users the option to report such content.13 In this way, Reddit presents an interesting case study for fact-checking and review of misleading content during the pandemic: rather than taking on these roles and responsibilities entirely, or partnering extensively with independent third-party groups, it distributes and localizes these tasks among a certain group of users.
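In practice, mods configure AutoMod through YAML rules on a subreddit's wiki; the rule contents are entirely mod-chosen. As a rough illustration of the kind of parameter-based matching described above, the following Python sketch (with hypothetical keywords, domains, usernames, and post data, not drawn from any actual subreddit's configuration) shows how such a filter might flag a post:

```python
# Hypothetical mod-chosen parameters, in the spirit of an AutoMod rule:
# keywords, link domains, and users not permitted in a particular subreddit.
BLOCKED_KEYWORDS = ["drinking bleach", "5g causes covid"]
BLOCKED_DOMAINS = ["fake-cures.example"]
BLOCKED_USERS = ["spam_account_123"]

def should_remove(post: dict) -> bool:
    """Return True if a post matches any mod-chosen removal parameter."""
    text = (post.get("title", "") + " " + post.get("body", "")).lower()
    # Keyword match against the post's title and body
    if any(kw in text for kw in BLOCKED_KEYWORDS):
        return True
    # Link match against blocked website domains
    if any(dom in link for link in post.get("links", []) for dom in BLOCKED_DOMAINS):
        return True
    # Author match against blocked users
    if post.get("author") in BLOCKED_USERS:
        return True
    return False

# A post repeating a harmful claim matches a blocked keyword and is flagged.
post = {"title": "Tip", "body": "Drinking bleach prevents the virus",
        "author": "some_user", "links": []}
print(should_remove(post))  # True
```

The sketch captures only the simplest case; real AutoMod rules also support regular expressions, modifiers, and actions other than removal (such as filtering to the mod queue for human review).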
Currently, Reddit’s transparency report does not outline how much content is being removed by admins or mods under the platform’s misinformation policies. During the pandemic, the company should provide periodic updates on its content moderation and advertising policy enforcement efforts, particularly those related to misinformation. Following the pandemic, the company should publish a COVID-19-specific transparency report that outlines the scope and scale of content moderation efforts by both admins and mods, as well as ad policy enforcement efforts by the company. This report should also include granular information on the number and types of quarantined communities. Further, Reddit should expand its general transparency reporting to include data on removals of misleading content by admins and mods.
Citations
1. Lauren Feiner, "Reddit Users Are the Least Valuable of Any Social Network," CNBC, February 11, 2019, source
2. Singh, Everything in Moderation.
3. Singh, Everything in Moderation.
4. Reddit, "Misinformation and COVID-19: What Reddit is Doing," r/ModSupport, last modified April 16, 2020, source
5. Reddit, "Misinformation and COVID-19," r/ModSupport.
6. Reddit, "Misinformation and COVID-19," r/ModSupport.
7. Reddit, "Misinformation and COVID-19," r/ModSupport.
8. Reddit, "Misinformation and COVID-19," r/ModSupport.
9. Reddit, "Misinformation and COVID-19," r/ModSupport.
10. Reddit, "Misinformation and COVID-19," r/ModSupport.
11. Singh, Everything in Moderation.
12. Reddit, "Misinformation and COVID-19," r/ModSupport.
13. Reddit, "Misinformation and COVID-19," r/ModSupport.