YouTube
YouTube, one of Google’s subsidiaries, is the most popular video platform on the market, with approximately 2 billion users worldwide.1 Given its reach, the website has become a major provider of health information. In an interview, Chief Product Officer Neal Mohan said YouTube’s response to the spread of COVID-19 misinformation on its platform has been focused on a twofold approach: “making authoritative information more prominent and aggressively removing policy-violating content.”2
While the video service reports that it has been working quickly to remove misleading videos, one watchdog organization, the Tech Transparency Project, found instances of YouTube profiting from videos pushing unproven treatments for the coronavirus.3 Specifically, the platform was running advertisements alongside videos pushing herbs, meditative music, and potentially unsafe over-the-counter supplements as cures for the coronavirus.4 Yet, around this same time, analysis from other researchers showed that, among a sample of 320 videos related to the pandemic, four-fifths of the channels sharing coronavirus news and information were maintained by professional news outlets, and that search results for popular coronavirus-related terms returned mostly factual and neutral videos.5 Since the publication of both these studies, YouTube has taken other actions to address misleading information.
YouTube has taken a number of proactive steps to steer users toward verified sources and to counter misinformation; however, some of these efforts may have negative consequences for content creators. First, the company established clear guidelines and restrictions for demonetizing content related to COVID-19, including content that misinforms users about health matters related to the virus.6 Additionally, the site directs users from YouTube’s homepage to the WHO or other locally relevant authoritative organizations when they search for terms related to COVID-19 on the site.7 Further, the company has committed to donating ad inventory to governments and NGOs in affected regions to use for education and information.8
In addition, in late April, YouTube announced that it would expand the use of its algorithmically recommended information panels to connect users to authoritative information when they search for COVID-19-related queries. Information panels, originally introduced in 2018, provide users with contextual information from third-party fact-checked articles. Until this expansion, these panels were primarily used to surface contextual and authoritative information related to longstanding misinformation narratives, such as “flat earth” theories.9
Information panels have also been employed by YouTube to provide the user with topical context. As mentioned above, when a user engages with videos or search results related to COVID-19, the information panels will connect them to information from the WHO, CDC, or local health authorities. Additionally, in order to tackle misinformation that spreads as part of the rapidly moving news cycle, when a user enters a query related to a specific claim for which the platform has a relevant third-party fact-checked article, YouTube may display an information panel at the top of the search results that includes the fact-checked article’s title, a link to the article, and the publisher’s name. If more than one relevant fact-checked article exists, YouTube will show a carousel that allows users to scroll through the available articles.10 In addition to this rollout, YouTube announced that it will provide $1 million through the Google News Initiative to the International Fact-Checking Network to bolster fact-checking and verification efforts across the world.11
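The panel-selection behavior described above can be illustrated with a minimal sketch. All names and data structures here are hypothetical (this is not YouTube’s actual code, and a naive keyword match stands in for the platform’s claim-matching system); it simply shows the decision logic: no match yields no panel, one relevant fact-checked article yields a single panel, and multiple matches yield a scrollable carousel.

```python
# Hypothetical sketch of fact-check information panel selection.
from dataclasses import dataclass


@dataclass
class FactCheck:
    title: str      # fact-checked article title
    url: str        # link to the article
    publisher: str  # publisher's name


def build_panel(query: str, fact_checks: list[FactCheck]) -> dict:
    """Decide what appears at the top of the search results for a query."""
    # Naive substring match stands in for real claim-matching.
    matches = [fc for fc in fact_checks if query.lower() in fc.title.lower()]
    if not matches:
        return {"type": "none"}
    if len(matches) == 1:
        fc = matches[0]
        return {"type": "panel", "title": fc.title,
                "url": fc.url, "publisher": fc.publisher}
    # More than one relevant article: show a carousel the user can scroll.
    return {"type": "carousel", "items": [vars(fc) for fc in matches]}
```

The key design point the sketch captures is that the panel is query-driven rather than video-driven: it is attached to search results for a claim, not to any individual upload.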
Like Facebook and Twitter, YouTube has seen its content review capacity significantly decrease during the pandemic, and the company is increasingly relying on automated tools for content review and moderation. Therefore, while these efforts to combat misinformation should yield positive results, YouTube has warned that the service’s reliance on automated tools may lead to an increase in erroneous removals of videos that appear to be in violation of YouTube’s policies.12 Typically, YouTube utilizes machine learning algorithms to flag potentially harmful content, which is then sent to human moderators for review.13 One major consequence of the shift to a mostly automated system is that content creators who feel that their content was mistakenly taken down or demonetized may face delays in the appeals process. Although the process itself has not changed, decreased human content review capacity means it will take longer to assess appeals.14 Despite the delays, however, it is important that the company is still maintaining an appeals review process, as this is a vital source of redress and remedy in content moderation.
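The moderation flow just described can be sketched as a simple routing decision. This is a hypothetical illustration (the function name, threshold, and outcome labels are invented for this sketch, not drawn from YouTube’s systems): a classifier score flags content, which normally goes to human review; with reviewer capacity reduced, flagged content may be removed automatically and left to the slower appeals queue.

```python
# Hypothetical sketch of automated flagging with reduced human review capacity.
def route_flagged_video(score: float, human_capacity_available: bool,
                        threshold: float = 0.8) -> str:
    """Decide what happens to a video scored by an automated classifier."""
    if score < threshold:
        return "no_action"            # below the flagging threshold
    if human_capacity_available:
        return "human_review"         # normal path: moderators confirm the flag
    # Pandemic path: removal without human confirmation, which raises the
    # rate of erroneous takedowns; creators can still appeal, but with
    # fewer reviewers the appeals queue moves more slowly.
    return "removed_pending_appeal"
```

The sketch makes the trade-off concrete: lowering human involvement does not change the flagging step, only what happens after it, which is why error correction shifts downstream to the appeals process.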
YouTube’s shift in content moderation operations will undoubtedly have a major impact on the amount of content that is removed. It is therefore important for YouTube to provide periodic updates on its content moderation efforts during the pandemic. In addition, following the pandemic, YouTube should create a comprehensive COVID-19 report that highlights the scope and scale of content moderation efforts during this time, and that provides data showing the amount of content that was removed as a result of automated detection as well as human flags. This reporting will help civil society organizations and researchers further understand the use of automated tools in moderating misleading content. In addition, YouTube should expand its general transparency reporting to include more granular data on the moderation of misleading content.
Citations
- Maryam Mohsin, “10 YouTube Stats Every Marketer Should Know in 2020,” Oberlo, November 11, 2019, source
- Ina Fried, “YouTube pulls coronavirus misinformation videos,” Axios, April 7, 2020, source
- Kari Paul, “YouTube profits from videos promoting unproven Covid-19 treatments,” The Guardian, April 3, 2020, source
- Paul, “YouTube profits from,” The Guardian.
- Nahema Marchal, Hubert Au, and Philip N. Howard, “Coronavirus News and Information on YouTube: A Content Analysis of Popular Search Terms,” The Computational Propaganda Project, April 20, 2020, source
- “Monetization update on COVID-19 content,” YouTube Help, Accessed May 16, 2020, source
- Pichai, “Coronavirus: How We're Helping,” Google.
- Pichai, “Coronavirus: How We're Helping,” Google.
- “Expanding fact checks on YouTube to the United States,” YouTube: Official Blog, April 28, 2020, source
- “Expanding fact checks,” YouTube: Official Blog.
- “Expanding fact checks,” YouTube: Official Blog.
- “Protecting our extended workforce and the community,” YouTube: Creator Blog, Google, March 16, 2020, source
- Sarah Perez, “YouTube warns of increased video removals during COVID-19 crisis,” TechCrunch, March 16, 2020, source
- “Actions to reduce the need for people to come into our offices,” The Keyword, March 16, 2020, source