Facebook

Facebook is the largest social media platform in the world, with over 2.4 billion active users,1 the majority of whom are based outside the United States.2 The company therefore has significant global reach, as does the misinformation and disinformation spread on the platform.

Since the onset of the COVID-19 pandemic, false and misleading information related to COVID-19 has spread like wildfire on the platform, through user posts, in private groups, and through advertisements.3 Some of these posts have made innocuous claims, while others have shared more harmful ideas, such as promoting certain medicines or behaviors as preventive or curative measures, or suggesting that social distancing does not help stem the spread of the virus.4

In response, Facebook has launched a COVID-19 information center that houses all updates and information related to the platform's efforts around the virus.5 This online hub includes a section called "Get the Facts," which features articles written by Facebook's independent fact-checking partners that often link to fact-checked posts or articles.6 These articles are selected by Facebook's news curation team, are updated weekly, and are available to Facebook's U.S. users.7

According to Facebook, the company currently works with more than 60 fact-checking organizations that review and rate content in over 50 languages around the globe. The company has stated it is expanding its fact-checking partnerships to include more organizations and languages. It has also announced the first set of recipients of its $1 million grant program, run in collaboration with the International Fact-Checking Network, a forum for fact-checkers worldwide hosted by the Poynter Institute for Media Studies.8 Further, the company has donated $1 million to the International Fact-Checking Network.9

These fact-checking organizations play an important role in managing misinformation and disinformation on the platform. Pre-pandemic, when a piece of content was debunked by one of Facebook's fact-checking partners, Facebook would append a warning label to the content and reduce its distribution by demoting, or downranking, the post's position in the platform's algorithmic content ranking system. This was especially true for posts that shared misleading health information, such as sensational health claims, or that tried to sell products or services based on such exaggerated claims.10 Facebook would also identify terms commonly used in misleading posts and use them to predict and detect similar posts.11 Facebook's Community Standards state that the platform does not remove false information outright, because there is a fine line between false news and satire or opinion.12 As a result, fact-checking and downranking have formed the foundation of Facebook's efforts to counter such misinformation and disinformation.
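Facebook has not published the mechanics of this term-based detection system. As a rough illustration of the general idea only, a toy sketch (all post text, term lists, and scoring here are hypothetical) might flag posts whose vocabulary overlaps heavily with posts already debunked by fact-checkers:

```python
# Illustrative sketch only: Facebook's actual detection system is not public.
# This toy example flags posts that reuse terms common in already-debunked
# posts, mirroring the term-based prediction described above.
# All example posts and the stopword list are hypothetical.

DEBUNKED_POSTS = [
    "miracle cure kills the virus overnight",
    "this secret remedy doctors don't want you to know",
]

def build_term_set(posts):
    """Collect the distinctive terms used across known-debunked posts."""
    terms = set()
    for post in posts:
        terms.update(post.lower().split())
    # Drop very common words that carry no signal (hypothetical stopword list).
    return terms - {"the", "this", "to", "you", "don't", "want"}

def flag_score(post, term_set):
    """Fraction of a post's words that also appear in debunked posts."""
    words = post.lower().split()
    if not words:
        return 0.0
    return sum(w in term_set for w in words) / len(words)

terms = build_term_set(DEBUNKED_POSTS)
suspicious = flag_score("new miracle remedy kills virus", terms)
benign = flag_score("stay home and wash your hands", terms)
```

In a production setting, a score like this would at most be one signal feeding a ranking or review queue, not a removal decision on its own.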

Facebook’s efforts to downrank and reduce the distribution of COVID-19 misinformation and disinformation build on this prior work. Through the use of automated tools, the platform is also able to identify duplicates of debunked stories and reduce their distribution.13 According to Facebook, these efforts are a continuation of work the company has been doing since 2018.14 The company has also begun directing users to authoritative sources of COVID-19 information15 and has stated that, in the context of the pandemic, it will also remove content containing misinformation that could lead to “imminent physical harm.”16
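Facebook has not disclosed how its automated tools match duplicates of debunked stories. One common technique for near-duplicate text detection, shown here purely as an illustrative sketch with hypothetical example strings, compares overlapping word "shingles" using Jaccard similarity:

```python
# Illustrative sketch only: the matching techniques Facebook actually uses
# are not public. This shows one standard approach to catching near-duplicates
# of a debunked story: comparing overlapping 3-word windows ("shingles") with
# Jaccard similarity. All example strings are hypothetical.

def shingles(text, n=3):
    """Return the set of overlapping n-word windows in a text."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard(a, b):
    """Similarity of two shingle sets: |intersection| / |union|."""
    if not a and not b:
        return 0.0
    return len(a & b) / len(a | b)

DEBUNKED = "drinking hot water every hour prevents infection with the virus"

def is_near_duplicate(post, debunked=DEBUNKED, threshold=0.5):
    return jaccard(shingles(post), shingles(debunked)) >= threshold

# A lightly edited copy of the debunked claim should still match...
copy = "drinking hot water every hour prevents infection with this virus"
# ...while unrelated text should not.
other = "local libraries have extended their weekend opening hours"
```

Shingling is robust to small edits, which is why lightly reworded copies of a debunked story can still be caught; at platform scale this comparison is typically approximated with hashing rather than computed pairwise.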

According to a Facebook post from CEO Mark Zuckerberg, over 2 billion users on Facebook and Instagram have been directed to “authoritative health resources,” and 350 million of those users clicked through to the resources.17 In addition, a blog post by Facebook’s Vice President of Integrity, Guy Rosen, shared that the company had appended warning labels to approximately 40 million COVID-19-related posts on the Facebook platform, based on the review of nearly 4,000 articles by the company’s independent fact-checking partners. According to the company, 95 percent of the time users did not click through to content that carried a warning label.18 Additionally, when people search for information related to COVID-19 on Facebook, the platform surfaces an educational pop-up with credible information from expert and governmental organizations such as the WHO and the U.S. Centers for Disease Control and Prevention (CDC).19 The company is also giving free advertising credits to enable such organizations to run coronavirus education campaigns on Facebook and Instagram, and it has said it is discussing ways to provide additional assistance and support to health authorities.20

Further, on April 16, Facebook announced that it would begin alerting users if they had engaged with or viewed harmful misleading content related to the virus that had been debunked by the company’s fact-checking partners.21 Users who have liked, reacted to, or commented on these posts will receive alerts in their news feeds that direct them to the WHO’s “myth busters” page.22 This is a valuable method for providing transparency and accountability to users regarding their engagement with misleading content on the platform.

As highlighted above, Facebook’s Community Standards do not include clear policies on the removal of false information. However, in the context of the pandemic, the platform has begun prioritizing the removal of COVID-19-related misinformation and disinformation that could cause imminent harm.23 Prioritizing the removal of this content is especially important given that the company’s content moderation operations have been radically transformed during the pandemic. Facebook announced that, due to safety, privacy, and legal concerns, a large portion of its content moderation workforce, which is made up of contractors, is unable to work from home.24 As a result, Facebook is relying increasingly on automated tools for content moderation. However, researchers and activists have extensively documented how these tools are limited and can result in erroneous takedowns of content.25 In the absence of a robust content moderation workforce, Facebook has begun training a small group of its other employees, who have experience working in content policy, to moderate high-priority categories of content such as COVID-19-related misinformation.26 The platform has warned, however, that users should expect numerous mistakes, given the decreased capacity for human review.27

During this time, the platform has also suspended its appeals process, instead enabling users to notify the company if they disagree with a moderation decision.28 This is concerning, as it leaves users with no avenue for remedy or redress when content is removed in error.

Given that Facebook’s content moderation operations have changed drastically during this period, but are ever more important, the company should provide periodic updates on the scope and scale of its efforts to moderate and reduce the spread of misleading content during the pandemic. Following the pandemic, Facebook should publish a COVID-19-specific transparency report that outlines the scope and scale of these efforts throughout the entire pandemic. Further, Facebook should expand its general transparency reporting efforts to include data on the scope and scale of its efforts to remove and reduce the spread of misinformation more broadly.

Advertising can also promote the spread of misinformation on the platform. For example, some sellers have advertised products that they claim can prevent or treat the virus. In response, Facebook has prohibited sellers from making COVID-19-related health or medical claims in product listings and has banned ads that seek to foster panic related to the virus.29 In addition, the company has temporarily banned ads and commerce listings, such as those on Marketplace, that sell medical face masks,30 hand sanitizer, surface disinfecting wipes, and COVID-19 testing kits.31 The platform has also said it will remove organic posts that aim to sell these items.32 These measures are intended both to preserve this equipment for medical personnel and to prevent the sale of fraudulent or misleading items in these categories. Further, the company has established a dedicated channel through which local governments can share listings they believe violate local laws.33 Going forward, Facebook should provide periodic updates on the number of listings it removes and the number of sellers it bans on Marketplace for violating its COVID-19-specific commerce policies, as well as its pre-existing commerce policies. Following the pandemic, Facebook should publish comprehensive data on its commerce policy enforcement efforts during this time. This reporting should also be expanded so that it is consistent and covers non-emergency periods as well.
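Facebook has not described how these commerce bans are enforced technically. As a purely illustrative sketch of the kind of rule-based screen such policies imply (the keyword lists and function names here are hypothetical, with banned-item categories drawn from the policies described above), a first-pass listing filter might look like:

```python
# Illustrative sketch only: a minimal rule-based screen for commerce listings,
# modeled on the temporary bans described above. The banned-item keywords
# reflect the policy categories in the text; everything else is hypothetical.

BANNED_ITEM_TERMS = [
    "face mask", "hand sanitizer", "disinfecting wipes", "covid-19 testing kit",
]
BANNED_CLAIM_TERMS = ["cures covid", "prevents covid", "kills the coronavirus"]

def screen_listing(title, description):
    """Return the list of policy reasons a listing would be held for review."""
    text = f"{title} {description}".lower()
    reasons = []
    if any(term in text for term in BANNED_ITEM_TERMS):
        reasons.append("temporarily banned item category")
    if any(term in text for term in BANNED_CLAIM_TERMS):
        reasons.append("prohibited health or medical claim")
    return reasons
```

A real enforcement pipeline would pair keyword screens like this with classifiers and human review, since simple keyword matching both over-blocks (e.g., legitimate donation drives) and under-blocks (e.g., deliberate misspellings).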

Finally, after an investigation by The Markup, Facebook removed the ad targeting category that had enabled advertisers to target users interested in pseudoscience. According to the investigation, this interest category contained over 78 million users and could have enabled advertisers to run and profit from ads catering to users who are vulnerable to conspiracy theories and misleading information.34

As discussed, Facebook’s ad targeting and delivery tools can be used to promote the spread of misinformation. The platform has taken some important steps toward preventing these misuses of its tools; however, there is little transparency around how effective these efforts have been. During the pandemic, Facebook should publish periodic updates on its efforts to enforce its advertising targeting and delivery policies. Following the pandemic, Facebook should publish comprehensive data outlining the scope and scale of its ad policy enforcement during this period, including the number of ads the company removed for violating its COVID-19-specific advertising policies and the number of ads approved in error. In addition, this is an area in which the U.S. government can use existing law to take action, as appropriate, against businesses and sellers who engage in unfair and deceptive trade practices during the pandemic. In particular, the FTC should enforce Section 5(a) of the FTC Act to hold accountable businesses and sellers who engage in unfair and deceptive trade practices through their online ad campaigns.

Citations
  1. J. Clement, "Most Popular Social Networks Worldwide as of April 2020, Ranked by Number of Active Users," Statista, last modified April 24, 2020, source
  2. J. Clement, "Leading Countries Based on Facebook Audience Size as of April 2020," Statista, last modified April 24, 2020, source
  3. Mark Scott, "Facebook's Private Groups Are Abuzz With Coronavirus Fake News," Politico, March 30, 2020, source
  4. Avaaz, How Facebook can Flatten the Curve of the Coronavirus Infodemic, April 15, 2020, source
  5. Kang-Xing Jin, "Keeping People Safe and Informed About the Coronavirus," Facebook Newsroom, source
  6. Mark Zuckerberg, "I want to share an update on the work we're doing to connect people with accurate information and limit the spread of misinformation about Covid-19.," Facebook, April 16, 2020, 9:01 am, source
  7. Guy Rosen, "An Update on Our Work to Keep People Informed and Limit Misinformation About COVID-19," Facebook Newsroom, last modified April 16, 2020, source
  8. Rosen, "An Update," Facebook Newsroom.
  9. "Investing $100 Million in the News Industry," Facebook Newsroom, last modified March 30, 2020, source
  10. Tessa Lyons, "Increasing Our Efforts to Fight False News," Facebook Newsroom, last modified June 21, 2018, source
  11. Travis Yeh, "Addressing Sensational Health Claims," Facebook Newsroom, last modified July 2, 2019, source
  12. "Community Standards: False News," Facebook, source
  13. Rosen, "An Update," Facebook Newsroom.
  14. Lyons, "Increasing Our Efforts," Facebook Newsroom.
  15. Zuckerberg, "I want," Facebook.
  16. Zuckerberg, "I want," Facebook.
  17. Zuckerberg, "I want," Facebook.
  18. Rosen, "An Update," Facebook Newsroom.
  19. "Connecting People to Accurate Information and Helpful Resources," Facebook Newsroom, last modified February 26, 2020, source
  20. "Providing Helpful Information and Support," Facebook Newsroom, last modified January 30, 2020, source
  21. Rosen, "An Update," Facebook Newsroom.
  22. World Health Organization, "Coronavirus Disease (COVID-19) Advice For The Public: Myth Busters," World Health Organization, source
  23. "Keeping Our People and Our Platforms Safe," Facebook Newsroom, last modified March 16, 2020, source
  24. "Keeping Our People," Facebook Newsroom.
  25. Dia Kayyali and Raja Althaibani, "Vital Human Rights Evidence in Syria is Disappearing from YouTube," WITNESS Blog, entry posted August 2017, source; Spandana Singh, Everything in Moderation: An Analysis of How Internet Platforms Are Using Artificial Intelligence to Moderate User-Generated Content, July 22, 2019, source; Llansó, "COVID-19 Content Moderation Research Letter – in English, Spanish, & Arabic," Center for Democracy & Technology, last modified April 22, 2020, source
  26. "Press Call Transcript," Facebook, last modified March 18, 2020, source
  27. "Press Call," Facebook.
  28. "Keeping Our Platform Safe With Remote and Reduced Content Review," Facebook Newsroom, last modified March 19, 2020, source
  29. "Exploitative Tactics in Ads," Facebook Newsroom, last modified February 26, 2020, source
  30. "Banning Ads and Commerce Listings for Medical Face Masks," Facebook Newsroom, last modified March 6, 2020, source
  31. "Banning Ads for Hand Sanitizer, Disinfecting Wipes and COVID-19 Testing Kits," Facebook Newsroom, last modified March 19, 2020, source
  32. "Banning Ads for Hand," Facebook Newsroom.
  33. "Banning Ads and Commerce," Facebook Newsroom.
  34. Aaron Sankin, "Want to Find a Misinformed Public? Facebook's Already Done It," The Markup, April 23, 2020, source
