WhatsApp

WhatsApp is the world’s largest messaging app, with over 2 billion users across the globe.1

The company is owned by Facebook and offers end-to-end encrypted messaging. Because messages are encrypted end to end, which is critical for privacy and security, WhatsApp cannot view or review the content that users share. As a result, the company has adopted other approaches to address the spread of misleading information: identifying indicators of problematic content at scale, introducing mechanisms that limit the spread of content, and adding features that enable users to fact-check the content they receive.

Generally, WhatsApp aims to reduce the virality of misleading information on the service. In April 2020, the company instituted a new policy restricting the sharing of "highly forwarded" messages: once a message has been forwarded five or more times, the receiving user can forward it to only one chat at a time.2 According to Facebook, this approach has proven effective at slowing the spread of misinformation in many countries, especially during elections. As a result, in September 2020, Facebook introduced similar forwarding limits on Facebook Messenger, requiring users to forward messages one at a time.3
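The forwarding policy described above can be sketched in a few lines. This is a hypothetical illustration based only on the public description; the threshold constant and function name are assumptions, not WhatsApp's actual implementation.

```python
# Hypothetical sketch of WhatsApp's forwarding limit, per the reported policy:
# a message already forwarded five or more times is "highly forwarded" and
# may only be sent on to one chat at a time; others may go to up to five.
HIGHLY_FORWARDED_THRESHOLD = 5  # assumption drawn from the reported policy

def allowed_forward_targets(forward_count: int) -> int:
    """Return how many chats a message may be forwarded to at once."""
    if forward_count >= HIGHLY_FORWARDED_THRESHOLD:
        return 1  # highly forwarded: one chat at a time
    return 5      # ordinary messages: up to five chats at once
```

The design choice here is friction rather than blocking: a highly forwarded message can still spread, but only one chat at a time, which is what reportedly cut the virality of such messages by roughly 70 percent.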

The company also began labeling messages that have been forwarded five or more times with a double-arrow icon to indicate that they were not written by the sender.4 Further, the platform offers a setting that lets users prevent unknown numbers and contacts from communicating with them or adding them to groups.5 WhatsApp also recently piloted a feature that allows users to look up the contents of a forwarded message in their browser to see whether online sources support the information in it. Users can access this feature without revealing the message to WhatsApp, enabling fact-checking while maintaining their privacy and security.6 Some researchers have also suggested that the company enable device-side hashing, comparing images against a pre-distributed on-device list of hashes of known disinformation images.7 However, such an approach would not be consistent with offering fully end-to-end encrypted messaging, and could undermine the privacy and security benefits that strong encryption provides. It would also raise freedom of expression concerns, since it involves screening user content before it is uploaded and shared.8
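The device-side hashing proposal can be illustrated with a minimal sketch. This is not WhatsApp's code: the hash list is an invented placeholder, and real proposals would use perceptual hashes (which tolerate re-encoding and cropping) rather than the exact-match SHA-256 used here for simplicity.

```python
import hashlib

# Hypothetical on-device check: hash an outgoing image and compare it against
# a pre-distributed list of hashes of known disinformation images. The entry
# below is illustrative only; SHA-256 stands in for a perceptual hash.
KNOWN_DISINFO_HASHES = {
    hashlib.sha256(b"example-disinfo-image-bytes").hexdigest(),
}

def is_known_disinfo(image_bytes: bytes) -> bool:
    """Return True if the image's hash appears in the on-device list."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_DISINFO_HASHES
```

Even in this stripped-down form, the tension the researchers' critics raise is visible: the check runs against user content before it is sent, which is precisely the pre-upload screening that sits uneasily with end-to-end encryption.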

In the context of the upcoming U.S. presidential election, WhatsApp is partnering with the International Fact-Checking Network and its member fact-checking organizations to establish WhatsApp tip lines, which these organizations can use to engage with users around misleading content and to debunk or verify it.

During elections around the world, WhatsApp accounts are often used to distribute messages at scale.9 This can result in the rapid spread of election and voter suppression misinformation and disinformation. As a result, the company works to identify and remove accounts that engage in automated or spam-like behaviors that do not reflect those of human users.10 To do this, WhatsApp developed machine-learning systems that detect suspicious accounts at multiple stages of the product use cycle: at registration, during messaging, and in response to user feedback such as reports and blocks. These systems calculate a spam score for the accounts in question based on a range of indicators,11 and subsequently ban accounts found to be engaging in automated or spam-like behavior.12 According to the company, it removes over two million accounts through this process every month, and over 75 percent of these removals take place without any user report flagging the account.13 Beyond these figures, however, there is little transparency around the scope and scale of these moderation efforts. The company offers users the right to appeal these decisions and has a team of reviewers who manage appeal requests.14 This is important: appeals are a vital mechanism for accountability and redress, particularly when enforcement actions largely take place in an opaque setting. Going forward, the platform should publish data explaining these enforcement actions and, where possible, break that data down by potential relevance to elections.
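The spam-scoring approach can be sketched as a weighted combination of behavioral signals. WhatsApp has not published its indicators, weights, or thresholds, so every signal name and number below is an assumption for illustration only.

```python
# Hypothetical spam-scoring sketch. WhatsApp has not disclosed its actual
# indicators or weights; the signals, weights, and threshold are invented.
WEIGHTS = {
    "messages_per_minute": 0.5,       # assumed signal: bulk-sending rate
    "fraction_new_recipients": 0.3,   # assumed signal: messaging strangers
    "user_reports": 0.2,              # assumed signal: reports and blocks
}
BAN_THRESHOLD = 0.8  # assumed cutoff

def spam_score(signals: dict) -> float:
    """Combine weighted behavioral signals (each normalized to 0..1)."""
    return sum(WEIGHTS[k] * signals.get(k, 0.0) for k in WEIGHTS)

def should_ban(signals: dict) -> bool:
    """Ban an account whose combined score crosses the threshold."""
    return spam_score(signals) >= BAN_THRESHOLD
```

Note how this structure matches the reported design: signals can arrive at different stages (registration behavior, messaging behavior, user feedback), and most bans can fire from behavioral signals alone, before any user report comes in.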

Citations
  1. J. Clement, "Most Popular Global Mobile Messenger Apps As Of July 2020, Based on Number of Monthly Active Users," Statista, last modified July 24, 2020, source.
  2. Manish Singh, "WhatsApp's New Limit Cuts Virality of 'Highly Forwarded' Messages by 70%," TechCrunch, April 27, 2020, source.
  3. Facebook, "New Steps," Facebook Newsroom.
  4. Pranav Dixit, "WhatsApp Is Now Letting Users Know When A Message Has Been Forwarded Too Many Times," BuzzFeed News, August 2, 2019, source.
  5. Manish Singh, "WhatsApp Now Lets You Control Who Can Add You To A Group," VentureBeat, April 3, 2019, source.
  6. WhatsApp, "Search the Web," WhatsApp Blog, entry posted August 3, 2020, source.
  7. Julio C.S. Reis et al., "Can WhatsApp Benefit From Debunked Fact-Checked Stories To Reduce Misinformation?," Harvard Kennedy School Misinformation Review 1, no. 5 (August 20, 2020): source.
  8. In addition, civil society experts have raised numerous concerns around the use of hash databases, as internet platforms often provide little transparency around how they vet content before including it as hashes in these databases, and what impact these databases have on user expression. Svea Windwehr and Jillian C. York, "One Database to Rule Them All: The Invisible Content Cartel that Undermines the Freedom of Expression Online," Electronic Frontier Foundation, last modified August 27, 2020, source.
  9. WhatsApp, Stopping Abuse: How WhatsApp Fights Bulk Messaging and Automated Behavior, February 6, 2019, source.
  10. Private meeting with Facebook representative, August 5, 2020.
  11. Private meeting with Facebook representative, August 5, 2020.
  12. WhatsApp, Stopping Abuse.
  13. WhatsApp, Stopping Abuse.
  14. WhatsApp, Stopping Abuse.
