Snapchat
Snapchat, a multimedia messaging app owned by Snap Inc., saw roughly 210 million daily active users in Q3 2019; active Snapchat users opened the app up to 30 times per day.1
Snapchat’s main service allows users to upload photos and video messages that disappear after they have been viewed. The platform has affirmatively sought to engage in the electoral process through both its advertising services and user offerings. In 2018, for example, the app displayed a link to register to vote on the profile page of every user who was 18 years old or older, and allowed users to register directly within the app through a service called TurboVote.2 This helped register over 400,000 voters, 57 percent of whom were later confirmed to have cast a ballot, demonstrating the influence Snapchat can have when it comes to elections.3 On August 6, 2020, Snapchat also announced it was planning to release new features in the coming months to encourage users to register to vote in the 2020 presidential election.4 These features include a voter checklist card as well as voter-related “Minis,” which are miniature applications made by third parties that run inside Snapchat.5 The platform is planning to release a “Before You Vote” Mini that lets users know where and how to vote and whether voting by mail is available in their state.6 These digital avenues for voter registration are particularly important in the run-up to the 2020 presidential election, as the COVID-19 pandemic has made it harder for people to register to vote in person at libraries or local DMV offices. As a result, more people may rely on online platforms to register and participate in the election process.7
Snap Inc.’s CEO Evan Spiegel has stated that the platform fact-checks all ads from political candidates and about the voting process and does not allow misinformation in these types of ads.8 Spiegel explained that Snapchat wants to allow political advertising to encourage users, particularly young people and first-time voters, to engage in the political process.9 Under its advertising policies, Snapchat prohibits ads that “are false or misleading, including deceptive claims, offers, functionality, or business practices” within its content guidelines.10
In addition, under Snapchat’s political and advocacy advertising policies,11 political ads must adhere to all applicable laws and regulations, including national election, copyright, and defamation laws, as well as (where applicable) Federal Election Commission (FEC) regulations.12 The policies state that it is the responsibility of the advertiser to comply with all laws and regulations.13 It is therefore unclear whether Snapchat itself takes steps to ensure ads are compliant with state and federal laws. Furthermore, political ads must include a “paid for by” disclaimer that contains the name of the paying person or entity.14 This applies to any ad with political messaging or any ad that links to political content. Election-related ads must also state whether or not an ad was authorized by a candidate or an organization.15 If the ad is not paid for by the candidate, it must include contact information for the sponsoring organization. While Snapchat’s general ad policies ban harassment and threats, the platform does not explicitly ban attack ads (ads that express disagreement with candidates or political parties), as long as they do not violate any other guidelines.16 Snapchat does not currently provide information on what tools and processes it uses to review and approve ads. Going forward, the company should confirm to what extent it uses automated tools and human reviewers to review and approve ads and how these tools and individuals are trained.
Currently, Snapchat maintains a Political and Advocacy Ads Library to provide transparency around political advertising on Snapchat.17 The library includes information, such as the amount of ad spend and the identity of paying entities, on all political and advocacy advertising that has run on the platform from 2018 to 2020, and it is updated daily. Although the library is a helpful first step toward providing transparency around political advertising on the service, it is not readily accessible, as ad information must be downloaded from the website and viewed in Excel. Going forward, the platform should improve the library’s user-friendliness and accessibility by creating a web version. The company should also expand it to include information such as how many ads were removed for violating Snap’s policies, particularly around voting and election misinformation.
When it comes to user-generated content, Snap prohibits hateful content, violent content, impersonation, spam, and illegal speech and activity.18 The platform does not explicitly prohibit political or election-related misinformation—or any categories of misinformation—from standard users (i.e., non-media partners), and has no restrictions on false or misleading content that may support voter suppression. The platform should expand its guidelines and clarify what type of political content is permitted on the service, and how, or whether, it intends to address election and voter suppression misinformation. In addition, Snapchat’s Community Guidelines state that, in certain cases, the platform “won’t take action against content when it is newsworthy and relates to a matter of political, social, or other general concern to our community.”19 It is unclear in what circumstances this policy would be applied, and whether it could potentially apply to political content broadly, as this content could be of general public interest.
Snapchat’s Community Guidelines also have additional requirements for media partners, whose content is displayed in the Discover feed. Snapchat’s Discover section is a space for publishers, like the New York Times, Buzzfeed, and Mashable, to post stories. These media partner-specific guidelines are not applicable to standard users, and they are similar to Snap’s advertising policies in that they require that content is fact-checked and accurate.20 Because Snapchat partners with these accounts and actively promotes their content, it is an important step that there are additional requirements on these accounts. However, Snap should provide greater transparency around how they review these accounts and ensure that their content is accurate and fact-checked.
If Snapchat does, or plans to, prohibit voter- and election-related misinformation in user-generated content, the platform should ensure that it provides adequate notice to parties who have had their content removed for violating the platform’s policies. The company should also allow these parties to access a robust and timely appeals process. The Community Guidelines state that Snapchat reviews reports of policy-violating content, and if an account violates the company’s policies, Snapchat “may remove the offending content, terminate the account, and/or notify law enforcement.”21 The Community Guidelines also state that if an account is terminated for violating Snapchat’s content policies, the account holder may not use Snapchat again. However, it is unclear whether the company currently offers users an appeals process, and the Community Guidelines do not outline what steps a user can take if they feel their content was removed in error.
Unlike other platforms such as Facebook and Twitter, Snap has not taken an active stance against deepfake videos.22 While manipulated media such as deepfakes can exacerbate voter suppression by misleading users about election or candidate information, there is likely less risk that user-generated images manipulated with Snapchat’s filters (and that disappear after viewing) will be a source of election misinformation. However, there is still a risk that political advertisers on the platform could use its technology to create misleading deepfake videos. The company should therefore develop explicit guidelines for this type of technology in political advertising. Snapchat’s “paid for by” policy for political ads is one way users can understand who is creating certain videos and decide whether they trust the content. However, the platform should go further in providing transparency around the allowed or restricted uses of deepfake technology in political ads.23
Citations
- Josh Constine, “Snapchat beats in Q3, adding 7M users & revenue up 50%,” TechCrunch, October 22, 2019, source
- “Register to vote on Snapchat!,” Snap Inc., September 25, 2018, source
- Sara Fischer, “Snapchat preps young users to vote in November,” Axios, May 14, 2020, source
- Kim Lyons, “Snapchat is planning in-app voter awareness tools to help get its users ready for November,” The Verge, August 6, 2020, source
- Casey Newton, “Snap announces Minis to bring other apps into Snapchat,” The Verge, June 11, 2020, source
- Lyons, “Snapchat is planning in-app,” The Verge
- Brad Bennett, “Get-out-the-vote effort goes digital for COVID-19 pandemic,” Southern Poverty Law Center, April 11, 2020, source
- Andrew Hutchinson, “Snapchat Says Political Ads on its Platform Are Fact-Checked, Adding to Debate,” Social Media Today, November 20, 2019, source
- Hutchinson, “Snapchat Says Political Ads,” Social Media Today
- “Snap Advertising Policies,” Snap Inc., source
- “Snap Political & Advocacy Advertising Policies,” Snap Inc., Effective May 15, 2019, source
- “Snap Political & Advocacy,” Snap Inc.
- “Snap Political & Advocacy,” Snap Inc.
- “Snap Political & Advocacy,” Snap Inc.
- “Snap Political & Advocacy,” Snap Inc.
- “Snap Political & Advocacy,” Snap Inc.
- “Snap Political Ads Library,” Snap Inc., source
- “Community Guidelines,” Snap Inc., September 2020, source
- “Community Guidelines,” Snap Inc.
- “Community Guidelines,” Snap Inc.
- “Community Guidelines,” Snap Inc.
- Michael Nuñez, “Snapchat and TikTok Embrace ‘Deepfake’ Video Technology Even As Facebook Shuns It,” Forbes, January 8, 2020, source
- Catherine Thorbecke, “Snapchat users can see who's behind political ads with new feature,” ABC News, September 16, 2019, source