TikTok
TikTok is a relatively new internet platform compared to platforms such as YouTube and Facebook.1 However, over the past several years the company has seen rapid growth around the world, with approximately 800 million active users globally today.2 The company’s popularity makes its platform a focal point for misleading information. Further, experts have expressed concern that the platform could become a particular home for election-related misinformation and disinformation, including voter suppression-related content, given that the company is nascent and its content moderation practices are less robust.3
In January 2020, in response to concerns that the platform’s rapid growth had not been matched by efforts to create responsive policies that safeguard users,4 TikTok released a more comprehensive version of its Community Guidelines.5 These updated policies include a section on misleading information which states, “we do not permit misinformation that could cause harm to our community or the larger public.” The policy prohibits content meant “to incite fear, hate, or prejudice”; “hoaxes, phishing attempts, or manipulated content meant to cause harm”; content that misleads users “about elections or other civic processes”; and “content distributed by disinformation campaigns.”6 Under the company’s integrity and authenticity content policies, it also bans spam-related activity, including “coordinated attempts to manufacture inauthentic activity” and operating accounts under false pretenses.7 In August 2020, TikTok broadened and clarified its Community Guidelines to address the spread of misinformation, disinformation, and related content that seeks to undermine the 2020 elections. As part of these efforts, the company updated its general policy on misleading content to include a clear prohibition on manipulated media such as deepfakes.8 In addition, the company clarified that it does not permit coordinated inauthentic behavior.9 Although these policies can apply to election-related content, the Community Guidelines do not include policies specific to voter suppression-related content. In an August 2020 announcement, TikTok stated that it was expanding its partnerships with fact-checking organizations, such as PolitiFact and Lead Stories, to help review content and identify and debunk misleading election-related information. Further, the company shared that it would soon introduce an in-app feature permitting users to report content or accounts for election misinformation.
The company will also establish an election information center to connect users to authoritative information surrounding the 2020 elections.10
TikTok’s mission is “to inspire creativity and bring joy.” According to the company, politics is not a topic that brings joy to its users. As a result, the company has actively discouraged the use of its services for political means. To this end, in 2019 the company banned political advertising.11 The ban prohibits any paid advertising that references, promotes, or opposes a political candidate, current or former elected official, or political party or group. It also covers any content that advocates a particular position on a local, state, or federal issue of public relevance that could influence political outcomes.12 However, it is unclear how effectively the company enforces this ban, as it does not share any data related to enforcement of its political ads policy.
Some research indicates that the company has discouraged political content by suppressing its promotion and recommendation during election periods.13 The company states that it does not remove political content, although it has not explicitly addressed the topic of algorithmic amplification and suppression.14 Although TikTok says it does not view politics and political content as creating joy for its users, and has therefore instituted a political ads ban, some experts suggest the real reason for the ban is that the company is small and nascent and lacks the capacity to moderate and engage with such content in a scalable manner.15 However, the majority of TikTok users in the United States are between the ages of 18 and 24,16 and as a result, avoiding politics has been challenging given that this demographic often uses social media to engage in social and political discussions.17 In addition, 70 percent of TikTok users are of voting age, and TikTok videos with political content have been found to generate a significant amount of traffic on the platform.18 Further, despite the prohibition on paid political advertising, politicians and political groups have partnered with TikTok influencers to promote their ideas and gain popularity.19 For example, the progressive nonprofit ACRONYM has worked with influencers to encourage voter registration. Similarly, a Trump campaign manager stated that he is exploring deals with TikTok influencers.20
TikTok has also introduced a range of programs that could help address the spread of voter suppression misinformation and disinformation on its service. For example, in March 2020, the company established the TikTok Content Advisory Council, a group of external technology and safety experts tasked with providing the company guidance on its content moderation policies, including its misinformation- and hate speech-related policies. Further, in July 2020, the company introduced a media literacy and safety video series titled “Be Informed,” which features popular creators on the platform encouraging users to be cognizant of the spread of false information on the service. The videos include guidance on how users can evaluate content and sources on the platform, use in-app features to protect against the spread of misleading content, and distinguish facts from opinions.21 According to senior officials from the U.S. Department of Homeland Security (DHS) Cybersecurity and Infrastructure Security Agency (CISA), TikTok has an open line of communication and collaboration with DHS to combat election-related disinformation.22 However, TikTok is owned by a Chinese company, ByteDance, and as a result the expectations for and outcomes of this arrangement may differ from those for U.S.-based platforms. This is compounded by the fact that TikTok is under heavy scrutiny from U.S. lawmakers for its ties to China,23 which has prompted many U.S. agencies and branches of the armed forces to ban the use of the app on employee phones, citing national security concerns.24 Most recently, President Trump called for a ban on the app.25 It is difficult to assess how these factors might impact the effectiveness of a collaboration between TikTok and CISA.
TikTok needs to implement clearer and more direct policies to address voter suppression misinformation and disinformation. In addition, the company should provide greater transparency and accountability around how it enforces these policies. In its July 2020 transparency report, which for the first time featured data on how the company enforces its own Community Guidelines, the platform states that less than 1 percent of actioned content violated the company’s policies on hate speech, integrity and authenticity, and dangerous individuals and organizations. The majority of actioned content violated its policies on adult nudity and sexual activities, minor safety, and illegal activities and regulated goods. The company does not, however, include data related to misinformation, although the transparency report does state that the company often proactively removes harmful misinformation.26 This lack of transparency makes it difficult to understand the scope of voter suppression or election-related misinformation and disinformation on the platform, and how the company addresses these forms of content. Transparency around the effectiveness of the company’s enforcement actions is also important: despite the company’s ban on disinformation campaigns, researchers have found examples of disinformation related to topics such as the COVID-19 pandemic circulating on the service.27 Similar transparency is needed around the enforcement of the company’s political ads policies. The company should also outline what kinds of enforcement actions it takes against misleading content, and provide disaggregated data showing how often it removes such content compared to how often it employs another enforcement action, such as algorithmically downranking content or appending a label to it.
Citations
- Over the past several months, TikTok has drawn scrutiny from U.S. lawmakers for its ties to China. On August 6, President Trump issued an Executive Order that effectively bans TikTok in the United States by blocking any transactions with its parent company ByteDance. President Trump also signed an Executive Order on August 14 requiring ByteDance to sell or spin off its TikTok U.S. business within 90 days. As this report is being published, news accounts indicate that Oracle has agreed to become a “trusted technology provider” for TikTok. The deal is still waiting for approval from the U.S. government. This issue is not a focus of this report, although it is later mentioned at a high-level. source source source
- Maryam Mohsin, "10 TikTok Statistics That You Need to Know in 2020 [Infographic]," Oberlo, last modified July 3, 2020, source.
- Andrew Marino, "How TikTok Could Be A Player In Election Disinformation," The Verge, May 12, 2020, source.
- Tony Romm and Drew Harwell, "TikTok Revamps Content Rules, Aiming to Clear Up Which Videos It Allows or Blocks," The Washington Post, January 8, 2020, source.
- Lavanya Mahendran and Nasser Alsherif, "Adding Clarity To Our Community Guidelines," TikTok Newsroom, last modified January 8, 2020, source.
- TikTok, "Community Guidelines," TikTok, last modified August 2020, source.
- TikTok, "Community Guidelines," TikTok.
- Vanessa Pappas, "Combating Misinformation and Election Interference on TikTok," TikTok Newsroom, last modified August 5, 2020, source.
- Pappas, "Combating Misinformation," TikTok Newsroom.
- Pappas, "Combating Misinformation," TikTok Newsroom.
- Sarah Perez, "TikTok Explains Its Ban on Political Advertising," TechCrunch, October 3, 2019, source.
- Mahendran and Alsherif, "Adding Clarity," TikTok Newsroom.
- Angela Chen, "A Leaked Excerpt of TikTok Moderation Rules Shows How Political Content Gets Buried," MIT Technology Review, November 25, 2019, source.
- Chen, "A Leaked".
- Mariel Soto Reyes, "TikTok Could Be the Next Battleground for Political Content," Business Insider, January 7, 2020, source.
- Georgia Wells and Emily Glazer, "TikTok Wants to Stay Politics-Free. That Could Be Tough in 2020.," The Wall Street Journal, January 5, 2020, source.
- Wells and Glazer, "TikTok Wants".
- Common Sense Media, 2020 Social Media Voter Scorecard, 2020, source.
- Reyes, "TikTok Could".
- Reyes, "TikTok Could".
- Stephanie Hind and Tara Wadhwa, "TikTok's 'Be Informed' Series Stars TikTok Creators to Educate Users About Media Literacy," TikTok Newsroom, last modified July 16, 2020, source.
- Alfred Ng, "US Officials in Contact With TikTok Over Political Disinformation," CNET, March 3, 2020, source.
- Ng, "US Officials".
- Ben Kesling and Georgia Wells, "U.S. Military Bans TikTok Over Ties to China," The Wall Street Journal, January 3, 2020, source.
- Riya Bhattacharjee, Amanda Macias, and Jordan Novet, "Trump Says He Will Ban TikTok Through An Executive Action," CNBC, August 1, 2020, source.
- TikTok, TikTok Transparency Report, July 9, 2020, source.
- Marino, "How TikTok".