The Role of Policymakers
Over the past several years, the harms and discriminatory effects that can result from internet platforms’ use of algorithmic content-shaping systems have become more pronounced. Research has indicated time and time again that these negative consequences are disproportionately shouldered by marginalized and vulnerable communities, including racial minorities, women, and LGBTQ+ individuals. In response, policymakers around the world are considering how to provide greater oversight of these algorithmic systems, and whether and how these systems should be regulated.
In the United States, the First Amendment limits the extent to which the government can direct internet platforms to moderate content on their services. However, the government can enact greater transparency and accountability requirements for these platforms.1 Recently, members of Congress have introduced two bills, the Algorithmic Accountability Act and the PACT Act, to tackle these issues. Among other things, the Algorithmic Accountability Act would authorize the Federal Trade Commission (FTC) to issue regulations requiring internet platforms to conduct impact assessments of their algorithmic decision-making systems; evaluate these systems for bias, discrimination, privacy, fairness, and security; and refine their systems based on the results of these impact assessments.2 As previously discussed, the PACT Act includes numerous transparency reporting obligations for internet platforms, including for algorithmic curation processes3 such as content moderation and content ranking.4
The Honest Ads Act is another piece of legislation that has been introduced in an effort to address the need for greater transparency and accountability in the content-shaping space. Introduced shortly after the 2016 U.S. presidential election, the Act would require online platforms to provide more transparency around the scope and scale of their online political advertising operations.5
Although the U.S. federal government has not yet enacted any legislation specifically addressing algorithmic transparency and accountability, anti-discrimination statutes that pre-date the digital age can and should apply online to ensure greater fairness and accountability around these systems. Going forward, policymakers should also clarify that existing anti-discrimination statutes, including the Civil Rights Act of 1964, the Fair Housing Act, and the Age Discrimination in Employment Act, apply in the digital environment. Where necessary, Congress and state legislatures should enact appropriate legislation to fill gaps or clarify the applicability of such laws.6 Stakeholders should also strive to ensure that these conversations center longstanding concerns related to civil rights, discrimination, and bias, and use these opportunities to bring in and lift up the voices of individuals and communities who are disproportionately impacted by the harms of these systems.7
Further, in order to promote greater fairness, accountability, and transparency around these algorithmic systems, U.S. policymakers must pass comprehensive federal privacy legislation that reins in, and imposes guardrails on, internet platforms’ massive data collection practices.8 This legislation should, at a minimum, limit the types of data that can be collected and the purposes for which it may be used, protect civil rights, prevent unlawful discrimination, advance equal opportunity, and provide redress for privacy violations.9 As outlined above, extensive data collection practices are integral to fueling the creation, deployment, and refinement of algorithmic content-shaping systems, and they pose significant privacy risks to users. Today, we tend to view existing automated systems and their reliance on rampant data collection practices as a given. However, strong federal privacy legislation can change this and ensure greater rights for users.
In the European Union, policymakers are similarly considering how to encourage and require greater fairness, accountability, and transparency from platforms. For example, the EU Code of Practice on Disinformation, introduced in 2018, outlines self-regulatory standards that platforms can voluntarily sign on to in order to fight disinformation.10 The code of practice includes a number of transparency and accountability commitments, including issuing disclosures around political advertising and demonetizing accounts that spread disinformation.11 In addition, the previously discussed Digital Services Act (DSA) has emerged as a key component of the European Commission’s roadmap to rethink “Europe’s digital future”12 and revise the existing legal framework for intermediaries and their responsibilities related to user content and conduct. Although the transparency and accountability obligations under the DSA are still being deliberated, conversations have thus far centered on the need for greater transparency and accountability around content moderation, digital advertising, and the use of algorithms. These conversations have also included, among other proposals, recommendations to create an independent body that would oversee the implementation of these transparency requirements.13 The European Parliament is also considering legislation that would require companies to conduct human rights due diligence around their operations.14
Unlike the United States, the European Union has passed comprehensive privacy legislation in the form of the General Data Protection Regulation (GDPR). However, experts have noted that although the GDPR takes necessary steps to safeguard user privacy, it also creates barriers that prevent companies from sharing data with researchers.15 Going forward, policymakers in the EU, the United States, and beyond should work to improve data access mechanisms and policies for researchers and journalists, as this can fuel further analysis and advance fairness, accountability, and transparency efforts.
In addition, policymakers must ensure that any form of regulation or voluntary guidelines is rights-respecting, does not infringe on the freedom of expression or privacy rights of individuals, and does not undermine critical intermediary liability provisions.16 As policymakers weigh these considerations, civil society groups, civil rights organizations, and researchers should further collaborate to guide these conversations and ensure that any requirements or guidelines yield meaningful outcomes.
Citations
- Congress will need to ensure that any such rules simply require transparency about existing operations and do not amount to compelled speech, which would also violate the First Amendment. Congressional Research Service, Assessing Commercial Disclosure Requirements under the First Amendment, by Valerie C. Brannon, April 23, 2019, source
- Algorithmic Accountability Act of 2019, H.R. 2231, 116th Cong., 1st Sess. (as introduced, Apr. 10, 2019). source
- Platform Accountability and Consumer Transparency (PACT) Act, S. 4066, 116th Cong., 2d Sess. (as introduced, June 24, 2020).
- The PACT Act would also amend Section 230 of the Communications Decency Act. OTI opposes these changes and is concerned that the bill threatens to stifle free expression online and allow federal actors to police online content. Spandana Singh and Koustubh "K.J." Bagchi, "OTI Statement for the Record on the PACT Act and Section 230 Hearing," New America's Open Technology Institute, last modified July 28, 2020, source
- Honest Ads Act, S. 1989, 115th Cong., 1st Sess. (as introduced, Oct. 19, 2017). source
- Spandana Singh, Special Delivery: How Internet Platforms Use Artificial Intelligence to Target and Deliver Ads, February 18, 2020, source
- Roundtable discussion by Open Technology Institute, July 7, 2020
- Park, "How 'Notice,'" New America's Open Technology Institute (blog).
- New America's Open Technology Institute, "Principles for Privacy Legislation," news release, November 13, 2018, source
- "Code of Practice on Disinformation," European Commission, last modified September 26, 2018, source
- "Code of Practice," European Commission.
- European Commission, Shaping Europe's Digital Future, February 2020, source
- Singh, "Thinking Through," The GNI Blog.
- Markus Krajewski et al., Human Rights Due Diligence Legislation – Options for the EU, June 2020, source
- Roundtable discussion by Open Technology Institute, June 16, 2020
- Singh, "Thinking Through," The GNI Blog.