Table of Contents
- Executive Summary
- Introduction
- Targeted Advertising and COVID-19 Misinformation: A Toxic Combination
- Human Rights: Our Best Toolbox for Platform Accountability
- Making All Ads “Honest” Through Transparency, Limited Targeting, and Enforcement
- By Protecting Data, Federal Privacy Law Can Reduce Algorithmic Targeting and the Spread of Disinformation
- Good Content Governance Requires Good Corporate Governance
- Without Civil Society, Platform Accountability is a Pipe Dream
- Key Recommendations for U.S. Policymakers
- Conclusion
Executive Summary
Before the COVID-19 pandemic hit, democracies were already struggling to address disinformation, hate, extremism, and other dangerous online content while also protecting free speech and privacy. Now, Facebook, Twitter, and Google’s YouTube are awash with disinformation and misinformation that can be deadly. Despite the companies’ commitment to take unprecedented steps to control the problem, they are failing.
This report argues that Facebook, Twitter, and Google’s targeted advertising business models, and the opaque algorithmic systems that support them, are the root cause of their failure to staunch the flow of misinformation.
The second in a two-part series aimed at U.S. policymakers and anybody concerned with the question of how internet platforms should be regulated, this report reinforces the need to adopt a human rights framework for platform accountability. We propose concrete areas where Congress needs to act to mitigate the harms of misinformation and other dangerous speech without compromising free expression and privacy: transparency and accountability for online advertising, starting with political ads; federal privacy law; and corporate governance reform.
First, we point to concerning examples from the pandemic that highlight the connection between targeted advertising and misinformation about the coronavirus and its purported remedies. We explain how international human rights standards provide a framework for holding social media platforms accountable that complements existing U.S. law and can help lawmakers determine how best to regulate these companies without curtailing users’ rights.
Drawing on our five years of research for the Ranking Digital Rights (RDR) Corporate Accountability Index, we then point to concrete ways that the three social media giants have failed to respect users’ human rights as they deploy targeted advertising business models and algorithmic systems. We describe how the absence of data protection rules enables the unrestricted use of algorithms to make assumptions about users that determine what content they see and what advertising is targeted to them. We note that such targeting can result in discriminatory practices as well as the amplification of misinformation and harmful speech.
Next, we explain why expanding the transparency requirements that currently apply to print and broadcast political ads to all types of online advertising is a prerequisite for oversight and accountability, and how a robust federal privacy law can help fight misinformation and dangerous speech, while acknowledging enforcement challenges.
The final section makes a case for corporate governance reform. We explain how trends in environmental, social, and governance (ESG) investing are prompting companies to adopt due diligence and impact assessment standards, and can strengthen corporate governance over the long term. In light of investors’ growing interest in holding companies accountable for their social impact, we describe the role Congress can play in mandating corporate disclosure of information pertaining to the social and human rights impact of targeted advertising and algorithmic systems. We also recommend actions that the U.S. Securities and Exchange Commission (SEC) could take to empower shareholders. We again highlight enforcement challenges, while noting that strides made by European companies, if ignored, may portend a loss of global market share for American companies.
To conclude, we offer some thoughts about how civil society stakeholders, including researchers, journalists, and advocacy and grassroots organizations, are critical to addressing accountability gaps, especially in the absence of effective regulation and oversight. We also explain why companies must proactively engage with civil society as a part of their efforts to mitigate the negative social impacts of their business models.
Key Recommendations for U.S. Policymakers
Enact federal privacy law that protects people from the harmful impact of targeted advertising. Specifically, this law should:
- Ensure effective enforcement by designating an existing federal agency, or creating a new one, to enforce privacy and transparency requirements applicable to digital platforms.
- Include strong data-minimization and purpose-limitation provisions: Users should not be able to opt in to discriminatory advertising or to the collection of data that would enable it.
- Give users clear control over the collection and sharing of their information.
- Restrict how companies are able to target users. The law should prohibit the use of third-party data to target specific individuals, as well as discriminatory advertising that violates users’ civil rights.
Require that platforms maintain a public ad database to ensure compliance with all privacy and civil rights laws when engaging in ad targeting: Pass the Honest Ads Act, expand the public ad database to include all advertisements, and allow regulators and researchers to audit it.
Require relevant disclosure and due diligence around the social and human rights impact of targeted advertising and algorithmic systems.
- Mandate disclosure of targeted advertising revenue along with disclosure of environmental, social, and governance (ESG) information, including information relevant to the social impact of targeted advertising and algorithmic systems.
- Require due diligence: Companies should be required to conduct assessments of their social impact and risks, including human rights risks associated with targeted advertising and algorithmic systems.
Strengthen corporate governance and oversight. U.S. Securities and Exchange Commission (SEC) rules should empower shareholders to hold company leadership accountable for social impact.
- Phase out dual-class shares: Companies should not be able to maintain a dual-class share structure that effectively empowers the CEO to vote down shareholder resolutions in perpetuity.
- Do not make it harder to file shareholder resolutions: The SEC should scrap proposed rule changes that would make it more difficult for shareholders to file proposals.