Table of Contents
- Executive Summary
- Introduction
- Targeted Advertising and COVID-19 Misinformation: A Toxic Combination
- Human Rights: Our Best Toolbox for Platform Accountability
- Making All Ads “Honest” Through Transparency, Limited Targeting, and Enforcement
- By Protecting Data, Federal Privacy Law Can Reduce Algorithmic Targeting and the Spread of Disinformation
- Good Content Governance Requires Good Corporate Governance
- Without Civil Society, Platform Accountability is a Pipe Dream
- Key Recommendations for Policymakers
- Conclusion
Key Recommendations for Policymakers
The recommendations below call on the U.S. Congress to take legislative action to pass a federal privacy law, update advertising regulations, mandate corporate disclosure and due diligence requirements, and institute governance reform. They build on research conducted for the RDR Corporate Accountability Index, as well as our experience working with advocacy groups and investors seeking to hold social media platforms accountable for their social impact. They are in addition to the recommendations for corporate transparency around content shaping and moderation published at the end of the first report in this two-part series.1
Enact federal privacy law that protects people from the harmful impact of targeted advertising. A comprehensive federal privacy law should encompass much more than the following recommendations, which focus on necessary rules to limit the reach of misinformation and dangerous content by limiting the power and precision of content-shaping and ad-targeting algorithms.
1. Designate an existing federal agency, or create a new agency, to enforce privacy and transparency requirements applicable to digital platforms. The agency should be given the necessary authority and funding to accomplish its mission.
2. Enact strong data-minimization and purpose-limitation provisions. Users should not be able to opt in to discriminatory advertising or to the collection of data that would enable it.
3. Give users clear control over the collection and sharing of user information (including inferred information) that is not otherwise prohibited and is not necessary to deliver and operate the service.
- Companies should disclose to users and to the relevant regulatory agency what user information they collect, share, and infer; for what purposes; and how long it is retained. This information should be independently audited. “User information” is any data that is connected to an identifiable person, or that may be connected to such a person by combining datasets or utilizing algorithmic data-processing techniques.
- Companies should not collect user information from third parties, or share user information with third parties, unless the companies in question have a vendor/contractor relationship and the sharing of this user information is disclosed and directly relevant and necessary for the purpose of delivering a service to the user. The “purpose of the service” does not include targeted advertising unless the service’s primary purpose is in fact clearly described as such by the company to the general public in its marketing and public communications.
- Companies should allow users to obtain all of their user information (collected and inferred) that the company holds, in a structured data format.
- Companies should delete all user information within a reasonable timeframe after users terminate their account or at the user’s request. This should be independently audited.
4. Restrict how companies are able to target users.
- Companies should not enable advertisers to use their services to target specific individuals by using personally identifying information.
- Companies should not enable advertisers to target users on the basis of any audience category or profile attribute without active user consent.
- Any ad targeting should only take place on the basis of information that is voluntarily disclosed by the user directly within the platform itself, otherwise known as first-party data. (For example, users might specify the language(s) in which they prefer to see ads, their broad geographic region, which sports teams they support, or self-select into broad audience categories including by subscribing to an advertiser’s updates.)
- Companies should ensure that the combination of the advertising content and the targeting category does not amount to discrimination on the basis of one or more protected classes recognized under civil rights law.
Require that platforms maintain a public ad database to ensure compliance with all privacy and civil rights laws when engaging in ad targeting. Transparency is a prerequisite for accountability. Congress should pass the Honest Ads Act, expand its scope to include all types of online advertisements (thus mandating a universal database of advertisements), and enable regulators and researchers to audit it.
1. Pass the Honest Ads Act: Platforms should be required to maintain a “public file” database with detailed information about the political ads they serve, similar to existing requirements for broadcast media, including a digital copy of the advertisement, a description of the audience the advertisement targets, the number of views generated, the dates and times of publication, the rates charged, and the contact information of the purchaser.
2. Expand the online advertising database to include all types of ads: Platforms should be required to expand the database that would be required by the Honest Ads Act to all online ads.
Require relevant disclosure and due diligence. Companies should be required to disclose information that demonstrates they are tracking the social impact of their targeted advertising and algorithmic systems, taking necessary steps to mitigate risk and prevent social harm.
1. Require targeted advertising revenue disclosure: All companies offering digital services and advertising should be required to disclose the percentage of revenue generated by targeted advertising.
2. Require ESG disclosures: Companies should be required to disclose non-financial information about their environmental, social, and governance (ESG) impacts, including information about the social impact of targeted advertising and algorithmic systems. Reporting should be formal, systematic, and comparable. (See Part 1 for more detailed disclosure recommendations related to algorithmic systems.)
3. Require due diligence: Companies should be required to conduct assessments of their social impact and risks, including human rights risks associated with targeted advertising and algorithmic systems.
Strengthen corporate governance and oversight. In line with the ESG and due diligence disclosures recommended above, Congress should require that Securities and Exchange Commission (SEC) rules empower shareholders to hold company leadership accountable for social impact.
1. Require companies to phase out dual-class share structures: Once a reasonable but not excessive period of time has passed after a company’s IPO, dual-class shares should be phased out.
2. Do not make it harder to file shareholder resolutions: The SEC should scrap proposed rule changes that will make it more difficult for shareholders to file proposals and to get them on proxy ballots.
The best way for companies to prepare for future regulation—and more important, to demonstrate maximum respect for users’ human rights—is to align their policies, practices, and disclosures with the indicators outlined in the RDR Corporate Accountability Index methodology.2
Citations
1. See the recommendations in Maréchal, Nathalie, and Ellery Roberts Biddle. 2020. It’s Not Just the Content, It’s the Business Model: Democracy’s Online Speech Challenge – A Report from Ranking Digital Rights. Washington, D.C.: New America. source.
2. The 2020 RDR Index methodology has been revised; the version we will use to evaluate companies in 2020 will be published in June. For the consultation draft of the new indicators, see: source. The indicators used for the 2019 RDR Index can be found at: source.