Who Gets Targeted—Or Excluded—By Ad Systems?

Targeted advertising. Original art by Paweł Kuczyński.

Separate from the algorithms that shape the spread of user-generated content, advertising is the other arena in which policymakers must dig deeper and examine the engines that determine and drive distribution. Traditional advertising places ads based on context: everyone who walks by a billboard or flips through a magazine sees the same ad. Online targeted advertising, by contrast, is personalized according to what advertisers and ad networks know (or think they know) about each person, drawn from the thick digital dossiers they compile about each of us.

In theory, advertisers on social media are responsible for determining which audience segments (as defined by their demographics or past behavior) will see a given ad, but in practice, platforms further optimize the audience beyond what the advertiser specifies.1 Because companies disclose so little about their systems, we know almost nothing about how this stage of further optimization works.

Create Audience.png
A screenshot of the settings to create a Custom Audience for an ad on Facebook, with targeting by gender, age, location, interests, and other attributes. Narrowing the audience this way could enable discriminatory practices (Facebook, Feb. 28, 2020).

Nevertheless, from what we do know, these systems are already designed to discriminate. When you place an ad on a given platform, you are encouraged to select, from a list of options, the types of people you would like the ad to reach. Such differentiation amounts to unfair and even illegal discrimination in many instances.2 One powerful example of these dynamics (and of companies’ reticence to make changes at the systemic level) emerged from an investigation by ProPublica, in which the media outlet tested Facebook’s ad-targeting tools by purchasing a few ads, selecting targeted recipients for the ads, and observing the process by which they were vetted and approved.3

“The ad we purchased was targeted to Facebook members who were house hunting and excluded anyone with an ‘affinity’ for African-American, Asian-American or Hispanic people,” the reporters wrote. They explained that Facebook’s ad-sales interface allowed them to tick different boxes, selecting who would—and would not—see their ads.4

Detailed Targeting.png
A screenshot from a 2016 ProPublica article. The upper panel allows advertisers to select specific audiences that they want to reach; the lower panel allows them to select those audiences that they want to exclude (ProPublica, Oct. 28, 2016).
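The include/exclude mechanism the reporters describe can be sketched as a simple audience filter. This is a hypothetical illustration only: the function, field names, and category labels are assumptions for clarity, not Facebook's actual code, and real ad-delivery systems are far more complex and not public.

```python
# Hypothetical sketch of include/exclude audience targeting.
# All names and categories are illustrative; real ad platforms'
# internals are not disclosed.

def select_audience(users, include_interests, exclude_affinities):
    """Return users who match any targeted interest and carry
    none of the excluded "affinity" labels."""
    audience = []
    for user in users:
        if not set(user["interests"]) & set(include_interests):
            continue  # matches no targeted interest
        if set(user["affinities"]) & set(exclude_affinities):
            continue  # carries an excluded "affinity" label
        audience.append(user)
    return audience

users = [
    {"name": "A", "interests": ["house hunting"], "affinities": []},
    {"name": "B", "interests": ["house hunting"],
     "affinities": ["African American (US)"]},
    {"name": "C", "interests": ["cooking"], "affinities": []},
]

# A housing ad targeted at house hunters that excludes an ethnic
# affinity group -- the pattern ProPublica showed would violate
# the Fair Housing Act when used for real-estate ads.
reached = select_audience(users, ["house hunting"], ["African American (US)"])
print([u["name"] for u in reached])  # ['A']
```

The point of the sketch is that exclusion is structurally trivial: once "affinity" labels exist as machine-readable attributes, filtering people out by them is a one-line set operation.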

The ads were approved 15 minutes after they were submitted for review. Civil liberties experts consulted by ProPublica confirmed that the option of excluding people with an “affinity” for African-American, Asian-American, or Hispanic people was a clear violation of the U.S. Fair Housing Act, which prohibits real estate entities from discriminating against prospective renters or buyers on the basis of their race, ethnicity, or other identity traits.

After ProPublica wrote about and confronted Facebook on the issue, the company added an "advertiser education" section to its ad portal, informing advertisers that such discrimination was illegal under U.S. law. It also began testing machine learning that would flag discriminatory ads for review. But the company preserved the key tool that had enabled the exclusion in the first place: the detailed targeting criteria, pictured above, which allowed advertisers to target or exclude African-Americans and Hispanics.


Rather than addressing systemic questions around the social and legal consequences of this type of targeting system, Facebook focused on superficial remedies that left its business model untouched. In another investigation, ProPublica learned that these criteria are in fact generated not by Facebook employees, but by technical processes that comb the data of Facebook’s billions of users and then establish targeting categories based on users’ stated interests. In short, by an algorithm.5

In spite of the company’s apparent recognition of the problem, Facebook did not take away or even modify these capabilities until many months later, after multiple groups took up the issue and filed a class action lawsuit claiming that the company had violated the Fair Housing Act.6

Here again, a company-wide policy for using and developing algorithms, combined with human rights due diligence, would most likely have identified these risks ahead of time and helped Facebook develop its ad-targeting systems in a way that respects free expression, privacy, and civil rights laws like the Fair Housing Act. But such inaction is the norm rather than the exception: neither Facebook, Google, nor Twitter shows evidence of conducting due diligence on its targeted advertising practices.7

Citations
  1. Ali, M., Sapiezynski, P., Bogen, M., Korolova, A., Mislove, A., & Rieke, A. (2019). Discrimination through Optimization: How Facebook's Ad Delivery Can Lead to Biased Outcomes. Proceedings of the ACM on Human-Computer Interaction, 3(CSCW), 1-30.
  2. Zuiderveen Borgesius, Frederik. 2018. Discrimination, Artificial Intelligence, and Algorithmic Decision-Making. Strasbourg: Council of Europe. source; Wachter, Sandra. 2019. Affinity Profiling and Discrimination by Association in Online Behavioural Advertising. Rochester, NY: Social Science Research Network. SSRN Scholarly Paper. source Forthcoming in Berkeley Technology Law Journal, Vol. 35, No. 2, 2020.
  3. Angwin, Julia, and Terry Parris Jr. 2016. “Facebook Lets Advertisers Exclude Users by Race.” ProPublica. source
  4. Angwin, Julia, and Terry Parris Jr. 2016. “Facebook Lets Advertisers Exclude Users by Race.” ProPublica. source
  5. Angwin, Julia, and Madeleine Varner. 2017. “Facebook Enabled Advertisers to Reach ‘Jew Haters.’” ProPublica. source
  6. In a March 2019 settlement, Facebook agreed to create a distinct advertising portal for housing, employment, and credit ads, as civil rights law prohibits discriminatory advertising in these areas. The company also committed to create a new “Housing Search Portal” allowing users to view all housing ads on the platform, regardless of whether the users are in the target audience selected by the advertiser.
  7. Ranking Digital Rights. 2020. The RDR Corporate Accountability Index: Transparency and Accountability Standards for Targeted Advertising and Algorithmic Systems – Pilot Study and Lessons Learned. Washington D.C.: New America. source