Table of Contents
- Introduction
- The Growth of Today’s Digital Advertising Ecosystem
- The Role of Data in the Targeted Advertising Industry
- The Role of Automated Tools in Digital Advertising
- Concerns Regarding Digital Advertising Policies and Practices
- Case Study: Google
- Case Study: Facebook
- Case Study: LinkedIn
- Promoting Fairness, Accountability, and Transparency Around Ad Targeting and Delivery Practices
Case Study: Facebook
A General Overview of Facebook’s Advertising Platform
Facebook is the world’s largest social media platform, with approximately 1.59 billion daily active users as of June 2019.1 The platform ranks third in global internet engagement on Alexa rankings.2 Facebook is the second largest player in the United States and global digital advertising industries, after Google. Projections from earlier in 2019 estimated that the platform would have approximately 22 percent of the U.S. market share3 and generate $67.37 billion in net digital ad revenues in 2019.4 Advertising accounts for the majority of the company’s revenue. In 2018, the company generated $55 billion in ad revenues, a significant increase from 2017, when it generated almost $40 billion in ad revenues.5
One of the primary reasons Facebook has been able to grow its advertising operations so successfully is that it collects a vast amount of data on its users.6 Facebook collects this data using social plug-ins and cookies that track Facebook users whether or not they are logged into the platform.7 According to a 2016 report by Steven Englehardt and Arvind Narayanan, Facebook has placed trackers on approximately 30 percent of the top 10,000 sites on Alexa's top-sites list and 32 percent of the top 500 Android apps.8
Facebook's expansive advertising operations have yielded concerning results, however. In early 2018, it was revealed that British political consulting firm Cambridge Analytica had acquired the personal data of millions of Facebook users without their consent and used it for political advertising. Following the scandal, Facebook imposed some limits on data sharing with outside third parties. For example, in March 2018, the company asserted that it would shut down its Partner Categories program. This program had enabled advertisers to use data from third-party data brokers like Acxiom for ad campaign targeting. In exchange for this data, data brokers would receive a portion of the profit of an advertising sale.9 News outlets speculated that shutting down the Partner Categories program was a direct response to Cambridge Analytica's mishandling of user data, though Facebook never confirmed the explicit link.10
A Technical Overview of Facebook’s Advertising Platform
The data Facebook collects on users includes granular data on factors such as user likes, engagements, interests, and behaviors. In some cases, users can also specify their own interests.11 Because this data is so detailed, its use can result in concerning ad targeting and delivery practices on the platform, including user manipulation. For example, in 2018, The Intercept acquired a confidential internal Facebook document that outlined an internal advertising service enabling advertisers to target users based on predictions related to their behaviors, thoughts, and purchases. This artificial intelligence-powered prediction tool, known as "FBLearner Flow," was first publicly revealed in 2016. The document outlined how the platform could, for example, identify users who were "at risk" of switching to a competing brand, and then aggressively target them with advertising designed to prevent this shift. In addition, in 2017, The Guardian acquired a leaked internal presentation for advertisers claiming that the company could track when teens felt "insecure," "worthless," and "stressed," among other things. This information could be collected by monitoring posts, interactions, and photos, and could be used by advertisers to deliver ads to these users during "moments when young people need a confidence boost."12
Although Facebook asserts it does not sell or widely share the sensitive and precise data that it collects with outside parties, the company is able to monetize the data for advertising purposes. In particular, this data enables advertisers who use the Facebook advertising platform to target users based on very specific characteristics. These characteristics can be broken down into three broad categories:
- Demographics and attributes: Like many other advertising platforms, Facebook lets advertisers select and target audiences based on demographic factors. These include age, gender, location, profile information, activity on the Facebook platform, and data from third parties. Advertisers can also target users based on specific interests, which are inferred based on signals such as the Pages a user likes and engages with and the ads that a user clicks on. They can also target users based on behaviors such as how they connect to Facebook (e.g. through a browser or mobile device).13 Research suggests that Facebook offers over 1,000 well-defined attributes and hundreds of thousands of free-form attributes to advertisers.14
- Personal information: Advertisers can also specify particular individual users that they would like to target by uploading information on these users to Facebook along with some of their personal information (e.g. name, address, date of birth). Once this information has been uploaded, Facebook can use it to identify the corresponding users on the Facebook platform, and target them with ads. Advertisers can then also target users based on various attributes such as their political affiliation or whether they visited a company’s website.15 Audiences that are created using either of these mechanisms are known as Custom Audiences on the Facebook platform.16
- Similar users: In 2013, Facebook introduced the Lookalike Audiences feature. This feature lets advertisers target users who demonstrate similar characteristics to users they have previously chosen to target during advertising campaigns. The Lookalike Audiences tool identifies similarities between the users in an initial Custom Audience list that the advertiser has constructed. It then creates a new audience list that includes other users who share similar qualities. The Lookalike Audiences feature offers advertisers the opportunity to create granular and segmented target audience lists. For example, advertisers can upload a list of users' personal information in the original Custom Audience list and also use demographic and attribute factors to more precisely identify and target an audience.17 (A sketch combining these three targeting mechanisms follows this list.)
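To make these three mechanisms concrete, the sketch below assembles a hypothetical audience definition in Python. The structure and field names are illustrative assumptions for exposition only; they are not Facebook's actual Marketing API schema.

```python
# Hypothetical audience definition combining the three targeting
# mechanisms described above. Field names and values are illustrative
# assumptions, not Facebook's actual Marketing API schema.
audience_spec = {
    # 1. Demographics and attributes: broad filters plus inferred
    #    interests and behaviors (e.g., how a user connects to Facebook).
    "demographics": {
        "age_min": 25,
        "age_max": 45,
        "locations": {"countries": ["US"]},
    },
    "interests": ["home improvement", "gardening"],  # inferred from likes/clicks
    "behaviors": ["accesses_facebook_via_mobile"],
    # 2. Personal information: a Custom Audience built from customer data
    #    (e.g., names, addresses, dates of birth) that the advertiser
    #    uploads and Facebook matches to user accounts.
    "custom_audience_id": "CA-123456",
    # 3. Similar users: a Lookalike Audience derived from the Custom
    #    Audience above, expanded to users with similar characteristics.
    "lookalike": {
        "source_audience_id": "CA-123456",
        "similarity_threshold": "top_1_percent",
    },
}
```

In practice, an advertiser could layer these mechanisms to narrow delivery, for example seeding a Lookalike Audience with an uploaded customer list and then filtering the result by age and location.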
Advertisers can buy ads using a range of tools on Facebook, including the Ads Manager tool and Pages.18 When an advertiser wants to run an advertising campaign on Facebook's advertising platform, they are presented with a number of objectives (e.g., ad impressions or engagements) to choose from. Each of these objectives, known as an "optimization event," aims to optimize a campaign at a different stage of the marketing funnel.19 The optimization events available to advertisers include "awareness," in which a campaign is optimized for the most impressions or views; "consideration," in which a campaign is optimized for the most clicks or engagements; and "conversion," in which a campaign is optimized to generate a valuable outcome for an advertiser, such as sales.20 Once an advertiser selects an optimization event, they bid on the objective itself. This means that if an advertiser selected the "consideration" option, they would bid on the number of ad clicks or engagements. These bids and objectives are considered by the advertising platform if and when an advertiser's ad campaign is delivered.21
A bid can take many formats, but it typically includes information such as the start and end dates of an advertising campaign and an overall or daily budget cap for the campaign. Advertisers can either manually create a custom bid or opt for automatic bidding. When Facebook has ad slots available, it runs an automated ad auction among the active ads that are bidding for a specific user and the limited space on their newsfeed and the pages they view. Using the budget information provided by an advertiser, Facebook places bids in ad auctions on the advertiser's behalf. Advertisers can also set a per-bid cap,22 which limits the amount that Facebook's automated system will bid on their behalf for a single optimization event during the ad auction. However, ad placement is not based solely on the bids placed by an advertiser and how the ad fares in the auction process.23
The ad that eventually wins an auction and is shown to a user is the one that represents the highest total value. This value is derived based on more than just how much an advertiser is willing to bid to show their ad. It can be characterized by the following formula:24
Total value = advertiser bid x estimated action rate + user value
The platform also calculates an advertiser value score, using the following formula:
Advertiser value = advertiser bid x estimated action rate
In these formulas, the estimated action rate and user value are both calculated using machine learning models. The estimated action rate is the likelihood that a user will perform the action the advertiser set as their objective (e.g., a conversion). This is calculated by an algorithm that considers a user's behavior on Facebook, such as what types of ads and pages they click on and like. The algorithm also considers a user's off-Facebook activity, which is often provided by advertisers through Facebook business tools such as Pixel and SDK. The estimated action rate is also influenced by the content of the ad itself. If a user performs the desired action when engaging with an ad, the algorithm will use pattern recognition to identify commonalities between that user and other users, and reinforce the model. In this way, the automated tools are refined in near real-time based on the results of an auction.25
The user value is the machine learning model’s prediction of the quality of an ad, as opposed to the likelihood that a user will perform a certain action in relation to the ad. The quality of an ad is based on a range of data sources including feedback from users who view or hide ads, and Facebook’s assessments of low-quality characteristics in an ad, such as too much text in an image, sensationalized language, and engagement bait.26 According to the company, there is a significant amount of human review during this ad review process, as they seek to build models that are able to predict high-quality ads as accurately as possible.27
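Taken together, these two formulas can be illustrated with a toy auction. In the sketch below, all bids, estimated action rates, and user values are invented numbers; in production they would come from Facebook's proprietary machine learning models.

```python
# Toy ad auction: rank competing ads by total value, per the formulas above.
# All numbers are invented for illustration.
candidate_ads = [
    # (name, bid in dollars, estimated action rate, user value)
    ("shoe_ad",   2.00, 0.050, 0.02),
    ("bank_ad",   5.00, 0.010, 0.01),
    ("travel_ad", 1.50, 0.080, 0.05),
]

def advertiser_value(bid, est_action_rate):
    # Advertiser value = advertiser bid x estimated action rate
    return bid * est_action_rate

def total_value(bid, est_action_rate, user_value):
    # Total value = advertiser bid x estimated action rate + user value
    return advertiser_value(bid, est_action_rate) + user_value

winner = max(candidate_ads, key=lambda ad: total_value(ad[1], ad[2], ad[3]))
print(winner[0])
# -> travel_ad (total value 0.17): despite the lowest bid, its predicted
#    relevance to the user outweighs bank_ad's much higher bid (0.06).
```

Note that the highest bidder does not necessarily win: an ad with a lower bid but a higher predicted action rate and user value can take the slot.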
In addition, factors such as ad relevance are considered when delivering ads. On Facebook, ad relevance can be measured by assessing an ad’s quality and an ad’s estimated action rate.28 Each ad is assigned a relevance score between 1 and 10 on the platform, with 10 being the highest. Advertisers can view this score using Facebook’s ad reporting tools. The relevance score is constantly updated and is based on the positive and negative feedback Facebook expects an ad to receive from its target audience. The more predicted positive interactions, such as video views, conversions, etc., the higher the ad’s relevance score will be. The more predicted negative interactions, such as hiding or reporting an ad, the lower its relevance score will be. Ads that are guaranteed delivery, such as those bought through reach and frequency,29 are not affected by relevance score calculations. Relevance score calculations also affect the cost and delivery of brand awareness campaigns to a lesser degree, as these ads are optimized to reach users, rather than drive a specific action like purchases or app downloads.30 Generally, the higher an ad’s relevance, the less it will cost to be delivered.31
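Facebook has not published the formula behind the relevance score, so the sketch below is purely a toy illustration of the inputs described above: predicted positive feedback raises the score and predicted negative feedback lowers it, on a 1-10 scale. The scaling is an invented assumption.

```python
# Toy relevance score: an invented mapping from predicted positive and
# negative feedback to a 1-10 scale. Facebook's actual calculation is
# proprietary and not public.
def relevance_score(p_positive: float, p_negative: float) -> int:
    """p_positive: predicted rate of positive interactions (video views,
    conversions, etc.); p_negative: predicted rate of negative interactions
    (hiding or reporting the ad). Both are assumed to lie in [0, 1]."""
    net = p_positive - p_negative        # net predicted feedback in [-1, 1]
    scaled = (net + 1) / 2               # map onto [0, 1]
    return max(1, min(10, round(1 + scaled * 9)))

print(relevance_score(p_positive=0.60, p_negative=0.05))  # -> 8
print(relevance_score(p_positive=0.10, p_negative=0.40))  # -> 4
```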
On the Facebook platform, users can see ads through paid or organic distribution. When an ad is delivered through paid distribution, it will be marked as “Sponsored.” When an ad is delivered based on organic or non-paid distribution (e.g. because a user’s friend shared the ad or because a user visited a company’s Page), the ad is not marked as “Sponsored.” An ad can continue to circulate on the Facebook platform through organic distribution even after its paid distribution budget has been depleted.32
Facebook offers its advertisers a range of insights and data on how well their advertising campaigns are performing, including a detailed interface and a dedicated application programming interface (API) for launching and monitoring ads. An API is a defined programmatic interface that allows two pieces of software to exchange data. Both the interface and the API provide semi-live updates on campaign delivery, including data on the number of impressions and optimization events occurring as the ad is delivered. Advertisers can request more granular data on their ads, including performance insights broken down by characteristics such as age, gender, and location (by Designated Market Area, or DMA, region). Importantly, the interface and API do not break down this data by race, as this could lead to racial discrimination.33
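As an illustration of the kind of breakdowns described above, the sketch below requests campaign insights in the style of Facebook's Graph API. The API version, access token, and account ID are placeholders, and the exact endpoint and parameter names should be verified against Facebook's current Marketing API documentation.

```python
import requests

# Sketch of an ad-insights request in the style of Facebook's Graph API.
# The token, account ID, and version below are placeholders; verify the
# endpoint and parameters against current Marketing API documentation.
ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"
AD_ACCOUNT_ID = "act_1234567890"

response = requests.get(
    f"https://graph.facebook.com/v5.0/{AD_ACCOUNT_ID}/insights",
    params={
        "fields": "impressions,clicks,spend",
        "breakdowns": "age,gender",  # race is notably not an available breakdown
        "date_preset": "last_7d",
        "access_token": ACCESS_TOKEN,
    },
)
for row in response.json().get("data", []):
    print(row.get("age"), row.get("gender"), row.get("impressions"))
```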
Although Facebook provides advertisers with a range of insights on their campaign’s performance, the reliability of these metrics has come into question. For example, a study found that a large number of ads that brands were paying for were being viewed by “bots,” or computer programs that imitate the behavior of internet users. In response, Facebook conceded that it had made measurement errors related to engagement metrics.34 Although researchers were able to uncover this error, Facebook generally provides little transparency around how the company calculates such metrics. Without greater transparency around what they are measuring in the first place, it is difficult for the public to hold the company to account when mistakes like this happen.
Facebook has also created tools that help advertisers create and deliver ads. For example, in May 2019, Facebook launched the Automated Ads platform, a tool designed for small and medium businesses that may not have large advertising budgets or technical experience. The tool walks advertisers through a set of questions about their company and the goals of their advertising campaign. Based on this information and the details on the business's Facebook Page, the tool recommends call-to-action buttons (e.g., shop now, book an appointment, donate now), advertising content such as text, and details of the ad creative. The tool also automatically generates up to six different versions of an ad using these recommendations, which an advertiser can then run across Facebook and its other services: Instagram, Messenger, and Audience Network. Audience Network is an off-Facebook, in-app advertising network that enables advertisers to deliver ads to users of other mobile sites and apps. The tool also offers advertisers ad targeting recommendations and budget guidance based on the stated goals of their advertising campaign. If an advertiser opts to set their own budget for the campaign, the tool estimates results. Once an ad campaign is live, the Automated Ads platform optimizes the campaign by delivering the "best performing" ads going forward. Advertisers using this platform similarly receive reports and insights on how their campaign is performing and how they can improve results.35
Controversies Related to Facebook’s Advertising Platform
As previously highlighted, targeted advertising is founded on discrimination, as it enables advertisers to delineate which specific categories of users they want to target during campaigns. Depending on the characteristics being used to distinguish among users, this discrimination can at times be harmless, such as distinguishing between users interested in baseball gear and users interested in football gear. In addition, there can be legitimate instances of targeting users based on characteristics such as race and gender, for example, targeting students who attend Historically Black Colleges and Universities, or members of professional societies for women, to advertise events focused on benefitting these groups. However, this type of targeting can also be harmful, even though these harms are often difficult to detect.36
The current structure of the Facebook advertising platform has enabled problematic and discriminatory outcomes to result from both the targeting and delivery phases. For example, in 2016, ProPublica revealed that they had been able to run housing-related ads on Facebook that explicitly prevented certain categories of users from receiving them, based on their “ethnic affinity.”37 In November 2017, ProPublica further demonstrated that they had been able to buy numerous home-rental ads on Facebook and target the audience so that they specifically excluded “African Americans, mothers of high school kids, people interested in wheelchair ramps, Jews, expats from Argentina and Spanish speakers.”38
In 2017, ProPublica also revealed that Facebook's automated targeting category generator was producing antisemitic categories of users based on topics users had previously expressed interest in. These included phrases such as "How to burn Jews" and "Jew hater." Although these categories were so small that the platform would not have enabled advertisers to run an ad campaign on this basis alone, this case demonstrated how automated tools can suggest categories of users for advertisers to target in ways that reflect and exacerbate overt, harmful societal biases.39
In December 2017, ProPublica and the New York Times reported that Facebook also permitted numerous major brands and companies to run recruitment advertisements targeted at specific age groups, such as those ages 25 to 36. At the time, Facebook pushed back against this criticism, asserting that age-based targeting in advertising was "an accepted industry practice"40 that enables employers to recruit candidates of all ages.41 However, under the Age Discrimination in Employment Act (ADEA), it is illegal to discriminate in employment against people over the age of 40.42
Facebook has also been the subject of a series of formal complaints and lawsuits alleging that its advertising practices can result in illegal housing and employment discrimination. In March 2018, the National Fair Housing Alliance (NFHA) and three of its member organizations filed a lawsuit against Facebook, claiming that its advertising platform can allow housing providers to engage in illegal housing discrimination as it allows advertisers to segment which audiences see housing-related ads based on characteristics such as race, religion, sex, and disability.43 In August 2018, HUD filed a formal complaint against Facebook based on similar grounds.44
In March 2019, Facebook agreed to pay approximately $5 million to settle numerous lawsuits that claimed the company's advertising platform enabled discrimination in housing, employment, and credit ads. Shortly after the settlement, HUD formally charged Facebook with violating the Fair Housing Act. HUD Secretary Ben Carson released a statement noting that "using a computer to limit a person's housing choices can be just as discriminatory as slamming a door in someone's face."45 Similarly, in September 2018, the American Civil Liberties Union (ACLU), Outten & Golden LLP, and the Communications Workers of America filed a class action employment discrimination suit alleging that Facebook permitted employers to target ads based on categories including race, national origin, age, and gender.46 The Fair Housing Act, enacted as part of the Civil Rights Act of 1968 and since amended, protects individuals from discrimination in housing based on race, color, disability, religion, sex, familial status, or national origin.47
In October 2019, 54-year-old Neuhtah Opiotennione filed a class-action lawsuit in a federal district court in San Francisco. The suit alleged that Facebook's ad platform permitted financial services advertisers to target audiences based on their age and gender, thus preventing users across the United States who are female or over the age of 40 from accessing financial services products such as loans, insurance coverage, and bank accounts. The lawsuit alleges that this violates California's Unruh Civil Rights Act, which states that all people in the state, regardless of sex, age, race, medical condition, or other characteristics, are entitled to "full and equal accommodations, advantages, facilities, privileges, or services in all business establishments of every kind whatsoever." The suit also alleges that Facebook has taken no steps to prevent financial services advertisers from excluding users based on their age or gender, except in the case of credit opportunities.48 Although we have not assessed the merits of these various lawsuits, the vast array of discrimination cases brought against Facebook demonstrates the importance of ensuring that anti-discrimination statutes cover discrimination in the online ecosystem, including online advertising. This is an important area for policymakers and civil society to focus on in the future.49
Whereas the ad targeting phase involves input from both the advertiser and the advertising platform, the ad delivery phase largely falls under the control of the advertising platform. On Facebook, the use of algorithmic decision-making in the ad delivery phase has yielded especially problematic and discriminatory outcomes. In an April 2019 study on how housing and employment ads were delivered to a group of target users on Facebook, researchers from universities and a digital rights organization found that Facebook's ad delivery algorithm was skewed to deliver these ads to specific gender and racial groups. According to the researchers, when they ran five ads for jobs in the lumber industry and tried to deliver them to a large and inclusive audience, the ad delivery algorithm instead delivered them to an audience that was, in aggregate, over 90 percent male and over 70 percent white. In addition, when the researchers ran five ads for janitor employment opportunities, the ad delivery algorithm delivered them to an audience that was, in aggregate, over 65 percent female and over 75 percent black. Similarly, when the researchers ran an advertising campaign related to houses for sale and targeted the campaign to a broad audience, they found that the ad delivery algorithm delivered the ads to an audience that was 75 percent white, while rental ads were delivered to a more balanced group of users.50
These skews in ad delivery occurred because Facebook's ad delivery algorithm made inferences about which categories of users would be more likely to engage with the ads, based on the data it had about engagements and impressions thus far. It made these skewed deliveries despite the fact that the advertisers had specified they wanted to target a large, inclusive group of users.51 In fact, the researchers found that even when they chose to target an underrepresented or marginalized group, the ad delivery system would still not deliver those ads to this category of users, and in some cases, it would not deliver the ads at all. This is because Facebook's ad delivery system is geared to optimize ad delivery by delivering ads to users whom the algorithm has identified as the most likely to engage with them, regardless of the advertiser's intentions and desires.52 Which ads a user engages with, or doesn't, can reflect their own personal biases and experiences, including those related to their race, age, and so on. When an algorithm identifies patterns in such engagements, it will seek to optimize ad campaigns based on these trends in order to drive further return on investment.
This means that in some cases, algorithms can perpetuate biases based on inferences made from historical data, which could suggest, for example, that women don't want to be lumberjacks because there are few women in the profession. In doing so, these algorithms can exacerbate societal inequities and create barriers to equal outcomes and opportunities by excluding groups from the chance to engage with such ads.53 This prevents potentially new patterns in engagement data from being created; as a result, the algorithm can further skew the data against the underrepresented group and draw false inferences from the missing data.
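This feedback loop can be made concrete with a deterministic toy simulation. In the sketch below, two groups of users have the same true click rate, but a slightly skewed batch of historical data leads a click-optimizing delivery rule to show the ad exclusively to one group, so the other group's data never improves. All numbers are invented for illustration.

```python
# Toy delivery loop: optimizing on *observed* engagement compounds a
# small initial skew. Both groups have the same true click rate; only
# the historical data differs. All numbers are invented.
true_click_rate = {"A": 0.05, "B": 0.05}
clicks = {"A": 6.0, "B": 4.0}            # slightly skewed historical data
impressions = {"A": 100.0, "B": 100.0}

for _ in range(10_000):
    # Deliver each impression to whichever group currently shows the
    # higher observed click rate.
    group = max(clicks, key=lambda g: clicks[g] / impressions[g])
    impressions[group] += 1
    clicks[group] += true_click_rate[group]  # expected clicks (deterministic)

print({g: int(n) for g, n in impressions.items()})
# -> {'A': 10100, 'B': 100}: group B is never shown the ad, so the system
#    never learns that its users would have clicked at the same rate.
```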
The researchers also concluded that the skew in ad delivery occurred due to market and financial optimization effects. According to their findings, the advertiser's budget and the relevance of an ad's creative (e.g., headline, image) to a specific gender or race of users also affected how an ad would be delivered.54 When the researchers ran ads that only included links to sites such as bodybuilding.com, which is typically targeted toward male users, or elle.com, which is typically targeted toward female users, the ads were delivered relatively equally to users of both genders. However, when the researchers added images of a weightlifting man or of makeup brushes, ad delivery was skewed so that the bodybuilding ad's distribution was 75 percent male and the cosmetics ad's distribution was over 90 percent female.55 While this is a highly specific example, it showcases how images can amplify the degree to which automated tools skew ad delivery toward specific user attributes.
Advertising systems work as designed, but that does not always generate the results one would expect. Platforms such as Facebook make money when people click on ads. But an individual's tendency to click on certain types of ads (and not others) often reflects deep-seated social inequities: the neighborhood they live in, where they went to school, how much money they have, and so on. An ad system that is designed to maximize clicks and profits for Facebook can therefore reinforce these social inequities and serve as a barrier to equal opportunity.56
In response to many of these criticisms, Facebook has taken steps, though limited, toward providing more meaningful transparency and accountability around its ad targeting and delivery practices.
As previously mentioned, in March 2019, the company agreed to settle numerous lawsuits that claimed the company’s advertising platform enabled discrimination in housing, employment, and credit ads. Under the terms of the settlement, Facebook will no longer allow landlords, employers, creditors, and similar advertisers to target or exclude users based on age, gender, race, zip code, religion, and other sensitive targeting characteristics.57 The company also agreed to study the potential for algorithmic bias in ads58 and to consult with the ACLU and the other plaintiffs in the March 2019 settlement before adding future targeting categories.59
Further, after ProPublica revealed that Facebook's automated tools had recommended antisemitic targeting categories to certain advertisers, Facebook introduced "new guardrails" to prevent this in the future.60 Prior to this settlement, Facebook's policy simply required advertisers to review and self-certify that they would comply with Facebook's non-discrimination policy for ads.61 When advertisers use the Ads Manager tool, they see a continuous, non-dismissible prompt that reminds them about the platform's non-discrimination policy before they begin using targeting tools.62 Both the prompt and the policy have been in place since before the settlement. In 2018, in response to claims that Facebook's ad targeting options enabled advertisers to discriminate against certain categories of users, Facebook removed thousands of targeting options from its advertising platform.63
Following its settlement with the Fair Housing Groups and the ACLU’s coalition, Facebook also introduced new features to attempt to remedy targeting-related discrimination issues in the future, including a separate advertising portal for housing, employment, and credit ads that offers less precise targeting options.64 Limiting the targeting options for these categories of ads is intended to minimize the likelihood of harmful discrimination in these contexts. This new system requires advertisers to flag if their ad involves housing, employment, or credit opportunities. If it does, then the advertiser will be routed to this new system. According to representatives from the company, Facebook uses machine learning tools in the background to ensure that advertisers do not intentionally or accidentally bypass this system by failing to clearly delineate their ads as falling under these categories. In addition, the company uses human review to ensure that these machine learning systems are improving, and are able to identify repeat offenders.65
However, research has demonstrated that automated tools are not always effective at detecting certain forms of content. This is especially true in cases where the definitions for the categories of content are vague, or when determinations of which category a piece of content should fall under depend on subjective and complex forms of speech or text.66 In addition, it is still unclear how Facebook plans to comprehensively enforce this policy.67 According to representatives from the company, the platform offers an appeals process to advertisers who believe their ads were mislabeled as a housing, employment, or credit ad by the company's machine learning systems or human reviewers. However, little is known about the scope, scale, and impact of this appeals process.68
Further, despite these changes, researchers have found that even if an advertiser uses the new special ads portal, the composition of audiences can still be skewed toward specific demographic groups, thus yielding potentially discriminatory outcomes. The researchers found that this was because Facebook’s modified algorithm relies on proxy characteristics that correlate with factors such as age and gender. As a result, they found that removing a select number of protected features has little impact on the presence of overall bias.69
Additionally, the Lookalike Audiences feature has revolutionized how advertisers can target users on the Facebook platform, yet there is little transparency around how these audience lists are developed. Facebook has not disclosed which factors its algorithms use to determine which users should be included in a Lookalike Audience, in part because Facebook does not have access to the characteristics of the initial audience that an advertiser has and wants to duplicate. In addition, the advertiser is not given the identities of the users who end up on the Lookalike Audience list. However, independent experiments have revealed that Facebook's Lookalike Audience algorithm can replicate the racial affinities and political affiliations of the initial audience in the Lookalike Audience.70
Steps Facebook Has Taken to Adjust Its Advertising Platform Since the 2016 U.S. Presidential Elections
As highlighted, Facebook's advertising platform has accrued significant value over the past few years, enabling advertisers of all sizes to target, influence, and engage with users based on precise characteristics. It has also produced numerous controversial and harmful impacts, many of which became evident after the 2016 U.S. presidential elections, when it was revealed that Russian operatives had used the Facebook advertising platform to influence voting behaviors among users, including by discouraging African Americans from voting.71 One of the reasons such manipulation was able to take place is the fundamental lack of transparency and accountability guidelines around Facebook's advertising platform. As a result, Russian operatives were able to run influential advertisements on the platform without proper accountability, and users engaged with the ads without knowing the identity or objectives of those behind the campaigns.
Since then, the platform has come under heavy scrutiny and pressure to provide greater transparency around its ad operations and to develop stricter rules governing who can buy ads, especially political ads.72 In response to these pressures, Facebook introduced a range of new advertising transparency mechanisms. These include an ad database, dubbed the "Ad Library," that aims to house all of the issue, electoral, or political ads on the platform. However, this transparency tool is not user friendly, and as a result, searching, sorting through, and evaluating data in the database is challenging.73 In this way, the tool fails to serve as a meaningful and useful transparency and accountability mechanism. In January 2020, the company made some improvements to the Ad Library's usability, enabling search by exact phrase, introducing new search filters, and better grouping similar ads.74
In addition, the database provides limited insight into what ads are run on the platform. Currently, the Ad Library enables users to search for political ads based on a name, topic, or organization. When a user enters a search term, they can view all related ads and filter them based on factors such as whether they are active or inactive; the impressions ads have received within the last day, 7 days, 30 days, and 90 days; relevant Pages; any disclaimers; and on which Facebook platform or product the ad was run. When a user clicks on a specific ad, they can view information such as an approximate number of impressions the ad received, an approximate figure for how much money was spent on the ad, a percentage breakdown of the genders and ages of those who viewed the ad, and a map indicating where in the United States the ad was shown, including a breakdown by percentage. Users can also see when an advertiser’s page was created, if its name has been changed, the primary country location for people who manage the Page, and recent and total figures on how much the advertiser has spent on ads about social issues, elections, or politics.75
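The Ad Library can also be queried programmatically through Facebook's Ad Library API. The sketch below shows roughly what a search for political ads might look like; the parameter names follow Facebook's public documentation for the ads_archive endpoint around this period, but the token and version are placeholders, and the details should be checked against current documentation.

```python
import requests

# Sketch of an Ad Library API search using Facebook's ads_archive
# endpoint. The token and version are placeholders; verify parameter
# names against Facebook's current Ad Library API documentation.
ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"

response = requests.get(
    "https://graph.facebook.com/v5.0/ads_archive",
    params={
        "search_terms": "election",
        "ad_type": "POLITICAL_AND_ISSUE_ADS",
        "ad_reached_countries": "['US']",
        "ad_active_status": "ALL",
        "fields": "page_name,funding_entity,spend,impressions,"
                  "demographic_distribution,region_distribution",
        "access_token": ACCESS_TOKEN,
    },
)
for ad in response.json().get("data", []):
    # Spend and impressions are reported as coarse ranges, not exact values.
    print(ad.get("page_name"), ad.get("spend"), ad.get("impressions"))
```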
In January 2020, the platform expanded the metrics offered in the Ad Library to include Potential Reach,76 which highlights the estimated target audience size Facebook calculated for an advertiser, based on the targeting and ad placement options they selected.77
Although Facebook has expanded the metrics reported in the Ad Library, the Ad Library's effectiveness as a tool is limited, as it does not disclose vital metadata that is necessary for understanding the larger ad ecosystem and the impact of ads run on the platform. For example, the database does not provide granular information on the reach and engagement of an ad (e.g., how many likes, shares, and video views an ad received),78 how an ad was targeted, or granular information on the size and attributes of the audience that an ad was delivered to. This information is integral to identifying practices such as discrimination in advertising, as oftentimes discrimination can only be identified by comparing the content of an ad with its targeting parameters.79 The company asserted that it cannot share this information publicly due to privacy concerns.80 However, many researchers have outlined effective safeguards that could be implemented to ensure these disclosures are made responsibly.81
Further, although the company has stated it aims to create a comprehensive database of political ads run on the platform, it fails to recognize the immense challenges and complexities involved in this process. The definition of a political ad is not clear, and the boundaries used to enforce these definitions are vague, often subjective, and differ in each country and region in which the platform operates. Neither algorithmic nor human decision-making is likely to be able to make these determinations reliably and consistently, and as a result many political ads will be left out of the database. In addition, political statements and beliefs can easily be circulated on the platform through non-paid methods that rely on organic engagement to accrue virality.82
Additionally, although a database of political ads is a theoretically positive step toward providing more transparency and accountability around Facebook’s advertising operations, its focus on political ads means that the public does not have insight into the broader landscape of potentially problematic ads on the platform, such as the previously discussed employment and credit ads.
To address this, Facebook has taken steps toward expanding its Ad Library to include other categories of ads. For example, in December 2019, Facebook expanded its Ad Library to include housing ads run in the United States. The Ad Library enables users to search for these ads by advertiser and by location, regardless of whether the user is part of the target audience of the ad campaign.83 This means that users can affirmatively search for and see all housing opportunities, even if they were not part of the demographic group targeted by an advertiser.
However, the data offered on housing ads is far less granular than the data offered on issue, electoral, and political ads. Currently, a user can filter ads using the same time-bound impression filters available for issue, electoral, and political ads, and by which Facebook product or platform the ad was run on. In addition, when a user clicks on an ad they are able to see the content of the ad and a brief description of the advertiser. They are unable, however, to see information on impressions, reach, and ad spend,84 which is available on issue, electoral, and political ads. The company has stated it will develop similar tools for employment and credit opportunities in 2020.85
In addition, according to U.K.-based nonprofit Privacy International, Facebook's efforts to provide greater transparency and accountability around its advertising platform have largely been focused on certain countries. The organization found that the company provides detailed information related to political advertising in only 35 countries and regions, such as the United States, the European Union, and India.86 Yet Facebook is the primary method used by political groups to target and influence voters in approximately 80 percent of the world, demonstrating a lack of meaningful transparency and accountability toward all of Facebook's users.87
Further, Facebook’s transparency efforts do not involve the disclosure of any data on how the platform enforces its ad content and targeting policies, and how many ads and accounts are subsequently removed or suspended. Publishing such information would provide a vital transparency and accountability mechanism because it would highlight how, and if, a platform is enforcing its advertising-related policies.
Beyond these deficiencies in transparency, Facebook has also raised serious concerns through its October 2019 decision to exempt political ads from its fact-checking process and rules. According to Mark Zuckerberg, Facebook introduced this change based on the notion that private companies should not be able to censor politicians or the news. The company has maintained this stance despite pressure from policymakers, civil society, and the public.88 However, researchers like Siva Vaidhyanathan have asserted that the policy change was introduced because Facebook is unable to enforce its ad fact-checking policies globally for political ads and because political advertising is a valuable source of revenue and online influence around the world.89
Given the impact that political advertising can have on the mindsets and behaviors of users, this policy change by Facebook is concerning.90 Many civil society organizations and researchers have called for Facebook to follow platforms such as Twitter and ban political ads altogether. However, some experts have suggested that in doing so, Facebook would put first-time political candidates at a disadvantage, as the internet has been found to create a more level playing field for challengers than television.91 Like Google, Facebook has also faced calls urging it to instead prohibit political ad microtargeting.92 However, the company has since disclosed that it will not be making changes to its microtargeting policy.93
User Controls on Facebook’s Advertising Platform
Facebook has also introduced a limited set of controls and features that enable users to understand and control how ads are targeted and delivered to them. In 2014, Facebook introduced the "Why am I seeing this ad?" tool, which lets users see how factors such as their demographic information, interests, and website visits can be used to deliver specific ads to them in their news feeds. The tool also provides additional details about the ads a user sees when Facebook finds that the information on an advertiser's list of target users matches information on the user's Facebook profile, including when the advertiser uploaded the information and whether the advertiser worked with a marketing partner to deliver the campaign.94
Users also have the option to adjust the ads that they see by adjusting their ad preferences.95 Users can change preferences such as their interests; what Facebook-related information the platform uses to deliver ads to them; and whether Facebook should be able to deliver ads to them based on data from partners, activity on Facebook Company Products that a user sees elsewhere, and their social actions. Users can also choose to hide ads from a certain advertiser.96 In addition, in January 2020, the company announced that it would introduce further controls this year, including allowing users to choose whether they want to stop seeing ads from an advertiser who has created a Custom Audience from a customer list, or to make themselves eligible to see ads if an advertiser used a list to exclude them. The company also stated it will introduce a feature that will enable users to choose whether they would like to see fewer social and political ads on Facebook and Instagram.97
Although this new set of controls is a positive step forward, an opt-out mechanism, particularly one that requires users to understand Facebook's ad platform, puts the onus on users to protect their own privacy. Similarly, requiring users to opt in to receiving ads they may have been excluded from also puts the onus on them to ensure their platform experience is equitable. These barriers are compounded because the controls have been found to be difficult for the average user to understand and navigate. There are similar deficiencies with Facebook's new options for data portability. Although Facebook now offers users the option to download a vast range of the data the company has collected on them, including information on their likes, posts, and searches,98 this information is incomplete. For example, it does not include the data that advertisers and Facebook have collected and used for ad targeting and delivery.99
Citations
- Dan Noyes, "The Top 20 Valuable Facebook Statistics – Updated July 2019," Zephoria Digital Marketing, source
- "Facebook.com Competitive Analysis, Marketing Mix and Traffic," Alexa Internet, source
- eMarketer Editors, "US Digital," eMarketer.
- Enberg, "Global Digital," eMarketer.
- J. Clement, "Facebook's Advertising Revenue Worldwide From 2009 to 2018 (In Million U.S. Dollars)," Statista, last modified February 4, 2019, source
- Zuboff, The Age of Surveillance.
- Jack Morse, "Mark Zuckerberg Doesn't Want To Talk About Tracking Users Who've Logged Out Of Facebook," Mashable, April 10, 2018, source
- Englehardt and Narayanan, Online Tracking.
- Taylor Hatmaker, "Facebook Will Cut Off Access To Third Party Data For Ad Targeting," TechCrunch, March 28, 2018, source
- Josh Horwitz, "After Cambridge Analytica, Some Targeted Ads On Facebook Will Become Less Targeted," Quartz, March 29, 2018, source
- Julia Carrie Wong, "Revealed: Facebook Enables Ads To Target Users Interested In 'Vaccine Controversies,'" The Guardian, February 15, 2019, source
- Sam Levin, "Facebook Told Advertisers It Can Identify Teens Feeling 'Insecure' And 'Worthless,'" The Guardian, May 1, 2017, source
- Private meeting with Facebook representative, January 10, 2020.
- Ali et al., Discrimination Through.
- Aaron Rieke and Miranda Bogen, Leveling the Platform: Real Transparency for Paid Messages on Facebook, May 2018, source
- Ali et al., Discrimination Through.
- Ali et al., Discrimination Through.
- Private meeting with Facebook representative, January 10, 2020.
- A commonly used visualization in the marketing and sales industries for understanding the process of converting potential leads into customers.
- Private meeting with Facebook representative, January 10, 2020.
- Ali et al., Discrimination Through.
- Ali et al., Discrimination Through.
- Ali et al., Discrimination Through.
- Private meeting with Facebook representative, January 10, 2020.
- Private meeting with Facebook representative, January 10, 2020.
- Facebook, "About Ad Auctions," Ads Help Center, source
- Private meeting with Facebook representative, January 10, 2020.
- Facebook, "About Ad Auctions," Ads Help Center.
- Reach is the number of people that are exposed to an advertiser’s message. Frequency is the number of times an advertiser touches each person with their message.
- Facebook for Business, "Showing Relevance Scores for Ads on Facebook," Facebook for Business, last modified February 11, 2015, source
- Ali et al., Discrimination Through.
- Rieke and Bogen, Leveling the Platform.
- Ali et al., Discrimination Through.
- Sweney and Hern, "Google Ad Controversy."
- Amy Gesenhues, "Facebook Gives Small Businesses New Advertising And Engagement Tools," Marketing Land, last modified May 7, 2019, source
- Gilliard, "Friction-Free Racism."
- Julia Angwin and Terry Parris Jr., "Facebook Lets Advertisers Exclude Users by Race," ProPublica, October 28, 2016, source
- Julia Angwin, Ariana Tobin, and Madeleine Varner, "Facebook (Still) Letting Housing Advertisers Exclude Users by Race," ProPublica, November 21, 2017, source
- Julia Angwin, Madeleine Varner, and Ariana Tobin, "Facebook Enabled Advertisers to Reach 'Jew Haters,'" ProPublica, September 14, 2017, source
- Kaya Yurieff, "HUD Charges Facebook With Housing Discrimination In Ads," CNN Business, March 28, 2019, source
- Julia Angwin, Noam Scheiber, and Ariana Tobin, "Facebook Job Ads Raise Concerns About Age Discrimination," The New York Times, December 20, 2017, source
- U.S. Equal Employment Opportunity Commission, "Age Discrimination," U.S. Equal Employment Opportunity Commission, source
- NFHA Staff, "Fair Housing Groups Settle Lawsuit with Facebook: Transforms Facebook's Ad Platform Impacting Millions of Users," news release, March 19, 2019, source
- Yurieff, "HUD Charges."
- Yurieff, "HUD Charges."
- Brian Heater, "Facebook settles ACLU Job Advertisement Discrimination Suit," TechCrunch, March 19, 2019, source
- U.S. Department of Housing and Urban Development, "Housing Discrimination Under the Fair Housing Act," U.S. Department of Housing and Urban Development, source
- Ahiza Garcia, "Facebook Denied Financial Services Opportunities To Women And Older People, Lawsuit Alleges," CNN Business, October 31, 2019, source
- Gaurav Laroia and David Brody, "Privacy Rights Are Civil Rights. We Need to Protect Them," Free Press, last modified March 14, 2019, source
- Ali et al., Discrimination Through.
- Ali et al., Discrimination Through.
- Ali et al., Discrimination Through.
- Aaron Rieke and Corrine Yu, "Discrimination's Digital Frontier," The Atlantic, April 15, 2019, source
- Ali et al., Discrimination Through.
- Ali et al., Discrimination Through.
- Rieke and Yu, "Discrimination's Digital."
- Rieke and Yu, "Discrimination's Digital."
- Facebook et al., "Summary of Settlements Between Civil Rights Advocates and Facebook," National Fair Housing Alliance, source
- Megan Rose Dickey, "Facebook Civil Rights Audit Says White Supremacy Policy Is 'Too Narrow,'" TechCrunch, June 30, 2019, source
- Wong, "Revealed: Facebook."
- Private meeting with Facebook representative, January 10, 2020.
- Private meeting with Facebook representative, January 10, 2020.
- Yurieff, "HUD Charges."
- Facebook, "Updates To Housing, Employment and Credit Ads in Ads Manager," Facebook for Business, last modified August 26, 2019, source
- Private meeting with Facebook representative, January 10, 2020.
- Spandana Singh, Everything in Moderation: An Analysis of How Internet Platforms Are Using Artificial Intelligence to Moderate User-Generated Content, July 22, 2019, source
- Singh, Everything in Moderation.
- Private meeting with Facebook representative, January 10, 2020.
- Piotr Sapiezynski et al., Algorithms that "Don't See Color": Comparing Biases in Lookalike and Special Ad Audiences, December 17, 2019, source
- Rieke and Bogen, Leveling the Platform.
- Hsu, "Voter Suppression."
- Ranking Digital Rights, Draft Indicators.
- Rieke and Bogen, Leveling the Platform.
- Rob Leathern, "Expanded Transparency and More Controls for Political Ads," news release, January 9, 2020, source
- Facebook, "Facebook Ad Library," Facebook Ad Library, source
- Leathern, "Expanded Transparency."
- Facebook, "About Potential Reach," Ads Help Center, source
- Rieke and Bogen, Leveling the Platform.
- Rieke and Bogen, Leveling the Platform.
- source. This is a customized settings page that is available to each logged-in user.
- Rieke and Bogen, Leveling the Platform.
- Privacy International, "Social Media Companies Have Failed To Provide Adequate Advertising Transparency To Users Globally," Privacy International, last modified October 3, 2019, source
- Facebook, "Doing More to Protect Against Discrimination in Housing, Employment and Credit Advertising," Facebook Newsroom, last modified March 19, 2019, source
- Facebook, "Facebook Ad Library," Facebook Ad Library.
- Dickey, "Facebook Civil."
- Privacy International, "Social Media," Privacy International.
- Privacy International, "Social Media," Privacy International.
- Mike Isaac and Cecilia Kang, "Facebook Says It Won't Back Down From Allowing Lies in Political Ads," New York Times, January 9, 2020, source
- Siva Vaidhyanathan, "The Real Reason Facebook Won't Fact-Check Political Ads," The New York Times, November 2, 2019, source
- Alex Hern, "Facebook Exempts Political Ads From Ban On Making False Claims," The Guardian, October 4, 2019, source
- Isaac Stanley-Becker, "Ban Political Ads On Facebook? Upstart, Anti-Trump Candidates Object," The Washington Post, November 10, 2019, source
- Boyd, "Statement On Disallowing," Mozilla.
- Alexandra S. Levine and Zach Montellaro, "Facebook Sticking With Policies On Politicians' Lies and Voter Targeting," Politico, January 9, 2020, source
- Ramya Sethuraman, "Why Am I Seeing This? We Have an Answer for You," Facebook Newsroom, last modified March 31, 2019, source
- Facebook, "Why Am I Seeing Ads From An Advertiser On Facebook?," Facebook Help Center, source
- Facebook, "How Does Facebook Decide Which Ads To Show Me?," Facebook Help Center, source
- Leathern, "Expanded Transparency."
- Facebook, "Accessing & Downloading Your Information," Facebook Help Center, source
- Ross Schulman and Eric Null, "The Data Portability Act: More User Control, More Competition," New America's Open Technology Institute, last modified August 19, 2019, source