OTI Joins Coalition Opposing HUD-Proposed Changes That Would Undermine Fair Housing Act Enforcement
Today, New America’s Open Technology Institute joined a coalition of 23 civil rights and consumer advocacy organizations and individual experts in opposing a recent proposal by the Department of Housing and Urban Development (HUD) to change the disparate impact test under the Fair Housing Act. OTI also filed separate comments to emphasize our key concerns. For years, HUD’s rules have recognized that practices that appear neutral on their face, but that result in a disparate impact on protected classes, constitute discrimination prohibited under the Fair Housing Act.
The proposal would create new defenses that would allow housing providers to escape liability under the disparate impact standard whenever they assert that they relied on algorithmic models. The draft rule is based on a flawed understanding of how algorithmic models work and fails to account for the risks of algorithmic bias. Under the proposal, a defendant relying on an algorithmic model could defeat a disparate impact claim by showing that (a) the model’s inputs do not include close proxies for protected classes; (b) a neutral third party has determined that the model has predictive value; or (c) a third party created the model.
These proposed defenses would severely undermine HUD’s ability to ensure fair housing, because disparate impact discrimination is precisely the type of discrimination that algorithmic models can cause. None of the three defenses actually disproves that such discrimination occurred: a model that excludes close proxies for race, for example, can still produce racially disparate outcomes by relying on correlated inputs such as ZIP code, and a model can have predictive value while still disproportionately harming a protected class. These defenses would therefore unreasonably absolve defendants of liability in algorithm-based cases under the Fair Housing Act. To redress the type of discrimination that algorithmic decision-making can cause, HUD and the courts should continue to analyze these cases under the current burden-shifting framework, which allows for the discovery and fact-finding necessary to determine whether an algorithmic model created a disparate impact.
The following quote can be attributed to Spandana Singh, policy analyst at New America’s Open Technology Institute:
“Changing the rules for proving discrimination under the Fair Housing Act in the way HUD proposes would upend years of precedent in how disparate impact claims are adjudicated. The proposed rules reflect a failure to understand how algorithms work, and that disparate impact discrimination is precisely the kind of discrimination such algorithms can cause. Algorithmic bias is generally difficult to detect, and these defenses would effectively prevent plaintiffs from vindicating their rights.”