OTI Joins Coalition of Civil Rights Organizations on Statement Outlining Concerns Regarding Predictive Policing Technology
Aug. 31, 2016
Today, New America’s Open Technology Institute, along with 16 other civil rights, privacy, and technology organizations and advocates, released a joint statement, Predictive Policing Today: A Shared Statement of Civil Rights Concerns, and a corresponding report, Stuck in a Pattern. The statement raises these concerns to help ensure that predictive policing technologies are implemented and used in ways that protect civil rights and minimize the disparate racial impact these technologies currently have. The concerns outlined in the statement include:
A lack of transparency about predictive policing systems prevents a meaningful, well-informed public debate.
Predictive policing systems ignore community needs.
Predictive policing systems threaten to undermine the constitutional rights of individuals.
Predictive technologies are primarily being used to intensify enforcement, rather than to meet human needs.
Police could use predictive tools to anticipate which officers might engage in misconduct, but most departments have not done so.
Predictive policing systems are failing to monitor their racial impact.
The statement was spearheaded by the Leadership Conference on Civil & Human Rights. Other signatories include the American Civil Liberties Union, the Brennan Center for Justice, the Center for Democracy & Technology, the Center for Media Justice, Color of Change, Demand Progress, the Electronic Frontier Foundation, Free Press, the NAACP, the National Hispanic Media Coalition, and Public Knowledge.
A copy of the statement can be found here.
The following can be attributed to Eric Null, Policy Counsel at New America’s Open Technology Institute:
“Predictive policing is no longer just the subject of sci-fi movies like Minority Report. Predictive technologies have been deployed throughout the United States, only now they take the form of data-driven algorithms. These technologies can be used for good or to perpetuate injustice. Algorithms fed with racially-biased data will merely perpetuate the biases already inherent in policing and lead to more injustice. We already know that police data often incorporates only reported crimes, which means policing will become biased toward areas where crimes are more frequently reported.
Today’s statement will help drive a conversation about how police departments can deploy these technologies in a race-neutral way and focus on actual human needs rather than simply intensifying enforcement. We are pleased to contribute to this dialogue and hope that the statement leads to a robust public discussion of predictive policing technologies and how to implement them without embedding racial biases.”