Civil Rights in the Age of Big Data

Blog Post
Feb. 27, 2014

Digital information about who we are and what we do is the currency of our digital economy and the substrate of our digital society. As the flow of information increases, fairness, justice, and equal opportunity become paramount concerns in the design, deployment, and entrenchment of computation-based technologies in our lives. Today, a coalition of civil and human rights groups, including the Open Technology Institute, has released a set of principles for big data that identifies the critical importance of equity in the debate over big data’s opportunities and threats.

Historically, civil and human rights concerns have tended to run in the background of heated conversations about government surveillance, corporate privacy intrusions, and national security. But examples that make those concerns palpable are circulating in our public sphere. Recently, President Obama referred to FBI surveillance of Dr. Martin Luther King Jr. and the targeting of movements for collective self-determination. There have been reports of sustained economic stagnation among communities of color caught in a recursive loop of bad credit data. Meanwhile, we have heard of surveillance technologies being used for social research on, and social control of, low-income populations.

These episodes raise more than just questions of personal liberty. They force us to ask: How does big data categorize us into groups? When do predictive analytics persuade us to think or act based on the behavior patterns of people like us (or of people an algorithm deems to be like us)? How are computational systems making decisions for us based on an aggregate self?
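
To make the idea of an “aggregate self” concrete, here is a minimal, purely illustrative sketch in Python. It is not drawn from any real scoring system; all of the features, numbers, and the approval threshold are hypothetical. It shows how a simple clustering step can assign a person to an algorithmically defined group, after which a decision reflects the group’s average record rather than the individual’s own.

```python
# Purely illustrative: a decision made from an "aggregate self."
# All features, values, and thresholds here are hypothetical.

# Feature vectors: (late payments, neighborhood income index)
people = [(0, 9), (1, 8), (5, 2), (6, 1), (4, 3)]
applicant = (1, 2)  # one late payment, but a low-income neighborhood

def distance(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

# Naive 2-means clustering: split people into two "like us" groups.
centroids = [people[0], people[2]]
for _ in range(10):
    groups = ([], [])
    for p in people:
        i = min((0, 1), key=lambda k: distance(p, centroids[k]))
        groups[i].append(p)
    centroids = [
        tuple(sum(col) / len(g) for col in zip(*g)) if g else c
        for g, c in zip(groups, centroids)
    ]

# The decision uses the group's average late payments, not the applicant's.
group = min((0, 1), key=lambda k: distance(applicant, centroids[k]))
avg_late = centroids[group][0]
print("approved" if avg_late < 3 else "denied")  # prints "denied"
```

Despite a nearly clean individual record, the hypothetical applicant is grouped with neighbors by income and judged by that group’s payment history: a decision about the aggregate, not the person.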

Our civil rights principles for big data provide an initial roadmap for addressing these questions. They ask companies and government to stop high-tech profiling, ensure fairness in automated decision-making systems, preserve constitutional principles, give individuals meaningful access to, and control over, their personal data, and protect people from the consequences of inaccurate data. And they put these issues in conversation with a longer history of pre-digital data discrimination: communities of color, especially poor communities of color, have historically shouldered the burden of government surveillance and corporate intrusions of privacy.

At OTI, we’ve seen concerns and anxieties about digital privacy arise in fieldwork with members of underserved communities. Our research shows that people trying hard to get by want fairness in their daily digital encounters. They want pragmatic solutions: practical ways to protect themselves from irresponsible or abusive behavior by those who transport their messages or information from one point to the next. And they wish to take advantage of the economic, social, and civic benefits advertised to them as they become “digitally included.” Having been preyed upon before, they would prefer that access to new digital technologies break that trend.

Embraced by government agencies and the private sector, civil rights principles for big data can help make that opportunity a reality. But that depends on the willingness of all the relevant parties to recognize that technological systems are never neutral. Companies and government have the power to instantiate a set of norms commensurate with our expectations of equity. Fairness is a choice. It can be intentional.

Indeed, big data analytics and computational decision systems can be shaped to produce public value. In an article about electronic health records, Frank Pasquale described the possibility of big data analysis serving the public good. That possibility would require a trade-off: in exchange for government subsidies, parties involved in the development and operation of new, proprietary electronic health systems should make the data on these platforms (with consent, and in de-identified form) available to government agencies tasked with monitoring public health and related industries. That would help not only in exposing fraud and abuse, but also in advancing medical research and making health and wellness a more equal opportunity.

Whether we aim to point out the challenges or the opportunities, we should, and can, think more broadly about big data. Let’s stop arguing about whether privacy is dead or the surveillance state inevitable. Let’s focus on equity. Doing so will help us better understand what we have to gain in shaping our digital destinies.
