Dismantling the Digital Poorhouse

Blog Post
Feb. 13, 2018

This piece was first published in the February 8th issue of the New America Weekly. 

Eight years ago I stood in the checkout line of a Walmart in rural New Mexico with my mother. As she swiped her scratched debit card for the third time, a white woman behind us sneered.

“Indian freeloaders. Did her welfare run out?”

The woman had assumed that my mother was using an Electronic Benefits Transfer (EBT) card—the debit card on which welfare recipients receive cash assistance or SNAP benefits (the Supplemental Nutrition Assistance Program, formerly known as food stamps). When lawmakers introduced the EBT card in the 1990s, as credit and debit card use picked up, they hoped it would help food stamp recipients avoid the derision and outright refusal of service they often faced when trying to buy groceries with easily identifiable coupons. As well intentioned as it was, the new technology didn’t end the racialized stigma frequently attached to welfare use. Who would’ve thought?

Virginia Eubanks, probably. Her new book, Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor, is the culmination of years of work examining the ways in which the digital age has shaped social control of the poor. Through three case studies—an automated eligibility system for public assistance in Indiana, an algorithmically coordinated housing entry system in Los Angeles’ Skid Row, and a child abuse prediction system in Pennsylvania—Eubanks, who’s also a New America National Fellow, demonstrates how introducing new technologies to social assistance programs can disrupt the lives of the poor. Or, put another way, Eubanks investigates how this technology can disconnect the poor from vital social services and undermine their right to self-determination at unprecedented scales and speeds.

To take just one of these case studies, Indiana’s electronic system for public assistance, designed by IBM, often lost its citizens’ information—a mistake its algorithm blamed on recipients themselves. Eubanks highlights in her book the story of Omega Young, a Medicaid recipient who was ordered to recertify her eligibility in 2008, just as she was undergoing cancer treatment. Though she notified a call center to let the state know that she’d be missing a recertification appointment for chemotherapy, this information never reached the electronic system. She was flagged for “failure to cooperate” and was, in turn, cut off from food stamps, healthcare, and transportation to her appointments.

For a year Young simultaneously battled cancer and the appeals process, not winning back her benefits until March 2, 2009; she’d died the previous day. Indiana had essentially made all of its welfare recipients beholden to one giant digital caseworker—one that wasn’t only incompetent, but also incapable of being self-critical and empathetic.

“I think it’s important to say I don’t think there’s anything inherent in this technology that, for lack of a better word, makes it another boot on the neck of the poor. There’s nothing specific about automation that does that,” Eubanks said at a recent event hosted by New America’s Family-Centered Social Policy program, an event centered around the same theme as her book. She was joined by Cheri Honkala, a welfare rights organizing veteran and National Organizer of The Poor People’s Economic Human Rights Campaign; Rose Afriyie, executive director of mRelief, a web- and text-based platform for families to find out if they qualify for public support; and Mariella Saba, organizer and researcher with the Stop LAPD Spying Coalition and Our Data Bodies project.

“Any time we’re talking about data collection, we have to recognize the power dynamic that exists [between] who is managing what system of a human need—whether that human need is housing or food,” said Saba, nodding to how algorithms and human caseworkers alike have the power to make life-altering and life-ending decisions about recipients’ lives. “In Los Angeles I see a lot of empty buildings that could be used for immediate housing when there’s people freezing in the streets. I lift up the name of Barbara [Brown],” a 60-year-old woman who died of exposure on a Skid Row sidewalk in early January. Saba’s deeper point was that, before we’re ready to introduce automated processes into public assistance, we must interrogate whether this power dynamic—often premised on false narratives of scarce resources and the criminality of the poor—is one we want to replicate.

Yet at the same time, while society works toward a more generous public assistance system, people still need access to the current one to make ends meet. Indeed, one of the major barriers to assistance is the sheer difficulty of finding out how to apply.

So how do we close this access gap? Often, people who qualify for social assistance programs “have heard stories about how difficult it is to access services they’re entitled to and have just completely decided that it’s not worth the trouble,” Afriyie said. That’s why her organization, mRelief, leverages technology to eliminate that trouble, allowing people to determine their eligibility anonymously and without stigma. Prospective applicants can answer 10 questions via text message or an online form and receive a simple “yes” or “no” as to whether they qualify for assistance, along with guidance on how to apply.

Honkala also weighed in on ways the digital age could move the needle on welfare rights. She explained that efforts to incorporate technology into social assistance programs haven’t truly grappled with an underlying assumption: that the poor are to be policed and punished.

“Through this entire journey, there has been an effort to have our voices heard. And through all of these years we’ve had to take on the battle of being dehumanized—dehumanized and tracked,” Honkala said.

Over nearly three decades, she’s organized demonstrations that not only rejected the demonization of poor people, but also wasted no time in meeting their basic human needs. In the winter of 1994, for instance, the Kensington Welfare Rights Union, recognizing that the local Philadelphia government planned to do nothing about overflowing shelters and vacant homes, broke into and took over HUD housing for homeless families to occupy. We often like to describe technology as facilitating innovation, as being “disruptive,” but actions like that of the KWRU demonstrate that disruption can be as low-tech and lifesaving as breaking a lock.

Try as we might, the conditions that allowed my mother and me to be harassed for being poor and brown in public can’t be automated out of American society. Technology can only map itself over prevailing social conditions. Until we’re ready to address the historically embedded reasons the poor are all too often met with disdain and blame, the dystopia will code itself.

This blog is part of Caffeinated Commentary - a monthly series where the Millennial Fellows create interesting and engaging content around a theme. For February, the fellows have decided to respond to this quote from Dr. Cornel West: “Justice is what love looks like in public.”