April 13, 2021
This story is part of PIT UNiverse, a monthly newsletter from PIT-UN that shares news and events from around the Network. Subscribe to PIT UNiverse here.
Algorithms and machine learning tools are some of the most influential yet least well understood technologies in use today. Across the public and private sectors—from social media and online advertising to housing, banking, government, criminal justice, and more—organizations increasingly rely on algorithms to parse massive amounts of data and make decisions at scale.
Often, these decisions have life-altering consequences. Today, algorithms can help decide whether someone gets a job, receives housing, is granted a loan, is accepted to college, and even whether someone is released from prison. And while technology has a veneer of objectivity and scientific accuracy, artificial intelligence (AI) is often trained on data that reflect historical biases and injustices. This creates a significant risk of ingrained biases perpetuating historical patterns of marginalization; examples are easy to find in health care, policing, education, and more.
Advocates, including New America’s Open Technology Institute, are pressing companies and governments to offer increased transparency into how their AI systems are developed and trained, as well as the outcomes they generate. Without this information, many influential algorithms remain “black boxes,” with little to no accountability for potential harms.
There is a clear need for more public interest researchers examining algorithms and machine learning tools to determine their influence and detect biased outcomes. However, such complex systems require a great deal of expertise to study. The Algorithmic Fairness and Opacity Working Group (AFOG) at the University of California, Berkeley is dedicated to this research and to understanding the real-world effects of algorithms regardless of their creators' intentions. AFOG also engages students and the public through courses and events, encouraging people to "think deeply and critically about technology, human values, and the social and political implications of technical systems."
Professors Deirdre Mulligan and Jenna Burrell lead AFOG, which won a 2020 Network Challenge grant from PIT-UN focused on creating a career pipeline for PIT scholars looking to pursue research into algorithmic fairness. AFOG will partner with two other UC Berkeley PIT programs—Cal NERDS (New Experiences for Research and Diversity in Science) and the D-Lab—to create educational programming and opportunities for PIT students to explore careers in algorithmic fairness.
In a statement announcing the grant, Professor Mulligan previewed hands-on workshops, public lectures, and lunch talks to connect students to the PIT field and grow their understanding of the space. “By centering issues of justice,” Mulligan says, “rather than technology or specific approaches and methods from STEM fields, this program will develop and train diverse students and scholars with the knowledge and skills to create, use, assess, and critique technologies in service of the public interest.”
Claudia von Vacano, Executive Director of the D-Lab, says students and researchers face a number of challenges working in the AI space.
“As rigorous researchers, the contexts which our students enter are sometimes not as concerned with the questions of diversity and ethics in AI that are fomented in our programs,” says von Vacano. “So they innovate and have to push against workflows that don't routinely allow them to interrogate and audit issues of bias in ML.”
The programming and education provided by the D-Lab, von Vacano says, will help prepare students for the field. “Our scholars are using and thereby expanding their data science skills at the same time that they are proposing solutions to complex problems in the field. In other words, they’re applying their knowledge in sophisticated ways with hands-on and collaborative projects.”
One student who served as a data science fellow with the D-Lab had high praise for the program and said it helped them secure a data science internship this year.
“The experience has been amazing,” the student says. “People have an open-minded approach to science and are always eager to help you out.”
The partnership between AFOG, Cal NERDS, and the D-Lab has already begun holding events, launching a lecture series with a talk by Ethan Zuckerman. Their next lecture, on April 14, will feature Tawana Petty.
One PhD student who attended Zuckerman’s talk called it “an important conversation challenging the assumption that scale—in the context of social media, and technology more generally—is preferable, instead raising bold questions about what small social media ecosystems might be able to provide to foster communities.”
The lecture series followed a STEMinist bootcamp in January that drew 120 registered students and provided an overview of STEM fields, an introduction to coding, and conversations with role models in the field. One attendee who had little knowledge of data science beforehand says the bootcamp “really broadened my perspectives of technology and its community,” adding that hearing speakers discuss their own experiences “has truly comforted me that even though I have not started coding early, I am still capable of great things!”
We asked the teams from AFOG, Cal NERDS, and the D-Lab a series of questions. Below, find their compiled answers. Also, interested in working with AFOG? They're hiring a PIT-UN Postdoc, and would love to support a CRA/CCC Computing and Innovation Postdoctoral Fellow as well.
What challenges do students face in making a career researching the public interest issues around algorithms and machine learning?
Undergraduate STEM education provides little room for interdisciplinary engagement, yet women and students from historically marginalized and non-dominant communities (low-income, first-generation, re-entry, transfer, student parents, etc.) are often profoundly aware of, and intensely interested in, the social and political impacts of science and technology. Likewise, graduate students in public health, social work, public policy, and law may have a strong interest in the implications of technology for their field, but little opportunity to explore these areas, particularly given that graduate students are encouraged to stay in their disciplinary lane. The questions at the heart of public interest tech center justice and social welfare alongside data science, coding, algorithms, and machine learning in the design of sociotechnical systems—work that often requires interdisciplinary collaboration.
What are you hoping students take away from the educational and career-focused programming you're planning?
PIT-UN at UC Berkeley has collaborative programming aimed at building a diverse group of undergraduate and graduate students who identify as Public Interest Technologists. We help students build the technical skills and critical thinking necessary for the field of public interest technology. Through hands-on and often peer-led workshops, students develop coding and data analysis skills alongside expertise in using tools and methods to assess how values are embedded in technical systems across society. Together, these workshops give students the ability to think critically about the possibilities and limits of technological ‘solutions,’ while contributing ideas to support legal regulation, social movements, or other means of generating social change.
Events with leaders in the public interest tech field, along with days when students can “shadow” public interest tech professionals, expose students to the wide range of exciting career opportunities in public interest technology. Our goal is that students appreciate their own resilience, understand that their potential is limitless, and feel that they have a like-minded community that creates a sense of belonging and validates their interest in further pursuing public interest technology.
For more information on programming, please check out AFOG's Public Interest Technology Website.