BruinXperiment: Ethical Considerations for UCLA's Campus Climate App

UCLA is launching an app that will help measure student experiences on campus. But will it use student data ethically?
Sept. 11, 2018

In 2013, Sy Stokes—then a junior at UCLA—uploaded a spoken word poem to YouTube that would put UCLA at the center of a national dialogue about campus climate and race. The poem, which challenged UCLA’s commitment to recruiting and matriculating black male students, was among the first protests in a new wave of modern campus activism. Tapping into both the feelings of neglect among black students on his campus and raw emotions about being black in America in the wake of the killing of Trayvon Martin, Stokes’s video quickly went viral. To date, “The Black Bruins” has been viewed over 2.3 million times.

Five years later, BruinX, a research and development team in UCLA’s Office of Equity, Diversity and Inclusion, is taking an innovative step to better measure campus climate. This fall, the BruinX team will launch an app called BruinXperience that will use regular campus climate surveys to prompt students to share their thoughts and feelings about campus, their UCLA experience, and national events.

In an effort to collect accurate, timely data on UCLA’s campus climate, students who download the app will be prompted every two weeks to fill out a survey about their thoughts and feelings in real time. A “Share Your Thoughts” section will also be available for students to complete at any time between surveys. Continued participation will be encouraged through raffle giveaways for students who reach certain usage milestones.
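To make that cadence concrete, here is a minimal sketch of how a biweekly survey prompt could be scheduled. BruinX has not published implementation details, so the function names, fields, and dates below are purely illustrative.

    from datetime import datetime, timedelta

    # Cadence described for BruinXperience: one survey prompt every two weeks.
    SURVEY_INTERVAL = timedelta(weeks=2)

    def should_notify(last_completed, now=None):
        """Return True if two weeks have passed since the student's last survey."""
        now = now or datetime.now()
        return now >= last_completed + SURVEY_INTERVAL

    # A student who last responded on Sept. 1 would be prompted on or after Sept. 15.
    print(should_notify(datetime(2018, 9, 1), now=datetime(2018, 9, 16)))  # True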

This could be a move in the right direction for UCLA, and something for other institutions to watch. One major critique of data analytics in higher ed is that it turns students into numbers and fails to consider their humanity. Campus climate surveys take student feelings and experiences into consideration. Amplifying that process—getting a more accurate pulse on how students interact with their campus, and how national events influence these interactions—is useful. But while an app such as this one could be an innovative approach to measuring—and improving—campus climate, questions remain about the ethical implications of its use and purpose. UCLA’s BruinX team could not be reached for comment, but they and other institutions should consider these questions before launch.

How exactly will the data be used?

There are many implications for how data from campus climate surveys can be used, and unfortunately there are some prominent examples of misuse. At Mount St. Mary’s University in 2016, for example, incoming freshmen were asked to fill out a survey about their preparedness, with the promise that their answers would not affect their standing at the university. In reality, the survey data was meant to identify which students were most at risk of dropping out. Under a controversial freshman retention plan set forth by former Mount St. Mary’s president Simon Newman, rather than using the data to help at-risk students, the university would have encouraged them to leave before retention rates had to be reported.

While most colleges don’t intentionally practice bad data ethics for the institution’s benefit, sometimes the unintended consequences of well-meaning innovation can prove just as damaging. Transparency around how the data is used, for students as well as faculty and staff, and careful consideration of the goal behind innovation, can help mitigate negative outcomes. And ultimately, any interventions informed by BruinXperience data analytics should be implemented with care and in the interest of the student.

What is the goal of the app?

Will student responses be used solely as a retroactive measure of campus climate? Will they be paired with predictive modeling and algorithms as part of a proactive solution to potential negative trends in campus climate? Will responses be made public to students for use in putting pressure on the administration? (More on this below.) Ultimately, how data from BruinXperience will be used and acted upon is heavily influenced by UCLA’s goal for the app. Outlining a clear goal—what successful implementation of the app looks like—is essential to maintaining transparency with students, faculty, and staff.

Will student responses be public or private? Who will moderate public responses?

Yik Yak, a localized anonymous messaging app that was popular from 2013 to early 2016, was a messy experiment in campus-based forums, plagued by hate speech and threats of violence that detracted from the app’s original purpose. But it did ultimately reveal a shadowy subclimate of misogyny and racism on many university campuses that could not be ignored. A Reddit-esque public forum model and self-moderation were integral to this: as Yik Yak’s popularity boomed on campuses across the country, students who downloaded the app had no choice but to confront realities from corners of campus they might not otherwise have encountered. And up- or down-voting gave students a stake in each other’s comments (for better and sometimes for worse), allowing them to co-sign similar experiences and challenge derogatory comments.

Unlike Yik Yak, BruinXperience will require users to log in with their BruinID and a password, which means students will not be anonymous and thus may be less likely to share unfiltered responses. As Yik Yak showed, a public forum with self-moderated content can be a useful model for getting more thorough insight into how students feel and which sentiments they agree with, even when those insights are upsetting. However, collecting student responses privately may encourage students to hold back less, since the assumption is that responses go directly to staff and administrators rather than through a publicly moderated forum.

Who will have access to the data?

The Black Bruins video was powerful because it forced UCLA to participate in an uncomfortable conversation about representation in the national spotlight. A privately collected, response-oriented BruinXperience may save UCLA some drama, redirecting potential national incidents into privately submitted critiques of campus culture. But what assurance, then, do students have that their responses are being seen by—and appropriately interpreted and conveyed to—the administration and other actors in a position to act in their interests? Without transparency around how student responses are acted upon, an otherwise well-intentioned effort on UCLA’s part to create a better campus climate may come across instead as simple surveillance.

Access is also a matter of security. Since students can’t use the BruinXperience app without identifying themselves, how can they know that sensitive responses aren’t being seen by third-party vendors, or by faculty and staff who have no need to see them and might later hold them against students? Transparency around who will have access to BruinXperience data, and at what stages, is important to building trust in the app. If the app cannot be trusted, it will not help UCLA accurately measure campus climate.
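None of this requires exotic technology. As one illustration (not a description of BruinX’s actual pipeline; the function names, fields, and key handling below are assumptions), identifiers could be pseudonymized before response data ever leaves the small team that collects it:

    import hashlib
    import hmac

    # Illustrative only: a server-side secret that never ships with exported data.
    PSEUDONYM_KEY = b"secret-held-by-the-data-steward"

    def pseudonymize(bruin_id):
        """Replace a BruinID with a keyed hash so analysts can link a student's
        responses over time without ever seeing who the student is."""
        return hmac.new(PSEUDONYM_KEY, bruin_id.encode("utf-8"), hashlib.sha256).hexdigest()

    def export_for_analysis(responses):
        """Strip direct identifiers before data is shared beyond the collecting team."""
        return [
            {"student": pseudonymize(r["bruin_id"]), "submitted": r["submitted"], "text": r["text"]}
            for r in responses
        ]

A keyed hash alone does not make free-text answers safe, of course; a detailed response can identify its author on its own. The point is simply that who holds the key, which staff see raw identifiers, and at what stage those identifiers are stripped are concrete, disclosable decisions rather than abstractions.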

Ultimately, BruinXperience has potential as an app that could bridge the gap between analytics and the human experience. But UCLA and other colleges looking to start similar projects should approach such an experiment with an abundance of caution. As with predictive analytics, prioritizing ethical use, transparency, and careful action based on the data is key. The stakes of doing otherwise go beyond misreading the climate of a campus: student privacy and institutional trust are on the line.
