An OTI Experiment: Open Source Surveillance Detection

Blog Post
July 25, 2017

For many of us, our cell phones are our lifelines. They witness and capture a lot of information about what we do, see, and share in our daily lives. But what happens when our cell phones are used against us? And is there any way for us to fight back? The OTI team did a technical experiment at this Spring’s March for Science in Washington, D.C., to try and answer these questions and explore new ways of detecting when your cell phone is being surveilled.

You might think that your cell phone is safe so long as you keep it tucked away in your pocket, but companies have been developing technologies to “trick” our phones into giving away information without physically accessing our devices. These technologies go by many names, including cell site simulators, IMSI catchers, or the brand name “Stingray.” The names may change, but all of these devices—now regularly being used in the U.S. for law enforcement surveillance—are designed to interfere with cell phone signals by pretending to be cell phone towers. By mimicking towers, the devices intercept signals to gather data, such as call metadata and content, personally identifying information, and data usage, and they have been especially popular as a tool for tracking the location of particular cell phones.

We don’t know exactly how widely these devices are used. We do know that they have been used all over the country in places ranging from Detroit, Mich., to Hennepin County, Minn., and Gwinnett County, Ga.—and that their legality is questionable. Despite such questions, law enforcement agencies in at least 24 states and the federal government use these devices for run-of-the-mill criminal investigations, immigration enforcement, IRS investigations, and for general surveillance purposes at large public gatherings like the Super Bowl or protests.

This increasingly broad use of cell site simulators by law enforcement is controversial for many reasons. As a general matter, the devices themselves indiscriminately invade the privacy of everyone around them because they connect to, and can capture data from, all phones within their range. But the devices have also been used in controversial ways. In particular, they have been deployed disproportionately in areas made up predominantly of people of color.

One notable example of an arbitrary, extensive, and racially biased use of this surveillance technology was by the Baltimore City Police Department (BPD), which deployed cell site simulators all over Baltimore and beyond between 2007 and 2015, primarily targeting African-American communities. OTI, Color of Change, and the Center for Media Justice filed a complaint with the Federal Communications Commission (FCC), alleging that the BPD’s use of these devices constituted a violation of the Communications Act.

While BPD’s use of cell site simulators was especially egregious, it is unfortunately not unique. Given the sharp uptick in rallies and protests since the 2016 presidential election, including the nationwide women’s marches, Muslim ban protests, and the climate and science marches, the risks posed by this kind of indiscriminate surveillance have never been more clear. That is why we decided to conduct an experiment to see whether and how one might be able to detect the use of cell site simulators during a large protest. In particular, OTI conducted a spectrum survey at the March for Science in April 2017 to experiment with ways to identify these devices.

Although our results were inconclusive, they gave us new insights into how best to tackle this problem, insights that we and others can apply to future experiments with the same goal: developing tools that give us the power to watch the watchers.

Our Methodology

OTI launched a technical effort to monitor and detect cell site simulators on a rainy April Saturday, during Washington D.C.’s March for Science. Our pilot project tested whether we could confirm the presence of cell site simulators using open source software from the SITCH project, running on commonly available open hardware. Two team members carried SITCH sensors that collected data; each sensor consists of the following components:
  • 1 Raspberry Pi 3

  • 1 16GB micro SD card

  • 1 Fona SIM900 GSM modem breakout kit w/ SIM card and antenna

  • 1 NooElec R820T RTL-SDR Software Defined Radio device

  • 1 GlobalSat USB GPS antenna

  • 1 ODroid brand USB WiFi adapter

  • 1 RAVPower Portable Battery and USB Charger

Designed as a general-purpose sensor for surveying wireless spectrum and cellular networks, a SITCH device passively scans for advertised cell tower data, which can later be compared with known cell tower locations in the OpenCellID database. Any anomalies found in this comparison may indicate surveillance with cell site simulators.
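To make the comparison step concrete, here is a minimal sketch, in Python, of checking observed cell identities against an OpenCellID export. It is not code from the SITCH project: the observation keys (mcc, mnc, lac, cell_id) are hypothetical simplifications of whatever a given sensor actually logs, and the lookup assumes OpenCellID’s standard CSV export columns.

```python
import csv

def load_known_towers(opencellid_csv_path):
    """Build a set of (mcc, mnc, lac, cell_id) tuples from an OpenCellID CSV export.

    Assumes the standard OpenCellID column names: mcc, net (the MNC),
    area (the LAC), and cell (the Cell ID)."""
    known = set()
    with open(opencellid_csv_path, newline="") as f:
        for row in csv.DictReader(f):
            known.add((int(row["mcc"]), int(row["net"]),
                       int(row["area"]), int(row["cell"])))
    return known

def flag_unknown_cells(observations, known_towers):
    """Return observations whose advertised identity is absent from OpenCellID.

    `observations` is a list of dicts with hypothetical keys mcc, mnc, lac,
    and cell_id (a simplified sensor log, not the actual SITCH schema)."""
    return [obs for obs in observations
            if (obs["mcc"], obs["mnc"], obs["lac"], obs["cell_id"]) not in known_towers]

# Hypothetical usage:
# known = load_known_towers("opencellid_washington_dc.csv")
# suspicious = flag_unknown_cells(sensor_observations, known)
```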

To supplement the SITCH devices, the eight members of the OTI team who participated in the data collection carried devices installed with the Android app WiGLE WiFi. The data collected through the app served as a secondary data source to correlate and cross-check the sensor data and the database of known cell towers provided by OpenCellID.
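The cross-check itself can be as simple as asking whether a second, independent source recorded the same advertised cell at roughly the same time. The sketch below illustrates that idea; the field names and the five-minute window are placeholders rather than the actual SITCH or WiGLE data formats.

```python
from datetime import timedelta

def corroborated(sensor_obs, secondary_obs, window=timedelta(minutes=5)):
    """Return the sensor observations that a second data source (for example,
    the WiGLE-collected data) also recorded within `window` of the same time.

    Each observation is a dict with hypothetical keys: cell_key (the
    advertised cell identity) and seen_at (a datetime)."""
    confirmed = []
    for s in sensor_obs:
        for w in secondary_obs:
            if s["cell_key"] == w["cell_key"] and abs(s["seen_at"] - w["seen_at"]) <= window:
                confirmed.append(s)
                break
    return confirmed
```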

There is a nascent open source community coalescing around the idea of detecting cellular surveillance. Based on the available literature and documentation, the community has identified a number of possible indicators that point to the presence of cell site simulators. These indicators primarily capture the differences between a cell tower’s “normal behaviors” (i.e., the advertised capacity and properties of carrier-provided, fixed cell towers) and “abnormal behaviors” (i.e., data detected by our devices that differs from the advertised properties of the fixed cell towers). Some examples of these indicators include the following (a simple flagging sketch for one of them follows the list):

  • Unusual Cell ID, location, frequency used, advertised capabilities

  • Short-lived cell towers advertised as the most powerful nearby tower (high receive gain); guard channel usage

  • Presence of RF jamming

  • Disabled cipher

  • Neighbor list manipulation

  • Available infrastructure, event-specific infrastructure
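As a simple flagging sketch for one of these indicators, the code below implements the short-lived cell tower check: group observations by advertised cell identity and flag any cell visible for only a brief span of the survey. The field names and the two-minute threshold are hypothetical, and a real analysis would tune the threshold to the length of the survey.

```python
def flag_short_lived_cells(observations, max_lifetime_seconds=120):
    """Return advertised cell identities that were visible only briefly.

    Each observation is a dict with hypothetical keys: cell_key (the
    advertised cell identity) and seen_at (a datetime). The 120-second
    threshold is a placeholder, not a value from the OTI analysis."""
    first_seen, last_seen = {}, {}
    for obs in observations:
        key, t = obs["cell_key"], obs["seen_at"]
        first_seen[key] = min(first_seen.get(key, t), t)
        last_seen[key] = max(last_seen.get(key, t), t)
    return [key for key in first_seen
            if (last_seen[key] - first_seen[key]).total_seconds() <= max_lifetime_seconds]
```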

The hardware and software we used were not able to detect all of the known indicators of potential cell site simulator use. The table below describes the indicators our methods could detect.

Chart: Indicators our methods could detect

After collecting data, we mapped the known locations of cell towers from OpenCellID alongside two interesting subsets found in our analysis. In that map, shown below, Unknown Cell IDs refers to towers that advertised Cell IDs not in the OpenCellID database for this area of Washington, D.C., while Towers Briefly Seen refers to towers advertised only for a short period relative to the other towers we observed. In terms of the list of indicators above, Unknown Cell IDs corresponds to an unusual Cell ID or location, while Towers Briefly Seen corresponds to short-lived cell towers.

Map: Comparison of OpenCellID known carrier towers, Unknown Cell IDs, and short-lived towers
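As an illustration of how a comparison map like the one above can be generated, here is a minimal sketch using the folium Python library. This is just one convenient way to plot the three categories in different colors; it is not the tooling used to produce our figure, and the coordinates and variable names are placeholders.

```python
import folium  # third-party mapping library, one of several possible choices

def build_survey_map(known_towers, unknown_cell_ids, short_lived_towers,
                     center=(38.8895, -77.0353)):  # placeholder center near the National Mall
    """Plot three categories of (lat, lon) points in different colors."""
    survey_map = folium.Map(location=center, zoom_start=14)
    layers = [
        (known_towers, "blue", "OpenCellID known carrier tower"),
        (unknown_cell_ids, "red", "Unknown Cell ID"),
        (short_lived_towers, "orange", "Tower briefly seen"),
    ]
    for points, color, label in layers:
        for lat, lon in points:
            folium.CircleMarker(location=(lat, lon), radius=6, color=color,
                                fill=True, tooltip=label).add_to(survey_map)
    return survey_map

# Hypothetical usage:
# build_survey_map(known, unknown, short_lived).save("march_for_science_survey.html")
```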

What We Found

The data we collected using the SITCH sensors and the WiGLE app showed some promise and yielded some interesting results, though we could not conclusively prove the presence of cell site simulators. It is possible that the Towers Briefly Seen data points were cell site simulators switched on briefly to masquerade as carrier towers and collect information from phones in the areas where people were assembling. We expect that the Unknown Cell IDs were additional infrastructure provided for the event, or simply towers not yet in the OpenCellID database, but additional analysis is needed to confirm that hypothesis. Both cases require further work to provide conclusive evidence of surveillance devices. Improvements to the data collection software and additional hardware are also needed to gather more data and to keep pace with the surveillance technologies in use, as detailed in the next section.

Next Steps

Our work and that of the Electronic Frontier Foundation (EFF), the SeaGlass project, the ACLU, and others has, to varying degrees, hit the same roadblock: the open technologies currently available from the SITCH project and elsewhere only detect anomalies in 2G and 3G GSM networks. New software and hardware are needed to do this work on phones that use 3G and 4G/LTE, and on all phones on CDMA networks. Given the pace of change in surveillance technology, deepening the collaboration and information sharing among organizations engaged in surveillance detection is an important next direction for this work. Next steps to address these and other challenges, and to move this research forward, include:

  • Improve the SITCH software and hardware to detect surveillance technologies on a wider range of cellular networks. Improvements to the SITCH project could include adding support for an attached phone or SIM card that connects to select cell towers to gather more data. For example, the SeaGlass project uses a “bait phone,” a cell phone intended to serve as “bait” for Stingray devices. It runs the Android app “SnoopSnitch” to log 4G, 3G, and 2G base stations, along with network packet captures and a record of suspicious events associated with Stingrays or other cell site simulator technologies.

  • Conduct a survey of the hardware being used for cell site simulator detection and outline the various data collection methods in use. This could make it easier for the community of researchers seeking to identify this type of mass surveillance to work together. For example, OTI identified that using multiple devices in tandem for distinct purposes would be necessary for future iterations of our work: one to scan for cell towers and cell site simulators, one to scan and log ambient RF noise as an indicator of frequency jamming, a bait phone, additional apps or software to log neighbor lists, and new hardware to extend the range of cellular networks that can be scanned.

  • Convene organizations and individuals working on this issue to compare notes and tools. As we started to dig into the open source tools and techniques, we came across other knowledgeable organizations interested in joining forces on this work. We collaborated with technologists at EFF by sharing our progress and methods, as well as our documentation and code, discussing our findings, and by borrowing some of their supporting infrastructure. The collaborative and open nature of this work is paramount to its progress. It will be important to establish a mechanism for experts working in this space, like those at EFF, the ACLU, the SITCH project, and the activist community, to discuss strategies and progress, and to build and share tools with one another.

  • Directly involve communities under threat in our research, and design research to better identify surveillance directed at those communities. Any convening of organizations or individuals working on this issue should explicitly seek participation, input, and leadership from groups and individuals working in historically underrepresented communities that are disproportionately surveilled by law enforcement. Research should also be designed so that it can better detect biased applications of cell site simulators—for example, by developing methodologies for comparing surveillance detection data with demographic data about the race and socioeconomic status of those in affected geographic areas.

  • Conduct research on how cell site simulators are used in neighborhoods and communities, outside of large organized events. While our initial pilot gathered data at an organized rally, as we saw in Baltimore, this type of surveillance is used in a wider range of contexts. If law enforcement is targeting communities of color, areas near prisons, government facilities, public housing, and locations where protests are occurring, a clear next step is to conduct this research in these other geographic spaces where we suspect cell site simulators may be used.

After conducting this initial experiment and learning some important lessons, we are more excited than ever to continue developing data gathering and analysis methods for surveillance detection, and to support similar work by others. We hope that our experience and the lessons we’ve learned are helpful to our allies, and we look forward to collaborating—the community of organizations working on anti-surveillance technologies is even stronger when we work together!