Biden’s Peloton Raised Security Concerns at The White House. Is It Secure Enough for Yours?

Finding out is a challenge.
Blog Post
March 17, 2021

When the Bidens moved into the White House, their Peloton may not have made the move with them. Security officials reportedly feared that the popular exercise bike, with its camera, microphone, and Wi-Fi connection, posed a cybersecurity threat. We aren't surprised.

We don’t know whether Peloton has good security practices, but this episode serves as an object lesson in the Internet of Things security landscape. It highlights the risks countless consumer goods now present—they are connected to the internet, include cameras, microphones, or other sensors, and security experts can’t easily evaluate the threats they pose to privacy and security. And while the President's connected devices will all be scanned, tested, modified, and hardened by some of the best minds in cybersecurity, the rest of the smart-device-buying public is not so lucky.

Over the last couple of years, OTI has spent a lot of time testing connected products using the Digital Standard—a framework for evaluating the privacy and security of internet-connected consumer products and software—and we developed a handbook to help others do the same. The Digital Standard was developed by Consumer Reports and a coalition of civil society organizations, and comprises 35 tests that measure how well a product adheres to a set of best practices for connected hardware and software. Some of the tests evaluate technical practices, like whether a product uses encryption or strong authentication processes, while others evaluate a company’s policies on issues like data collection and retention.

Given the interest in the President's Peloton, we thought it might be helpful to see how it performed on the privacy and security tests included in our Digital Standard Testing Handbook. Getting an actual Peloton bike on short notice—one that we might have to break while running our tests—would be tricky and expensive. However, Peloton also offers a fitness app for phones and tablets that allows users to access Peloton classes and pre-recorded training programs for cycling, running, strength training, and a variety of other exercises. While we couldn’t run certain hardware-related tests, like whether the bike's camera and microphone clearly indicate when they are activated, we could use the app to run a subset of Digital Standard tests that would also tell us something about the bike. The app and the bike share many features, including user accounts, user policies, and, presumably, the same network of backend servers, allowing us to test, for example, what policies a user must agree to when creating an account, or whether strong passwords are required. We ran these tests using the Peloton app on an Android phone.

The Digital Standard requires that a product’s terms of service and privacy policies be “easy to find.” When users create an account on the Peloton app, a sentence right above the “create” button says that they agree to the Terms of Service and Privacy Policy, with links to the policies hosted on the Peloton website. This meets the criteria required under the Standard, although the Standard does not require that users actually read the terms before agreeing to them.

The Standard also evaluates whether products follow a variety of password best practices. We found Peloton’s password settings middling. The app allows long and complex passwords, but does not require them. It does enforce a minimum of eight characters, the same minimum required by the Digital Standard, but it imposes no other complexity requirements, such as capital letters, numbers, or special characters, which are also part of the Standard. Nor does the app offer other secure authentication methods tested under the Digital Standard, like multi-factor authentication. There also does not seem to be a way for users to change their passwords from within the app itself, though this option is available via Peloton’s website.
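The password criteria described above can be sketched as a simple check. The following is an illustrative implementation of our reading of those criteria (an eight-character minimum plus the complexity classes the Standard lists), not the Digital Standard's or Peloton's actual test code:

```python
import re

def meets_standard_criteria(password: str) -> dict:
    """Check a password against the criteria described above.

    Illustrative only: the thresholds mirror the Digital Standard's
    eight-character minimum plus the complexity classes mentioned in
    the text (capital letters, numbers, special characters).
    """
    return {
        "min_length": len(password) >= 8,
        "has_uppercase": bool(re.search(r"[A-Z]", password)),
        "has_digit": bool(re.search(r"\d", password)),
        "has_special": bool(re.search(r"[^A-Za-z0-9]", password)),
    }

# The Peloton app, as tested, enforces only the first of these checks.
print(meets_standard_criteria("pelot0n!"))     # passes length, digit, special; no uppercase
print(meets_standard_criteria("Str0ng-Pass"))  # passes all four checks
```

A password like "pelot0n!" would satisfy the app's length requirement while still failing the Standard's complexity criteria.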

While evaluating the Terms of Service, we noticed that the terms place strict limitations on the types of close technical examination required by many of the tests in the Digital Standard. While versions of these limitations are commonly seen in other IoT products’ policies, Peloton’s are particularly specific in their prohibitions. For example, as part of our routine testing we examine network traffic between an app and its servers to find out, among other things, what kind of encryption the app uses when transporting data. However, the Peloton terms forbid any attempt to “probe, scan or test the vulnerability of any Peloton system or network or breach any security or authentication measures,” and Peloton explicitly reserves the "right to investigate" violations of its Terms.

Overly strict policies against outside testing of products’ cybersecurity can unduly deter good faith security researchers, who may fear prosecution under the Computer Fraud and Abuse Act. Similar policies have historically dissuaded legitimate security researchers from helping companies create more secure products. In fact, one of the tests in the Digital Standard asks whether the company commits not to pursue legal action against security researchers. Such a commitment is absent from Peloton’s terms of service, though Peloton does have a responsible disclosure policy, available on its support site, clearly stating that if researchers "have followed the instructions," Peloton "will not take any legal action against [them] regarding [their] report." While it is good that Peloton has a policy setting clear technical guidelines for disclosing vulnerabilities, the disclosure policy contradicts elements of the terms of service. For example, the kinds of tests forbidden by the terms of service seem to be allowable under the responsible disclosure policy. Yet the disclosure policy is not mentioned or linked to in the terms of service at all.

Despite the lack of access to a physical bike and potential restrictions limiting our technical testing, we were still able to review Peloton’s Terms of Service and Privacy Policy to see how those hold up to scrutiny. Broadly, Peloton would perform pretty well on the Digital Standard’s policy-based tests due to its comprehensive description of crucial company practices. The Digital Standard prioritizes transparency, and can reward companies that may have less-than-stellar data collection practices with partial points in the evaluation rubric if they clearly describe those practices. As an example, Peloton’s privacy policy has a detailed section titled “Types of Personal Information We Collect and How We Use It,” which addresses criteria contained in the Standard’s tests on Data Collection, Retention, Sharing, and Use, and outlines what types of user information are collected.

Although these clear disclosures would help Peloton pass the Digital Standard’s Data Collection test, that does not mean we aren’t concerned about the privacy and security implications of its data collection practices. Peloton’s products collect a user’s “voice and likeness,” location information, and fitness and health related information—all of which are quite sensitive. Similarly, the policies would win points for clear disclosures on the data sharing indicators, even though the company’s actual data sharing practices are quite broad. Specifically, the privacy policy lists several broad categories of third parties with which it shares user information, including vendors and service providers, business partners, and third-party marketers. Peloton also discloses that it shares user information with government or legal authorities as required by law, which is another disclosure the Digital Standard tests for.

While a detailed privacy policy may earn good marks from the Digital Standard due to its transparency, the company behaviors it outlines can still pose privacy risks. However, there is a reason why the Standard rewards transparency. Clear disclosures are helpful—especially with products that collect so much sensitive personal data—because they allow users to make informed choices on whether or how they use a product. Transparency also allows users (or White House cybersecurity experts) to take steps to mitigate any privacy or security concerns by, for example, choosing not to allow certain permissions or sensors on the device.

Ultimately, Peloton’s limitations on technical testing make it hard for us to do more than speculate about what exactly could happen if someone attempted to hack a Peloton bike owned by the President of the United States, and located in a building that is a dream cyber target for all sorts of nefarious actors. The sensors in the bike are especially concerning, but if the President and the First Lady want to keep up with their rides then it is probably possible to at least remove the camera and microphone, and connect the bike to the internet in a way that isolates it from other networks within the White House.
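Isolating an IoT device like the bike from the rest of a network is commonly done with a separate VLAN or guest network plus firewall rules. As a rough sketch only, assuming a hypothetical Linux-based router where the bike sits on a dedicated interface (the interface names `iot0`, `lan0`, and `wan0` are placeholders, not a real configuration), rules like these would let the device reach the internet while blocking it from internal hosts:

```shell
# Illustrative iptables rules for a hypothetical Linux router.
# Interface names (iot0, lan0, wan0) are placeholders.

# Let the isolated IoT segment reach the internet...
iptables -A FORWARD -i iot0 -o wan0 -j ACCEPT
# ...and receive replies to connections it initiated.
iptables -A FORWARD -i wan0 -o iot0 -m state --state ESTABLISHED,RELATED -j ACCEPT

# Block all traffic between the IoT segment and the trusted LAN, both directions.
iptables -A FORWARD -i iot0 -o lan0 -j DROP
iptables -A FORWARD -i lan0 -o iot0 -j DROP
```

With rules along these lines, a compromised device on the isolated segment could still phone home, but it could not probe or attack machines on the trusted network.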

President Biden’s voice and likeness, not to mention his fitness and health information, are more sensitive than those of an average Peloton user. That doesn’t mean, however, that average users deserve any less privacy protection. The 3.1 million other Peloton users who, unlike the White House, may not have experts on hand to harden their bikes against cybersecurity risks, would all benefit from the fruits of rigorous standardized testing with tools like the Digital Standard. Companies should be expected to provide real transparency about privacy and security best practices, as well as explicit protections for good faith researchers. This kind of information can help consumers make more informed choices when picking exercise equipment, or any other smart device. And as good information about a device’s security and privacy features becomes more common, more consumers will seek it out when purchasing products, which in turn should give the companies that make these products an incentive to invest in following privacy and security best practices.

May 12, 2021: This article has been updated to discuss Peloton's responsible disclosure policy. Our researchers were initially unable to find the policy on Peloton's website.
