Around the world, technology companies are engaged in a battle for your data. And as internet companies like Google and Facebook grow their revenue through a business model of tracking your every move on the internet and selling this data to advertisers, internet service providers (ISPs) like AT&T, Verizon, and Comcast are seeking ways to monetize your data as well. Just last week, a lobby group for ISPs sent a letter to the FCC proclaiming that users’ web browsing and app usage history shouldn’t be considered “sensitive information” and thus should not be subject to new regulatory guidelines issued in October that restrict the use of this information for advertising purposes.
Luckily for ISPs, these privacy regulations may never come to fruition, anyway. Recently, the FCC, under its new leadership, voted to freeze its new broadband privacy guidelines that would require companies to inform consumers about what information they collect, how it is used and shared, and what options consumers have to control how companies can use and share that information.
This matters because the FCC’s rollback of privacy guidelines is a blow to efforts aimed at increasing corporate transparency and public accountability around how companies handle users’ information.
One of the key components of the FCC’s broadband privacy rules was a requirement that companies give users clear notice about what information they collect, how it is used, and how or with whom it is shared. All companies should, at the very least, tell users what they are doing with their personal information. Different people have different privacy concerns, which is why transparency is so important: Individuals need to have enough information in order to make informed decisions about the privacy implications of their technology choices.
When we talk about the importance of privacy policies, one of the counterpoints people often lob is that users don’t even read terms of service documents or privacy policies. The assumption here is that the information people need is already there—if only they would take the time to read documents instead of hastily clicking “accept.” But while it might be true that the majority of users don’t read these policy documents, that doesn’t mean that companies are off the hook.
People don’t know what happens to their data online, and are increasingly concerned about it. In a Pew Research Study from 2014, 91 percent of adults surveyed agreed that consumers have lost control over how personal information is collected and used by companies. In a recent World Economic Forum survey, 60 percent of internet users in the United States reported that they were uncomfortable with the collection of their personal data, and 31 percent changed their behavior as a result, choosing not to use, or to stop using, certain technologies, sites, or services.
And what is one way companies could regain users’ trust, according to the survey? Greater transparency.
Ranking Digital Rights, an initiative affiliated with the Open Technology Institute at New America, created the Corporate Accountability Index to assess how well internet, mobile, and telecommunications companies are demonstrating a commitment to users’ rights, as well as to provide recommendations for how companies can do a better job of respecting users’ rights, regardless of what laws or regulations require. Unfortunately, the findings from our 2017 Index—launched this week and based on policy documents for 22 of the world’s most powerful internet, mobile, and telecommunications companies—were bleak: Companies’ disclosure, where it exists, fails to clearly explain to users what information companies collect about them, with whom companies might share this information, and what users can do about it—if anything.
Few companies give comprehensive accounts of all the types of information they collect about their users. Some companies only broadly refer to “personal information” in their policy documents, without detailing what the company actually means by this term, making it impossible for users to understand the scope of data that might be captured. RDR’s stance is that “user information” encompasses any information that identifies a user’s activities, including personal correspondence, user-generated content, account preferences and settings, log and access data, data about a user’s activities or preferences collected from third parties, and all forms of metadata.
Why? Because research shows that all of these types of information can become sensitive or personally identifiable, depending on how they are processed or grouped with other data. Multiple studies with different types of data have shown that anonymous data is rarely actually anonymous—researchers can connect “de-identified” data to individuals relatively easily. This is part of the reason why the FCC guidelines could have been useful: They required companies to have clear policies and further defined what types of information should be classified as “sensitive information,” including web browsing history and app usage.
What’s more, even fewer companies in our study disclosed what options individuals have to control what types of information the company collects about them or how it uses that information. The FCC privacy guidelines would have put some of this control back in consumers’ hands: They required broadband providers to obtain opt-in consent from consumers to use and share sensitive information, and to allow users to opt out of the use and sharing of non-sensitive information, meaning the companies could use and share non-sensitive information until the consumer tells them otherwise.
ISP lobbyists and the current FCC chairman have argued that these privacy regulations are unfair because they single out telecommunications companies, whereas “edge providers”—companies that run internet platforms and services, like Google or Facebook—aren’t regulated in the same way. Put differently, the argument is that companies like Google are already keeping records of users’ search history, so why can’t broadband providers also keep records of users’ browsing history? But this “race to the bottom” mentality misses the point, and it’s harmful from a privacy standpoint. Both types of companies should be more explicit about their practices for handling user information, and both should give users control over how this information is used, regardless of whether it’s required.
Users’ trust, not to mention their business, is on the line.