In a letter to the President last week, seemingly every relevant technology and civil liberties organization blasted the FBI’s argument that it should have the means to access and decrypt secure data and communications. The week before, members on the House Oversight & Government Reform Committee sharply criticized the Bureau on this same subject.
Bureau critics argue that any effort to grant the FBI access to encrypted systems could introduce flaws into those systems, possibly making everyone less secure. Instead of creating those potential flaws—an apparent nonstarter as a matter of policy and politics—the FBI should instead focus on ways to streamline the process of requesting and accessing the abundance of user data that is still available.
One approach is to make more systematic use of law enforcement guidelines. Some large tech companies already provide guidelines (see here, here, and here) that describe the types of user data that might be requested by law enforcement and describe the company process created to make a request.
The law does not require these guidelines. Rather, they are intended to expedite government requests and to decrease the compliance burden on large companies that might receive a significant number of requests.
So here’s one fix that might satisfy both the FBI and America’s tech firms: a regulatory or self-regulatory regime that requires or incentivizes small to midsize companies to provide these guidelines and link to them from their privacy policies.
This approach would have at least three key benefits.
First, it would aid the FBI by providing clarity, at a minimum, into the data collected by those small to midsize companies and into the process used to make requests. In cases where time is short, where targets quickly switch from one communications tool to another, and where the Bureau encounters new tools made by companies without robust compliance departments, such guidelines could be critical and lifesaving. For law enforcement, this would reduce one of the friction points created by rapidly evolving new communications platforms.
Second, it would provide greater transparency to users about what data is actually collected about them. Right now, companies are focused on providing transparency into the number of government data requests without providing real transparency to users or the government into what data may be requested. That approach has the benefit of making it seem like tech companies are trying to protect their users while at the same time not placing any of a company’s equities at stake. A self-regulatory or regulatory regime involving law enforcement guidelines would turn transparency on its head in a way that might have more meaningful benefits to both the government and users.
Third, this approach could have legitimate security benefits for small and midsize companies by creating a “forcing function” that would compel companies to develop good data hygiene earlier in the product development cycle. This would reduce risk and decrease longer-term compliance problems. A requirement that companies catalog the data that could be made available upon request would oblige companies to actually know what data might be made available upon request. That might not seem like such a difficult request until one considers how quickly small tech companies iterate on their products, using ad hoc decision processes that increase security and privacy risks over time. Currently, those risks aren’t addressed until an actual problem emerges.
Yet the chief objection to this approach is an ideological one. This proposal amounts to a tech mandate that would, to quote one privacy advocate with whom I spoke about the idea, “give the FBI a roadmap of every company’s user data.”
To this I responded, “Yes, that is exactly what I am talking about.”
I don’t see what would be wrong with providing such a roadmap to the FBI, to be used in those circumstances where (a) the FBI has followed appropriate legal processes; (b) the companies themselves perform due diligence on any data requests received from the FBI; and (c) the data in question is already being collected and stored for business purposes. In fact, to suggest that companies should be able to collect massive amounts of data about their users without giving clarity to law enforcement agencies about that data strikes me as a disingenuous argument.
This proposal doesn’t solve the FBI’s encryption problem and doesn’t help with those companies that don’t retain data about their users. But what it would do is facilitate access to the large amounts of data that are already collected and available.
There is no silver bullet to address the current challenges the FBI faces with modern communications. Indeed, traditional solutions to the FBI’s surveillance needs simply aren’t going to be viable today. Nonetheless, there might be a patchwork of solutions that could at least make the FBI’s job easier without making everyone else’s life harder. Requiring companies to produce law enforcement guidelines is one such solution.
Note: Marshall, in addition to being an Overt Action co-founder, is a senior staff analyst at Mozilla, where he works on data privacy and security.
Photo: The FBI’s Strategic Information and Operations Center, available at http://www.fbi.gov/news/galleries/2013-photo-gallery