Although both the New York Times and CNET have stories on the hearing, I don't think either publication covered the important details (nor did they take the time to extract and post video clips).
The FBI is no longer calling for encryption backdoors
When Charlie Savage at the New York Times first broke the news last year that law enforcement officials were seeking more surveillance capabilities, it seemed quite clear that the FBI wanted to be able to access encrypted communications. Consider, for example, this statement by the General Counsel of the FBI:
"No one should be promising their customers that they will thumb their nose at a U.S. court order," Ms. Caproni said. "They can promise strong encryption. They just need to figure out how they can provide us plain text."That threat spooked the hell out of a lot of people in the privacy community and at technology companies. However, in the months that followed, rumors started to circulate that as a result of negotiations within the administration encryption was now "off the table."
Thus, many of us in Washington were not entirely surprised to see Ms. Caproni walk back her previous statements on encryption when she testified last Thursday:
Law enforcement (or at least, the FBI) has not suggested that CALEA should be expanded to cover all of the Internet...
But let's turn directly to encryption. Encryption is a problem. It is a problem we see for certain providers. It's not the only problem.
If I don't communicate anything else today, I want to make sure that everyone understands. This is a multifaceted problem. And encryption is one element of it, but it is not the entire element. There are services that are not encrypted, that do not have an intercept solution. So it's not a problem of them being encrypted. It's a problem of the provider being able to isolate the communications and deliver them to us in a reasonable way so that they are usable in response to a court order...
There are individual encryption problems that have to be dealt with on an individual basis. The solution to encryption is the part of CALEA which says that if the provider is encrypting the communications, and they have the ability to decrypt and give them in the clear, then they're obligated to do that. That's the basic premise: if it's provider-imposed encryption and the provider can give us communications in the clear, they should do that. We think that is the right model. No one's suggesting that Congress should re-enter the encryption battles that were fought in the late 90's and talk about sequestered keys or escrowed keys and the like. That is not what this is about.
Why the FBI doesn't really need encryption backdoors
The bit of CALEA that she is talking about is 47 USC 1002(b)(3), which states that:
A telecommunications carrier shall not be responsible for decrypting, or ensuring the government's ability to decrypt, any communication encrypted by a subscriber or customer, unless the encryption was provided by the carrier and the carrier possesses the information necessary to decrypt the communication.

US law is surprisingly clear on the topic of encryption -- companies are free to build it into their products, and if they don't have the decryption key, they can't be forced to deliver their customers' unencrypted communications or data to law enforcement agencies.
While Skype uses some form of proprietary end-to-end encryption (although it should be noted that the security experts I've spoken to don't trust it), and RIM uses encryption for its Enterprise Blackberry messaging suite, the vast majority of services that consumers use today are not encrypted. Those few services that do use encryption, such as Google's Gmail, only use it to protect the data in transit from the user's browser to Google's servers. Once Google receives it, the data is stored in the clear.
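To make that distinction concrete, here is a minimal sketch in Python of the difference between encrypting data only in transit (the Gmail model) and end-to-end encryption with a key the provider never sees. It uses the Fernet cipher from the third-party cryptography package purely as a stand-in for whatever cipher a real service would use, and the message is made up:

```python
# Toy illustration only; requires the third-party "cryptography" package.
from cryptography.fernet import Fernet

message = b"Meet me in Lisbon on Friday"

# The Gmail model: TLS protects the message on the wire, but the tunnel
# terminates at the provider, which then stores and can read the plaintext.
stored_by_provider = message
assert b"Lisbon" in stored_by_provider   # provider can scan, index, or disclose it

# End-to-end encryption: the user encrypts locally with a key the provider
# never receives, so the provider only ever stores ciphertext.
user_key = Fernet.generate_key()         # lives on the user's machine
ciphertext = Fernet(user_key).encrypt(message)
stored_by_provider = ciphertext
assert b"Lisbon" not in stored_by_provider   # nothing readable to hand over

# Only the key holder can recover the original message.
assert Fernet(user_key).decrypt(ciphertext) == message
```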
There is one simple reason for this, which I described in a law journal article last year:
It is exceedingly difficult to monetize a data set that you cannot look at. Google's popular Gmail service scans the text of individual emails, and algorithmically displays relevant advertisements next to the email. When a user receives an email from a friend relating to vacation plans, Google can display an advertisement for hotels near to the destination, rental cars or travel insurance. If those emails are encrypted with a key not known to Google, the company is unable to scan the contents and display related advertising. Sure, the company can display generic advertisements unrelated to the user's communications contents, but these will be far less profitable.
Google’s Docs service, Microsoft’s Hotmail, Adobe’s Photoshop Express, Facebook, and MySpace are all made available for free. Google provides its users with gigabytes of storage space, yet doesn’t charge a penny for the service. These companies are not charities, and the data centers filled with millions of servers required to provide these services cost real money. The companies must be able to pay for their development and operating costs, and then return a profit to their shareholders. Rather than charge their users a fee, the firms have opted to monetize their user’s private data. As a result, any move to protect this data will directly impact the companies’ ability to monetize it and thus turn a profit. Barring some revolutionary developments from the cryptographic research community, advertising based business models are fundamentally incompatible with private key encrypted online data storage services.
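To see why the two are incompatible, consider this toy sketch of a keyword-based ad matcher (the keywords and ad copy are hypothetical, and Fernet again stands in for a real cipher): it finds plenty to monetize in a plaintext mailbox and nothing at all once the user encrypts with a key the provider does not hold.

```python
from cryptography.fernet import Fernet

# Crude stand-in for the ad-targeting step: scan a stored email for
# keywords that advertisers have bid on (keywords and ads are made up).
AD_KEYWORDS = {
    b"vacation": "hotel and rental car ads",
    b"wedding": "photographer ads",
}

def pick_ads(stored_email: bytes) -> list:
    return [ad for keyword, ad in AD_KEYWORDS.items() if keyword in stored_email]

email = b"Booked the vacation flights, see you at the hotel"

# Stored in the clear: the provider can match keywords and sell targeted ads.
print(pick_ads(email))                        # ['hotel and rental car ads']

# Encrypted client-side: the provider sees only ciphertext, so the matcher
# finds nothing to monetize.
key = Fernet.generate_key()                   # held by the user, not the provider
print(pick_ads(Fernet(key).encrypt(email)))   # []
```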
Robert Scoble also addressed this very same issue last year, writing about the reasons why major location-based services have not adopted privacy-preserving technologies:

Well, there's huge commercial value in knowing where you're located and [service providers] just aren't willing to build really private systems where they won't be able to get at the location info. Think about a Foursquare where only your friends would be able to see where you were, but that Foursquare couldn't aggregate your location together with other people, or where it wouldn't be able to know where you are itself. They wouldn't be able to offer you deals near you when you check in, the way it does today.

The FBI knows that most services are not going to be using full end-to-end encryption, and as such, there is not much to be gained by fighting a public battle over encryption backdoors. In her testimony on Thursday, Ms. Caproni drove this point home:
We're suggesting that if the provider has the communications in the clear and we have a wiretap order, that the provider should give us those communications in the clear.
For example, Google for the last 9 months has been encrypting all Gmail. As it travels over the internet, it's encrypted. We think that's great. We also know that Google has those communications, and in response to a wiretap order, they should give them to us, in the clear.
Privacy by design vs. insecurity by design
In the report it issued in December, the Federal Trade Commission called on companies to embrace "privacy by design":
[C]ompanies should adopt a "privacy by design" approach by building privacy protections into their everyday business practices. Such protections include providing reasonable security for consumer data, collecting only the data needed for a specific business purpose, retaining data only as long as necessary to fulfill that purpose, safely disposing of data no longer being used, and implementing reasonable procedures to promote data accuracy.

Building encryption into products, turning it on by default, and using it to protect all data is the ultimate form of privacy by design. While the FTC is encouraging firms to embrace this philosophy, the FBI is betting that poor security will remain the default. Sure, a few individuals will know how to encrypt their data, but the vast majority will not. It is because of this that the FBI can avoid a fight over encryption. Why bother, when so little data is encrypted?
Consider Ms. Caproni's argument:
There will always be criminals, terrorists and spies who use very sophisticated means of communications that create very specific problems for law enforcement. We understand that there are times when you need to design an individual solution for an individual target. That's what those targets present. We're looking for a better solution for most of our targets, and the reality is I think sometimes we want to think that criminals are a lot smarter than they really are. Criminals tend to be somewhat lazy, and a lot of times, they will resort to what is easy.

So long as we have a solution that will get us the bulk of our targets, the bulk of criminals, the bulk of terrorists, the bulk of spies, we will be ahead of the game. We can't have to design individualized solutions, as though they were sophisticated targets who are self-encrypting and putting a very difficult encryption algorithm on, for every target we find, because not every target is using such sophisticated communications.

While I understand her perspective, the problem I have is that her description of criminals as "lazy" people who use technology that is "easy" also describes the vast majority of the general public. As such, for the FBI's plan to work, encryption technology needs to be kept out of the hands of the general public in order to keep it out of the hands of lazy criminals.
If encryption is off the table, what is the FBI after?
During the hearing, Ms. Caproni noted that both RIM and Skype are foreign companies and not subject to CALEA. She had ample opportunities to call out these companies and instead opted not to do so. As such, at least for now, it looks like the two firms may be safe.
So with Skype, RIM, and the general encryption issue off the table, you must be wondering: what exactly does the FBI want? From what I can gather, quite a few things, many of which impact privacy in a big way but will generate far less press than those higher-profile issues.
Ms. Caproni didn't name names at the hearing, but it is pretty easy to identify the companies and services that she and her colleagues are interested in.
- Real-time interception of cloud services. Google, Microsoft, Facebook and Twitter are all legally required to provide after-the-fact access to their customers' stored data, in response to a valid legal process. The law does not require them to provide real-time interception capabilities. What this means is that while the government can go to Google and ask for all searches conducted by a particular user, they can't ask for all future searches or Google Chat instant message communications. These companies are under intense pressure to provide such real-time, prospective access to user data.
- Voice services that do not connect to the public telephone network. Google and Facebook both offer in-network audio chat to their users (Google also offers video). Microsoft's Xbox 360 service, Blizzard and several other online video game platforms allow users to chat (and insult each other) while they play against other users online. At least from published information, I'm not aware of any of these companies offering interception capabilities, and so law enforcement agencies almost certainly want access to this.
- Virtual Private Network (VPN) services. These services, many of them paid, are increasing in popularity among users who want a bit of privacy when they surf. They enable users to browse the web on unsecured public WiFi networks without having to worry about hackers stealing their data; browse the web at home without having to worry about their broadband Internet Service Provider using Deep Packet Inspection technology to spy on them; access streaming content that is restricted by country (for example, allowing foreigners to watch Hulu, or US residents to watch the BBC); and download files from P2P networks without having to worry about Hollywood studios, record labels and porn companies suing them.
Many users turn to these commercial VPN services in order to obtain privacy online, and it is because of this that many services have strict no-logging policies. They do not know what their users are doing online, and don't want to know. However, many of these services are based in the US (or at least, have many servers in US datacenters), and could very easily keep logs if they were forced to do so.
What happens next?
Last week's hearing was just the first step in what will likely be a long battle. There will be more hearings, and eventually, the FBI will return with draft legislation. In the meantime, all the major tech companies in Silicon Valley will no doubt continue to engage in private, high-pressure negotiations with senior FBI officials, who will tell them they can avoid new legislation by voluntarily building new surveillance capabilities into their products.