Apple and others have designed products with so-called “end-to-end encryption,” meaning that a message between two users can be decrypted only by those users. By contrast, ordinary text messages are unencrypted by default, making them available to any federal law enforcement or intelligence agency that requests them. The idea of end-to-end encryption is that companies could provide only metadata and ciphertext. They’d be physically unable to produce the requested plaintext.
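A toy sketch makes the point concrete. This is deliberately not real cryptography (a one-time pad stands in for the actual protocols, and the names are invented for illustration); it just shows that when only the two endpoints hold the key, the provider relaying the message handles nothing but ciphertext:

```python
import secrets

# Toy illustration, NOT real cryptography: a one-time pad shared only by
# the two endpoints. XOR with a truly random key of equal length.
def encrypt(key: bytes, plaintext: bytes) -> bytes:
    assert len(key) >= len(plaintext)
    return bytes(k ^ p for k, p in zip(key, plaintext))

decrypt = encrypt  # XOR is its own inverse

message = b"meet at noon"
key = secrets.token_bytes(len(message))  # known only to the two users

# The provider in the middle stores and forwards only this:
ciphertext = encrypt(key, message)

assert ciphertext != message                 # the relay sees gibberish
assert decrypt(key, ciphertext) == message   # the recipient recovers the text
```

Served with a court order, the company could hand over `ciphertext` and delivery metadata, but without `key` it cannot produce the plaintext.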
In response, government officials invoked terrorism. The talking point is that without access to all communications at all times, the secret investigations keeping us safe would “go dark,” and the terrorists could plot in secret. Who knows what they’d do?
Unfortunately, the law hasn’t kept pace with technology, and this disconnect has created a significant public safety problem. We call it “Going Dark,” and what it means is this: Those charged with protecting our people aren’t always able to access the evidence we need to prosecute crime and prevent terrorism even with lawful authority. We have the legal authority to intercept and access communications and information pursuant to court order, but we often lack the technical ability to do so…And both real-time communication and stored data are increasingly encrypted.
The Harvard study sought to refute the talking point. It argues that “going dark” misrepresents the overall situation. As discussed in an earlier post, the government has literally invested in using the “Internet of Things” for surveillance. That effort is likely to open up so many opportunities for remotely activating microphones, cameras, and other sensors that a few encrypted texts won’t make much difference. Furthermore, using cryptography is difficult, and the business models of many companies rely on access to the contents of users’ communications. For example, Google scans the contents of emails in order to provide targeted advertising. Encrypting all emails would be self-defeating for them, as a business.
Those are the main conclusions of the report, which is the result of a series of discussions among many participants. The signatories endorse its “general viewpoints and judgments” without agreeing on every detail. The more interesting viewpoints, which not everyone could endorse, appear in a set of three appendices. The appendices have gotten less attention than the main findings, and they come closer to arguing that things should go dark.
The first is by Susan Landau, who makes an important point about BYOD (Bring Your Own Device):
Each terrorist attack grabs headlines, but the insidious theft of U.S. intellectual property – software, business plans, designs for airplanes, automobiles, pharmaceuticals, etc. – by other nations does not. The latter is the real national-security threat and a strong reason for national policy to favor ubiquitous use of encryption…
There was an era when Blackberrys were the communication device of choice for the corporate world; these devices, unlike the recent iPhones and Androids, can provide cleartext of the communications to the phone’s owner (the corporation for whom the user works). Thus businesses favored Blackberrys.
But apps drive the phone business. With the introduction of iPhones and Androids, consumers voted with their hands. People don’t like to carry two devices, and users choose to use a single consumer device for all communications. We have moved to a world of BYOD. In some instances, e.g., jobs in certain government agencies, finance, and the Defense Industrial Base, the workplace can require that work communications occur only over approved devices. But such control is largely ineffective in most work situations. So instead of Research in Motion developing a large consumer user base, the company lost market share as employees forced businesses to accept their use of personal devices for corporate communications. Thus access to U.S. intellectual property lies not only on corporate servers – which may or may not be well protected – but on millions of private communication devices.
In other words, national security may depend on the security of communications channels also used by terrorists. Math works the same for everyone, for better or worse.
Landau goes on to imply that corporate application security is, in essence, national security:
There are, after all, other ways of going after communications content than providing law enforcement with “exceptional access” to encrypted communications. These include using the existing vulnerabilities present in the apps and systems of the devices themselves. While such an approach makes investigations more expensive, this approach is a tradeoff enabling the vast majority of communications to be far more secure.
In his appendix to the Harvard paper, Bruce Schneier makes a complementary point:
Ubiquitous encryption protects us much more from bulk surveillance than from targeted surveillance. For a variety of technical reasons, computer security is extraordinarily weak. If a sufficiently skilled, funded, and motivated attacker wants in to your computer, they’re in. If they’re not, it’s because you’re not high enough on their priority list to bother with. Widespread encryption forces the listener – whether a foreign government, criminal, or terrorist – to target. And this hurts repressive governments much more than it hurts terrorists and criminals.
As always, as the NSA understands very well, the issue is defense in depth. Even with the best encryption, it’s still possible for an attacker to guess your key. Risk management is understanding the difference between what’s possible and what’s probable, and taking steps to make problems less probable. Attackers prefer scenarios where success is probable.
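Back-of-the-envelope arithmetic shows how wide that gap between possible and probable is. The figures below are illustrative assumptions, not from the report: a 128-bit key and an implausibly generous attacker testing a trillion keys per second.

```python
# Illustrative arithmetic on "guessing your key" (assumed numbers, not
# from the report). Brute force is possible in principle, not probable.
keyspace = 2 ** 128                 # number of possible 128-bit keys
guesses_per_second = 1e12           # a very generous assumed attacker
seconds_per_year = 365 * 24 * 3600

# On average the right key turns up halfway through the search.
expected_years = (keyspace / 2) / guesses_per_second / seconds_per_year
print(f"{expected_years:.1e} years on average")  # on the order of 10**18 years
```

Guessing the key is “possible” in exactly the sense that matters to risk management: the probability is so small that a rational attacker targets the weak endpoint software instead.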