Earlier this month, SC magazine ran an article about this tweet from Joseph Bonneau at the Electronic Frontier Foundation:
Email from Phil Zimmerman: “Sorry, but I cannot decrypt this message. I don’t have a version of PGP that runs on any of my devices”
“The irony is not lost on me,” he says in this article at Motherboard, which is about PGP’s usability problems. Jon Callas, former Chief Scientist at PGP, Inc., tweeted that “We have done a good job of teaching people that crypto is hard, but cryptographers think that UX is easy.” If you’re a cryptographer, it’s easy to forget that 25% of people have math anxiety. Cryptographers are used to writing timing-attack-resistant implementations of AES. You can get a good sense of what that involves from this explanation of AES, which is illustrated with stick figures; even in that friendly form, the steps are intricate. In the general population, surprisingly few people can follow complex written instructions at all.
All the way back in 1999, there was a paper presented at USENIX called “Why Johnny Can’t Encrypt: A Usability Evaluation of PGP 5.0.” It included a small study with a dozen reasonably intelligent people:
The user test was run with twelve different participants, all of whom were experienced users of email, and none of whom could describe the difference between public and private key cryptography prior to the test sessions. The participants all had attended at least some college, and some had graduate degrees. Their ages ranged from 20 to 49, and their professions were diversely distributed, including graphic artists, programmers, a medical student, administrators and a writer.
The participants were given 90 minutes to learn and use PGP, under realistic conditions:
Our test scenario was that the participant had volunteered to help with a political campaign and had been given the job of campaign coordinator (the party affiliation and campaign issues were left to the participant’s imagination, so as not to offend anyone). The participant’s task was to send out campaign plan updates to the other members of the campaign team by email, using PGP for privacy and authentication. Since presumably volunteering for a political campaign implies a personal investment in the campaign’s success, we hoped that the participants would be appropriately motivated to protect the secrecy of their messages…
After briefing the participants on the test scenario and tutoring them on the use of Eudora, they were given an initial task description which provided them with a secret message (a proposed itinerary for the candidate), the names and email addresses of the campaign manager and four other campaign team members, and a request to please send the secret message to the five team members in a signed and encrypted email. In order to complete this task, a participant had to generate a key pair, get the team members’ public keys, make their own public key available to the team members, type the (short) secret message into an email, sign the email using their private key, encrypt the email using the five team members’ public keys, and send the result. In addition, we designed the test so that one of the team members had an RSA key while the others all had Diffie-Hellman/DSS keys, so that if a participant encrypted one copy of the message for all five team members (which was the expected interpretation of the task), they would encounter the mixed key types warning message. Participants were told that after accomplishing that initial task, they should wait to receive email from the campaign team members and follow any instructions they gave.
If that’s mystifying, that’s the point. You can get a very good sense of how you would’ve done, and of how much has changed in 16 years, by reading the Electronic Frontier Foundation’s guide to using PGP on a Mac.
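Stripped of the user interface, the workflow the participants had to discover can be sketched in a few lines. This is a toy model only: textbook RSA with tiny primes stands in for the real public-key algorithms, a one-byte XOR stands in for the symmetric cipher, and the names and message are made up. Nothing here is secure or part of actual PGP; it just shows the shape of the task — one session key encrypts the body, and that session key is wrapped separately under each recipient’s public key.

```python
import secrets

def toy_keypair(p, q, e=17):
    """Textbook RSA pair from two small primes (toy only, not secure)."""
    n, phi = p * q, (p - 1) * (q - 1)
    d = pow(e, -1, phi)                 # modular inverse (Python 3.8+)
    return (n, e), (n, d)

def xor_cipher(data, key_byte):
    """Stand-in 'symmetric cipher': XOR every byte with one key byte."""
    return bytes(b ^ key_byte for b in data)

def encrypt_to_team(message, public_keys):
    """Encrypt the body once, then wrap the session key per recipient."""
    session_key = secrets.randbelow(255) + 1   # 1..255, below every toy modulus
    body = xor_cipher(message, session_key)
    wrapped = {name: pow(session_key, e, n)
               for name, (n, e) in public_keys.items()}
    return wrapped, body

def decrypt(wrapped, body, name, private_key):
    """A recipient unwraps the session key with their own private key."""
    n, d = private_key
    return xor_cipher(body, pow(wrapped[name], d, n))

# Five hypothetical team members, echoing the study's scenario.
primes = [(61, 53), (67, 71), (73, 79), (83, 89), (97, 101)]
names = ["manager", "alice", "bob", "carol", "dave"]
keys = {name: toy_keypair(p, q) for name, (p, q) in zip(names, primes)}
pubs = {name: pub for name, (pub, _) in keys.items()}

wrapped, body = encrypt_to_team(b"Proposed itinerary: ...", pubs)
assert all(decrypt(wrapped, body, name, priv) == b"Proposed itinerary: ..."
           for name, (_, priv) in keys.items())
```

Because each recipient’s key is wrapped independently, a real PGP message can mix recipients with different key types in one encryption pass — which is exactly where the study’s “mixed key types” warning came from.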
Things went wrong immediately:
Three of the twelve test participants (P4, P9, and P11) accidentally emailed the secret to the team members without encryption. Two of the three (P9 and P11) realized immediately that they had done so, but P4 appeared to believe that the security was supposed to be transparent to him and that the encryption had taken place. In all three cases the error occurred while the participants were trying to figure out the system by exploring.
Cryptographers are absolutely right that cryptography is hard:
Among the eleven participants who figured out how to encrypt, failure to understand the public key model was widespread. Seven participants (P1, P2, P7, P8, P9, P10 and P11) used only their own public keys to encrypt email to the team members. Of those seven, only P8 and P10 eventually succeeded in sending correctly encrypted email to the team members before the end of the 90 minute test session (P9 figured out that she needed to use the campaign manager’s public key, but then sent email to the entire team encrypted only with that key), and they did so only after they had received fairly explicit email prompting from the test monitor posing as the team members. P1, P7 and P11 appeared to develop an understanding that they needed the team members’ public keys (for P1 and P11, this was also after they had received prompting email), but still did not succeed at correctly encrypting email. P2 never appeared to understand what was wrong, even after twice receiving feedback that the team members could not decrypt his email.
Another of the eleven (P5) so completely misunderstood the model that he generated key pairs for each team member rather than for himself, and then attempted to send the secret in an email encrypted with the five public keys he had generated. Even after receiving feedback that the team members were unable to decrypt his email, he did not manage to recover from this error.
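The most common failure — encrypting with your own public key instead of the recipient’s — is easy to reproduce with the same kind of textbook-RSA toy (tiny primes, no padding, purely illustrative, not how real PGP implements anything):

```python
def toy_keypair(p, q, e=17):
    """Textbook RSA pair from two small primes (toy only, not secure)."""
    n, phi = p * q, (p - 1) * (q - 1)
    return (n, e), (n, pow(e, -1, phi))

sender_pub, sender_priv = toy_keypair(61, 53)        # the participant's own key
recipient_pub, recipient_priv = toy_keypair(67, 71)  # a team member's key

secret = 42
# The mistake: encrypting under the sender's *own* public key.
n, e = sender_pub
c = pow(secret, e, n)

# The team member's private key yields garbage...
n_r, d_r = recipient_priv
assert pow(c, d_r, n_r) != secret
# ...only the sender's own private key recovers the message.
n_s, d_s = sender_priv
assert pow(c, d_s, n_s) == secret
```

Nothing in the math warns you: both operations run without error, and only the fact that the output is nonsense tells the recipient something went wrong — which is precisely the feedback P2 received twice and never understood.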
The user interface can make it even harder:
P7 gave up on using the key server after one failed attempt in which she tried to retrieve the campaign manager’s public key but got nothing back (perhaps due to mis-typing the name). P1 spent 25 minutes trying and failing to import a key from an email message; he copied the key to the clipboard but then kept trying to decrypt it rather than import it. P12 also had difficulty trying to import a key from an email message: the key was one she already had in her key ring, and when her copy and paste of the key failed to have any effect on the PGPKeys display, she assumed that her attempt had failed and kept trying. Eventually she became so confused that she began trying to decrypt the key instead.
This is all so frustrating and unpleasant that people simply won’t use PGP. We actually have a way of encrypting email that, as far as anyone knows, even the NSA can’t break. In practice, the human factors turn out to be an even harder problem than stumping the NSA’s cryptographers!