Some of the most perplexing and frustrating experiences technologists have are with politics and social policy. There are issues for which overwhelming data and scientific analyses support a position, yet value systems based on economics, religion, and/or misinformation are relied upon instead, usually to the consternation of scientists and engineers. Examples abound: the anthropogenic contributions to climate change, the safety of childhood inoculations, and the nature of evolution. Amazingly, there are even those who are certain the Earth is flat! Furthermore, holding some of these positions also requires believing that scientists are either ignorant or corrupt.
Computing is not immune to these conflicts. One that is currently playing out involves encryption, and what (if anything) should be done to regulate it. Some officials involved in law enforcement and in government are concerned about the potential impact of encryption and wish to restrict how and where it can be used. Many computing professionals have a different set of views, and stress that restrictions to weaken encryption will be much more harmful than helpful.
The following letter was published in the Letters to the Editor of the June 2016 CACM (http://cacm.acm.org/magazines/2016/6/202652).
I was disappointed by Eugene H. Spafford's column "The Strength of Encryption" (Mar. 2016) in which Spafford conflated law enforcement requests for access to the contents of specific smartphones with the prospect of the government requiring backdoors through which any device could be penetrated. These are separate issues. Even if the methods the FBI ultimately used to unlock a particular Apple iPhone 5C earlier this year are too elaborate for the hundreds of encrypted or code-protected phones now in police custody, the principle that it is a moral if not legal responsibility for those with the competence to open the phones to do so would still be relevant.
Unlocking an individual phone would not legally compel a backdoor into all Apple devices. Rather, Apple would have to create and download into a particular target phone only a version of iOS that does two things: return to requesting password entry after a failed attempt, without invoking the standard iOS delay-and-attempt-count code, and allow attempts at guessing the correct password to be submitted electronically rather than through physical taps on the phone's keypad. The first is clearly trivial, and the second is, I expect, easily achieved.
The FBI would then observe, at an Apple facility, the modified iOS being downloaded and be able to run multiple brute-force password attempts against it. When the phone is eventually unlocked, the FBI would have the former user's correct password. Apple could then reload the original iOS, and the FBI could take away the phone and the password and access the phone's contents without further Apple involvement.
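The exhaustive password search Marquart envisions can be sketched in a few lines of Python. This is a minimal illustration only: the `try_password` callback stands in for the hypothetical electronic-submission interface he describes, and the sketch assumes a short numeric passcode with the delay-and-attempt-count code disabled; real iOS unlocking involves hardware-backed key derivation that this toy ignores.

```python
from itertools import product

def brute_force_passcode(try_password, length=4, digits="0123456789"):
    """Submit every numeric passcode of the given length until one is accepted.

    try_password is the assumed electronic-submission interface: it returns
    True for the correct passcode and, with lockouts and delays disabled,
    can be called as fast as the hardware allows.
    """
    for attempt in product(digits, repeat=length):
        candidate = "".join(attempt)
        if try_password(candidate):
            return candidate
    return None  # passcode not found in the search space

# Toy stand-in for the modified phone: accepts one hard-coded passcode.
secret = "7294"
found = brute_force_passcode(lambda guess: guess == secret)
```

A 4-digit passcode has only 10,000 candidates, and even a 6-digit one has at most a million, which is why removing the inter-attempt delay and the attempt counter makes the search practical.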
No backdoor would have been released. No existing encryption security would have been compromised. Other law-enforcement agencies, armed with judicial orders, would likewise expect compliance and should receive it.
The secondary argument that, should Apple comply, authoritarian regimes worldwide would demand the same sort of compliance from Apple, as well as from other manufacturers, is a straw man. Since Apple and other manufacturers, as well as researchers, have acknowledged they are able to gain access to the contents of encrypted phones, other regimes are already able to make such demands, independent of the outcome of any specific case.
R. Gary Marquart
My column was written and published before the FBI vs. Apple lawsuit occurred and was on the general issue of encryption strength and backdoors. Nowhere in it did I mention either Apple or the FBI. I also made no mention of "unlocking" cellphones, iOS, or passwords. I am thus unable to provide any reasonable response to Marquart's objections to items that were not in it.
Eugene H. Spafford
West Lafayette, IN