Apple Privacy

The issues that have surfaced in the legal battle between Apple and the FBI are larger than the question of the rights of the U.S. government versus the rights of U.S. residents. The case speaks to the question of what rights all people have to protect their persons and property. The outcome of the argument will have a profound impact on everyone, regardless of nationality. My position is that without privacy there can be no real security for anyone, anywhere.

The FBI has demanded that Apple provide the agency with access to data stored on an iPhone possibly used in connection with the terrorist attack that took place Dec. 2, 2015, in San Bernardino. Apple has refused to comply. I agree with Apple's position.

It is our responsibility as developers to safeguard the information that users provide our software. It is an unwritten pact: The users provide information, the developers provide some useful function based on that information, and we developers have the responsibility to protect that information. There are obvious and important reasons why users expect this, including:

  • Financial value of the information
  • Social value of the information

If we are negligent with user information, the users can experience incalculable loss and frustration. What would be the cost to you if you lost your phone and the person who found it was able to access your bank account and transfer funds from it? Imagine the nightmare if your lost phone revealed when you were going to be vacationing away from home. Think of the harm that could be done by a schoolyard bully who had access to his victim's inner thoughts and fears as recorded on a smart phone.

This is why responsible developers never build backdoors into the systems they develop. A backdoor is a door for anybody who wants to get in. Once it exists, both the good and bad will enter through it.
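To make the point concrete, consider the following deliberately simplified sketch in Swift. The function and the master value are hypothetical, invented purely for illustration; this is not Apple's code or any real unlock routine. It shows what a backdoor amounts to in practice: a second, hard-coded way in that works on every device for anyone who learns it.

    import CryptoKit
    import Foundation

    // Hypothetical unlock check, for illustration only.
    func unlock(passcodeAttempt: String, storedPasscodeHash: String) -> Bool {
        // Legitimate path: only the user's own passcode opens the device.
        let attemptHash = SHA256.hash(data: Data(passcodeAttempt.utf8))
            .map { String(format: "%02x", $0) }
            .joined()
        if attemptHash == storedPasscodeHash {
            return true
        }

        // Backdoor path: one hard-coded value opens every device. Once it
        // leaks -- through reverse engineering, a breach or an insider --
        // the door built for investigators is a door for everyone.
        if passcodeAttempt == "MASTER-OVERRIDE-0000" {   // hypothetical backdoor
            return true
        }
        return false
    }

The weakness is structural: the code has no way to tell an investigator with a warrant from a thief with a stolen phone.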

It is evident that the U.S. government is using this case to exploit an emotionally charged situation, garner sympathy and, with that sympathy, set a precedent that goes far beyond the case at hand. Law-enforcement officials have admitted that the information on the phone is probably of little value to them.[1] Moreover, the FBI has already had the chance to leverage iCloud backups of the device.[2] It also has call, text and e-mail metadata from the device. In addition, the terrorists used and destroyed another phone that was probably instrumental in the planning of the attack, whereas the phone currently in the FBI's possession was probably not used in the planning, since it was a work phone owned by the shooter's employer, San Bernardino County.

Although the FBI has stated that the backdoor is needed only for this single iPhone, the agency currently holds at least a dozen phones that it hopes to open in conjunction with other cases.[3] That suggests that the FBI's true motivations are much larger than stated and that the agency is essentially trying to force Apple to create a backdoor for all iPhones.

By requiring this backdoor, the U.S. government gives all other governments, including autocratic states such as China, Russia and Iran, tacit permission to require similar backdoors on all such products. Not all governments are as kind and law-abiding as ours. In some countries, this capability undoubtedly would lead to the persecution, imprisonment and death of those who disagree with the authorities. Backdoors strengthen despotism.

An equally valid concern is that if backdoors exist, they will be discovered and exploited by criminals and vandals. With 12 million identity thefts taking place every year in the United States alone, creating this backdoor would only accelerate the trend by enabling identity theft from stolen phones. The hacks of the Office of Personnel Management, the Internal Revenue Service and other nationally recognized organizations were all caused by slack security adopted for the sake of convenience. Without the privacy afforded by good software, there will be no security from identity theft.

More than 2 million smart phones were stolen in the United States in 2014, according to Consumer Reports, which annually surveys consumers regarding smart-phone thefts.[4] Apple has been making great progress in preventing the reuse of stolen phones, but this backdoor would bring new value to stolen iPhones.

When we lock our keys in the car, it can be an expensive and inconvenient proposition to get back in. When we lose the passcode to our phone, it should be similarly expensive and inconvenient. That is the cost of keeping out the criminal element. In encryption technology, it is critical that algorithms not contain backdoors and that companies that produce the technology not be required to escrow keys for law-enforcement use.
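To illustrate the alternative, here is a minimal sketch in Swift of passcode-derived encryption. It is a simplified illustration with invented names, not how iOS actually implements data protection; real systems use slow, salted key-derivation functions such as PBKDF2 and hardware-bound secrets. The structural point is that the key exists only while the user supplies the passcode, so there is no stored key for a company to escrow or a court to demand.

    import CryptoKit
    import Foundation

    // Derive the encryption key from the passcode at the moment of use.
    // The key is never written down, so there is nothing to escrow.
    func deriveKey(passcode: String, salt: Data) -> SymmetricKey {
        let digest = SHA256.hash(data: salt + Data(passcode.utf8))
        return SymmetricKey(data: digest)
    }

    // Encrypt a note with a key derived from the passcode.
    func encryptNote(_ note: String, passcode: String, salt: Data) throws -> Data {
        let box = try AES.GCM.seal(Data(note.utf8),
                                   using: deriveKey(passcode: passcode, salt: salt))
        return box.combined!   // nonce + ciphertext + authentication tag
    }

    // Decryption succeeds only when the same passcode is supplied again.
    func decryptNote(_ sealed: Data, passcode: String, salt: Data) throws -> String {
        let box = try AES.GCM.SealedBox(combined: sealed)
        let plain = try AES.GCM.open(box, using: deriveKey(passcode: passcode, salt: salt))
        return String(decoding: plain, as: UTF8.self)
    }

Lose the passcode and the data stays locked, for the owner and for everyone else. That inconvenience is exactly the security the rest of this argument depends on.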

We've already seen that governments are shamefully bad at protecting data. Any backdoor into commonly used encryption will quickly be found and exploited by unsavory governments, criminals and vandals. Additionally, the existence of backdoors will only drive the informed, whether terrorists or others concerned about privacy, to use "unauthorized" encryption algorithms that are truly impenetrable, thereby rendering moot the entire reason for the backdoor. The bad guys will still have access to privacy, and the law-abiding will not.

Rather than forcing all devices to have built-in security weaknesses, the FBI and the U.S. government should be helping citizens protect themselves and secure their own information. It is far more likely that we will fall victim to identity theft,[5] data theft and other technological crimes than to acts of terrorism on U.S. soil.

If we aren't allowed to securely hold our personal information on these increasingly personal devices, we will have neither security nor privacy.

About the Author: Jack Cox has over a decade of experience helping Fortune 500 clients build mobile strategy through technology, security and cryptography. He is a software developer, systems architect, and Managing Director at CapTech where he is responsible for the firm's mobile software practice. Jack's love of software development and all things mobile has driven a career developing software for businesses of all sizes including large-scale transaction processing systems, embedded software, and smart-phone software. Jack co-authored the book 'Professional iOS Network Programming' (Wiley). He has been involved in several startups, holds multiple patents and frequently speaks nationally. Jack is based in CapTech's Richmond, Virginia office and helps clients both locally and across the US.


[1] National Public Radio. "San Bernardino Police Chief Sees Chance Nothing of Value on Shooter's iPhone." Feb. 26, 2016. Available at http://www.kunc.org/post/san-bernardino-police-chief-sees-chance-nothing-value-shooters-iphone#stream/0

[2] Chmielewski, Dawn. "FBI Says Resetting San Bernardino Shooter's Apple ID Password Not a Screwup." Recode. Feb. 21, 2016. Available at http://recode.net/2016/02/21/fbi-says-resetting-san-bernardino-shooters-apple-id-password-not-a-screwup/.

[3] Waddell, Kaveh. "Apple Is Right: The FBI Wants to Break Into Lots of Phones." The Atlantic. Feb. 23, 2016. Available at http://www.theatlantic.com/technology/archive/2016/02/apple-is-right-the-fbi-wants-to-break-into-lots-of-phones/470607/.

[4] Consumer Reports. "Smartphone Thefts Drop as Kill Switch Usage Grows." June 11, 2015. Available at http://www.consumerreports.org/cro/news/2015/06/smartphone-thefts-on-the-decline/index.htm.

[5] Bureau of Justice Statistics. "16.6 million People Experienced Identity Theft in 2012." Dec. 12, 2013. Available at http://www.bjs.gov/content/pub/press/vit12pr.cfm.