The Department of Justice was in the process of using the courts to force Apple to help deactivate the security features on the iPhone of Syed Rizwan Farook, the shooter in the December 2015 San Bernardino massacre, when it discovered a way to access the data without Apple’s help. What the DOJ wanted was to force Apple to write software deactivating the “self-destruct” mechanism that erases the phone once a set number of unsuccessful passcode attempts has been reached.
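To make the mechanism concrete, here is a toy sketch of a retry-limited lock of the kind described above. This is my own illustration, not Apple’s implementation: on a real iPhone the limit is enforced in hardware and a wipe destroys the encryption keys themselves, rather than flipping a flag.

```python
MAX_ATTEMPTS = 10  # iOS offers an optional "erase data after 10 failed attempts" setting


class PasscodeLock:
    """Toy model of a retry-limited lock (illustrative only, not Apple's design)."""

    def __init__(self, passcode: str):
        self._passcode = passcode
        self._failed = 0
        self._wiped = False

    def try_unlock(self, guess: str) -> bool:
        if self._wiped:
            raise RuntimeError("device erased")
        if guess == self._passcode:
            self._failed = 0
            return True
        self._failed += 1
        if self._failed >= MAX_ATTEMPTS:
            # In the real mechanism, reaching the limit destroys the
            # encryption keys, making the stored data unrecoverable.
            self._wiped = True
        return False
```

The software the DOJ wanted would, in effect, remove the `MAX_ATTEMPTS` check so that an unlimited brute-force search of passcodes became possible.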
Apple’s view is that building software to disable its own security systems, which it markets as guaranteeing customers’ privacy, would be tantamount to building a backdoor that hackers could easily exploit. Apple insists its security systems are virtually impervious to hacking, so users can be assured their data is protected.
From a technical standpoint, Apple has a point. Forcing a product maker to make its product easier to break in order to support law enforcement sounds like a good idea until you realize it’s the same thing as leaving a set of spare keys at a store down the street. That sounds fine if you’ve lost your keys and know where to go in an emergency. Apple would be the “store” where your backdoor (pun intended) keys are kept, and as long as Apple could guarantee those keys are safe from theft, all is well.
What if the government came along and said, “not only do you have to make the keys, you have to give them to us whenever we need them”? Can you imagine the outcry if the government told every lock maker in America to keep a copy of every key for every lock in a central location, along with each owner’s name and the lock’s location, because it would make law enforcement investigations easier?
It seems as though the Senate, in its infinite wisdom, wants to do just that for software companies whose products use encryption. The proposed law would force software makers to hand over any backdoor keys to data when ordered to do so by a court. If a software maker has no “keys” (and most likely it wouldn’t), it would be compelled to create them and be “reasonably compensated.”
It’s one thing if the government has the means to gain entry into a facility in execution of a search warrant. The Fourth Amendment specifically says,
The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no warrants shall issue, but upon probable cause, supported by oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized.
It’s another thing when the government gets the thing it wants but can’t figure out how to get access to its contents.
Here’s a hypothetical…
Let’s say that you encrypt every piece of paper and every document in your house using ENIGMA (the machine used to encrypt German communications, whose breaking is a subplot of the movie The Imitation Game). One day the government comes to your house and takes all of your papers and your machine because of an embezzlement investigation at your grandmother’s grocery store. During the execution of the search warrant, you happen to be killed in a gun battle. This all seems rather bizarre, but stick with me here.
How should the government go about decrypting your entire collection of now-meaningless coded papers when you can no longer provide the keys? Should it force the maker of ENIGMA to provide services and instructions on how to break its own encryption?
In The Imitation Game, the writers focused on Alan Turing’s struggles with his limited social skills and his homosexuality while trying to solve the puzzle of ENIGMA. What most people don’t realize is that the machine Turing built did not “break” ENIGMA; rather, it exploited a few flaws in how ENIGMA was designed and used in order to calculate which key settings had been plugged into it.
In technical terms, the ENIGMA design was known to the Allies, and it had limitations with mathematical implications, most famously that the machine could never encrypt a letter to itself. In addition, the predictable content of German messages, such as routine weather reports, helped in attacking the code. This allowed Turing’s machine to find the cipher settings (keys) the Germans had used, which were then plugged into captured ENIGMA machines to decode German messages. After all, the Allies were looking for the keys, not for a backdoor or a way around the machine. No messages were decoded without the use of an ENIGMA machine itself.
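The best-known exploitation of that flaw is simple enough to sketch. Because ENIGMA never encrypted a letter to itself, codebreakers could slide a suspected plaintext (a “crib,” such as a stock weather-report phrase) along the ciphertext and immediately rule out every alignment where a crib letter lined up with the identical ciphertext letter. The function below is my own illustration of that elimination step, not Bletchley Park’s actual machinery:

```python
def possible_crib_positions(ciphertext: str, crib: str) -> list[int]:
    """Return the offsets at which a crib could align with the ciphertext.

    ENIGMA's reflector guaranteed that no letter ever encrypted to itself,
    so any alignment where a crib letter matches the ciphertext letter in
    the same position is impossible and can be discarded outright.
    """
    positions = []
    for offset in range(len(ciphertext) - len(crib) + 1):
        window = ciphertext[offset:offset + len(crib)]
        if all(c != p for c, p in zip(window, crib)):
            positions.append(offset)
    return positions
```

Each offset ruled out this way meant entire families of key settings never had to be tested, which is what made searching the remaining key space feasible for Turing’s machine.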
There were many more setbacks that were never revealed, and a lot of artistic license was taken in the story. The end result, however, was the same: Turing essentially built a machine to find ENIGMA’s keys, not to break the cipher or create another way around it.
That leads us to the issue at hand: if the government has my documents for its investigation but cannot figure out how to use the machine that secured their contents, should it construct laws that effectively force companies to sell consumers products with holes in them? Proponents of such laws argue that no one is above the law, and that criminals and terrorists should not be able to exploit technology to advance their enterprises.
On the other hand, a law such as this gives the government the power to control how much security you can have, even from the government itself. If I use encryption to protect my sensitive information from corporate spying, what happens when the company I bought the software from, which now holds a backdoor into my data, is itself hacked?
Let’s not forget that the government may decide, via secret tribunal (see the Patriot Act), to go into the cloud drive where I keep some of my files and download and decrypt them without my knowledge. It can already monitor your phones. Under this new law, if agents decide they want into your phone, they won’t have to ask you. All they would need to do is get a search warrant, connect your phone to a device, punch a few buttons, and now they know everything you did on that phone. And what happens to the data on your phone that is of no use to them?
Freedom means the government must do its due diligence in protecting the rights of its citizens while balancing the concerns of security. Forcing companies to weaken the very products most consumers use to protect their own privacy is not protecting citizens; it’s giving the government the ability to limit your ability to protect yourself. Yes, criminals, terrorists, and bad people are going to do bad things and use technology. That goes with being a country that extols the virtues of freedom and liberty.
In the end, the FBI found a solution and the contents of the phone were accessed. It appears a tool was purchased that either disabled the lock or went around it. Without giving away the secret, the government found a way past the obstacle, and the investigation continues. As far as I’m concerned, that’s the better approach: let the government figure it out on its own. If it succeeds in a given case, so be it, and people keep the ability to protect themselves.