The FBI & Apple: A Data Security War
As Apple’s standoff with the FBI plays out in public, we’re taking a closer look at what the proposed “backdoor” software could mean for our cyber security.
The FBI has demanded that Apple provide a new version of iOS to unlock the iPhone of Syed Farook, one of the two individuals who carried out the terrorist attack in San Bernardino in December 2015. The FBI argues that a version of iOS allowing it to circumvent Apple’s encryption on Farook’s iPhone would yield important information about how the attack was planned, and could help prevent future attacks. Apple opposes the measure, saying that creating such software would amount to a digital master key capable of unlocking any iPhone, thereby compromising the safety of all iPhone users. As it stands, Apple’s security features erase all data from an iPhone after 10 failed attempts to enter its numeric passcode.
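To make that last safeguard concrete, here is a minimal, purely illustrative sketch of an erase-after-ten-failures policy. The class and names are invented for this article and do not reflect Apple’s actual implementation:

```python
MAX_ATTEMPTS = 10  # hypothetical limit mirroring the behavior described above


class PasscodeLock:
    """Illustrative sketch of an erase-after-N-failures policy (not Apple's code)."""

    def __init__(self, passcode: str):
        self._passcode = passcode
        self.failed_attempts = 0
        self.wiped = False

    def try_unlock(self, guess: str) -> bool:
        if self.wiped:
            return False  # data already erased; nothing left to unlock
        if guess == self._passcode:
            self.failed_attempts = 0  # a correct code resets the counter
            return True
        self.failed_attempts += 1
        if self.failed_attempts >= MAX_ATTEMPTS:
            self._wipe()
        return False

    def _wipe(self):
        # On a real device this step would destroy the encryption key,
        # rendering the stored data unrecoverable rather than merely locked.
        self._passcode = None
        self.wiped = True
```

It is this kind of counter that the FBI’s requested software would need to disable, so that passcodes could be guessed without limit.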
Apple and the FBI are both capable institutions acting, by their own lights, for the greater good. Unfortunately, they have found themselves at odds over an ethical question with no single answer. This good-versus-good battle has implications for the future of the world’s digital security, and for the fight against a terrorist threat that shows no signs of tapering off.
The FBI’s request is simple: get us the info on this iPhone with one-time-use software, then destroy the software so it can’t be used again. Seems simple enough, right? Not according to Apple. Tim Cook recently penned a letter to all Apple customers outlining the company’s stance on the debate. His argument is clear: once this software is created, it cannot be destroyed. According to Cook, Apple would essentially be creating a “backdoor” capable of unlocking any iPhone in the world, allowing capable hackers to take whatever they like from anyone’s device.
The FBI has fired back by threatening to invoke the All Writs Act of 1789, which would legally compel Apple to create the software and hack the phone. In that case, Apple would have no choice but to comply, which it argues would put the security of iPhone data the world over at risk.
This ideological debate has raised important questions about cyber security and the possibility of a fundamental change in how it works. If Apple could create the software the FBI demands, what is to stop a highly skilled group of hackers from doing the same? And if they could, how safe is Apple’s encryption really? If the software were created, why couldn’t it be built in an ultra-secure environment and destroyed immediately afterwards, eliminating any chance of it leaking to bad actors? And if the FBI did hold this software, would it be so bad for the good guys to have access to everything on your iPhone, given the potential for thwarting future terror attacks the world over?
The court case is currently pending, and its outcome will set a precedent for innumerable aspects of modern-day cyber security.
What do you think: will this “backdoor” software be containable and controllable? Or are we heading towards a huge breach of personal security?