In December 2015, the FBI seized an iPhone belonging to one of the San Bernardino terrorism suspects, Syed Rizwan Farooq, but encryption has reportedly prevented the authorities from gaining access to its contents. On February 16, 2016, a federal magistrate in California ordered Apple to create a program that would disable key security features (in geek terms, give the FBI backdoor access to the device) and install it on Farooq’s iPhone in order to break past the encryption.
In layperson’s terms, the code would disable a key security feature: the limits on how often an attacker can incorrectly guess an iPhone passcode. This would allow the FBI to crack the passcode on Farooq’s phone via brute force, trying every possible combination in sequence until the right one is found. A six-digit numeric passcode has only a million possible combinations. The FBI needs Apple to build a custom version of iOS that would let it enter passcode guesses electronically, as fast as the iPhone can process them. This would fundamentally undermine key security features of the iPhone, as explained in Apple CEO Tim Cook’s public letter to customers.
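To put that in perspective, here is a rough back-of-the-envelope sketch in Python (mine, not Apple’s or the FBI’s; the ~80 ms per-guess cost is an assumption based on the hardware-bound key derivation Apple describes in its iOS Security guide):

```python
# Back-of-the-envelope sketch: why removing retry limits makes a
# six-digit passcode brute-forceable. Assumption (not from the article):
# each guess costs ~80 ms because iOS ties key derivation to the
# device's hardware.
from itertools import product

GUESS_TIME_S = 0.08  # assumed per-guess cost on the device

def all_six_digit_passcodes():
    """Yield every six-digit numeric passcode: 10**6 = 1,000,000 total."""
    for digits in product("0123456789", repeat=6):
        yield "".join(digits)

total = sum(1 for _ in all_six_digit_passcodes())
print(f"Passcode space: {total:,} combinations")
print(f"Worst case at {GUESS_TIME_S:.2f}s/guess: {total * GUESS_TIME_S / 3600:.1f} hours")
print(f"Average case: {total * GUESS_TIME_S / 2 / 3600:.1f} hours")
```

At that rate, all one million combinations fall in under a day; it is the retry limits and escalating delays, not the size of the passcode space, that actually protect a six-digit PIN.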
Here’s a quote from Apple’s letter explaining why it is not complying with the FBI’s request:
The order would set a legal precedent that would expand the powers of the government and we simply don’t know where that would lead us. Should the government be allowed to order us to create other capabilities for surveillance purposes, such as recording conversations or location tracking? This would set a very dangerous precedent.
…
Law enforcement agents around the country have already said they have hundreds of iPhones they want Apple to unlock if the FBI wins this case. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks. Of course, Apple would do our best to protect that key, but in a world where all of our data is under constant threat, it would be relentlessly attacked by hackers and cybercriminals. As recent attacks on the IRS systems and countless other data breaches have shown, no one is immune to cyberattacks.
Again, we strongly believe the only way to guarantee that such a powerful tool isn’t abused and doesn’t fall into the wrong hands is to never create it.
The San Bernardino County government made a public statement on February 19, 2016, revealing that its staff reset the password on Syed Farooq’s iCloud account under an FBI directive. This is relevant because the reset of Farooq’s iCloud password made it impossible to determine whether there was another way to access the data on the shooter’s iPhone without taking Apple to court. The FBI’s response? It doesn’t matter.
On February 20, 2016, the FBI released a statement to Ars Technica. Its stance comes down to two claims: 1) yes, the FBI ordered the reset of the password, and 2) the reset of the iCloud password is irrelevant to the court order asking Apple to unlock the terrorist’s iPhone by building a backdoor tool to crack the passcode on the device. The FBI insists that even if it hadn’t bungled the password reset, there is information on the phone that a seized iCloud backup would not have captured; thus, it considers it irrelevant how or why the iCloud backup failed. (Read the full statement)
Experts say the question of whether the FBI’s bungling of the iCloud password left no option but to break through the encryption is an argument that will play out strongly in court. However, this has not deterred public opposition to the FBI’s request. ABC News ran a feature revealing 175 pending cases in New York alone that would require defeating encryption, further strengthening the argument that this will not be a one-off deal with the FBI. If Apple gives in to the FBI’s demands, it sets a dangerous precedent: what’s to stop other authorities from demanding the same? Additionally, a backdoor built for one case can be used to break encryption on other devices as well. Forensic scientist Jonathan Zdziarski, in a detailed analysis on his blog, writes about the forensic and legal issues of the case:
FBI could have come to Apple with a court order stating they must brute force the PIN on the phone and deliver the contents. It would have been difficult to get a judge to sign off on that, since this quite boldly exceeds the notion of “reasonable assistance” to hack into your own devices. No, to slide this by, FBI was more clever. They requested that Apple develop a forensics tool but not do the actual brute force themselves. This was apparently enough for the courts to look past the idea of “reasonable assistance”, however there are some unseen caveats that are especially dangerous here. What many haven’t considered is the significant difference – in the legal world – between providing lab services and developing what the courts will consider an instrument.
Zdziarski’s analysis gives detailed insight into how the case may play out in court and, more importantly, into the impact it will have on forensic processes and encryption. There has been a lot of conversation around Apple’s decision, some of it suggesting that Apple may be taking this stand as a way to promote its product. This is an interesting observation, and one that could be useful in the fight for encryption: if corporations embedded encryption as a core part of their products, more companies would inevitably have to join the fight for privacy by aligning their interests with those of the public.
What does this mean for Pakistan? What happens in the US has a trickle-down effect in many countries, including ours. The aftermath of the NSA revelations saw governments, including our own, scrambling for more control. We witnessed a similar incident when BlackBerry released a statement informing users that it planned to exit Pakistan over pressure from security agencies for user data; the company subsequently announced the issue had been resolved. By refusing to comply, Apple has set a precedent for other companies to follow: it has risked being accused of siding with terrorists in order to take part in a broader and much-needed debate on privacy versus security.
The case also brings to the forefront the need for better security for our digital devices. Micah Lee from The Intercept has a detailed guide to follow; his key recommendations are below, with a quick passcode-strength sketch after the list:
- Within the “Touch ID & Passcode” settings screen, make sure to turn on the Erase Data setting to erase all data on your iPhone after 10 failed passcode attempts.
- Make sure you don’t forget your passcode, or you’ll lose access to all of the data on your iPhone.
- Don’t use Touch ID to unlock your phone. Your attacker doesn’t need to guess your passcode if she can push your finger onto the home button to unlock it instead. (At least one court has ruled that while the police cannot compel you to disclose your passcode, they can compel you to use your fingerprint to unlock your smartphone.)
- Don’t use iCloud backups. Your attacker doesn’t need to guess your passcode if she can get a copy of all the same data from Apple’s server, where it’s no longer protected by your passcode.
- Do make local backups to your computer using iTunes, especially if you are worried about forgetting your iPhone passcode. You can encrypt the backups, too.
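A closing note on the arithmetic behind these recommendations: even against an attacker who can bypass the retry limits entirely, a longer alphanumeric passcode changes the picture. Here is a minimal sketch, reusing the same assumed ~80 ms per-guess cost from above (illustrative numbers, not from Lee’s guide):

```python
# Illustrative sketch: how the time to exhaust a passcode space grows
# with length and character set, assuming the same ~80 ms
# hardware-bound cost per guess (an assumption, as above).

GUESS_TIME_S = 0.08  # assumed per-guess cost

def worst_case_days(alphabet_size: int, length: int) -> float:
    """Days needed to try every passcode of the given length and alphabet."""
    return (alphabet_size ** length) * GUESS_TIME_S / 86400

for label, alphabet, length in [
    ("4-digit PIN", 10, 4),
    ("6-digit PIN", 10, 6),
    ("8-char lowercase", 26, 8),
    ("8-char alphanumeric", 62, 8),
]:
    print(f"{label:>20}: {worst_case_days(alphabet, length):,.1f} days")
```

An eight-character alphanumeric passcode pushes worst-case brute force from hours into hundreds of thousands of years, which is why these settings pair well with choosing a stronger passcode in the first place.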
About The Author
Sana Saleem
Sana Saleem is an activist working on minority rights and internet freedom. She was listed in Foreign Policy's 100 Global Thinkers list in 2012 for her work on free speech in Pakistan with Bolo Bhi, and won the Best Activist Blogger award from CIO & Google at the Pakistan Blogger Awards the same year. She serves on the advisory board of the Courage Foundation, which is Edward Snowden's legal defense fund. She blogs at Global Voices, Asian Correspondent, The Guardian, Dawn and her personal blog, Mystified Justice. In 2014, she was listed on the BBC's 100 Women list. She can be found on Twitter: @sanasaleem and contacted via email: sana@bolobhi.org