
Apple vs FBI: What you need to know


By Keith Rozario
  • The FBI wants Apple to create a backdoor: What the implications actually are
  • Contrary to reports, Apple has not complied with such requests 70 times before

A US judge has ordered Apple Inc to comply with a request from the Federal Bureau of Investigation (FBI) for ‘technical assistance’ to look into an iPhone owner’s data. In the years I’ve written about security, I’ve never seen an issue as hotly debated as this one.
 
It seems a bit snarky of the FBI to use this one particular case that seems to have the highest chance of success to set precedent, but it also seems mighty nasty of Apple to refuse to comply with a court order.
 
But here are some facts of the case.
 
The phone in question belonged to Syed Rizwan Farook, one of the shooters in the San Bernardino attack, which killed 14 people and injured 22 more.
 
The United States has numerous mass shootings, but this one involved two Muslims aligned to the Islamic State (ISIS), hence more easily labelled terrorism without the need for adjectives like ‘domestic.’
 
Unfortunately (or fortunately), as I blogged recently, self-radicalised terrorists don’t get funding from headquarters, and without that glorious ISIS oil money, all these guys could afford was an iPhone 5C – an entry-level phone with hardware remarkably similar to the iPhone 5 which was launched way back in 2012 (you’ll remember that as the year Manchester United last won the Premier League).
 
Being an older phone, the 5C has a security architecture that lags behind current-generation iPhones, all of which have a Secure Enclave.
 
But make no mistake, it’s still pretty secure.
 
By pretty secure, I mean that the phone has all of its contents encrypted, unreadable to anyone without the encryption key. And that key is derived from both the user passcode, and the hardware key that is randomly generated and unique to the specific iPhone.
 
Without both components, the encrypted data is unreadable – which explains why the FBI can’t suck the data out of the device for decryption on a more powerful computer, or load the data onto hundreds of iPhones for parallel cracking: in both cases it would lack the phone-specific key.
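 
To make that concrete, here’s a minimal sketch of the idea in Python – the names and parameters are mine for illustration, not Apple’s actual key-derivation code:
 
```python
# Illustrative sketch only -- not Apple's actual algorithm.
# The point: the file-encryption key is derived from BOTH the user's
# passcode AND a random key fused into this one phone's hardware.
import hashlib
import os

HARDWARE_UID = os.urandom(32)  # stands in for the key burned into the
                               # chip at the factory; it never leaves
                               # the device

def derive_file_key(passcode: str) -> bytes:
    # Entangle the passcode with the device-unique key. Without the
    # physical phone (and its hardware key), no supercomputer can even
    # begin this derivation -- which is why the data can't be cracked
    # off-device.
    return hashlib.pbkdf2_hmac(
        "sha256",
        passcode.encode(),
        salt=HARDWARE_UID,
        iterations=100_000,  # deliberately slow, by design
    )
```
 
The takeaway: the derivation can only happen on the phone that holds the hardware key, so any guessing has to happen there too.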
 
But even with the phone, things are tough. iPhones are hardened against passcode guessing, just like ATM (automated teller machine) cards.
 

 
If thieves managed to obtain your ATM card, they could take it to any ATM and try to guess your PIN (personal identification number) code, but they only get three attempts before the ATM sucks in the card and ends the attack.
 
The odds of guessing a six-digit PIN with just three attempts are worse than winning the lottery – that’s why losing your ATM card isn’t that big a deal.
 
And strictly from a data perspective, losing your iPhone isn’t a big deal either … provided you have secured it with a long enough passcode. (Of course, nobody likes losing such an expensive phone!).
 
Similarly, iOS works to limit an attacker’s ability to guess the passcode, either by slowing down the rate of guesses (artificially delaying retry attempts, eventually by hours), or by simply erasing the entire contents of the phone after 10 incorrect passcode entries.
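 
Roughly, the two protections work together like this (a minimal sketch; the delay schedule is my assumption, not Apple’s published table):
 
```python
import time

# Illustrative sketch of the two brute-force protections. The delay
# schedule below is an assumption for illustration, not Apple's exact
# numbers.
DELAYS = {5: 60, 6: 5 * 60, 7: 15 * 60, 8: 15 * 60, 9: 60 * 60}  # seconds
MAX_ATTEMPTS = 10  # after this, the phone wipes itself (if enabled)

class PasscodeGate:
    def __init__(self, correct: str):
        self._correct = correct
        self._failures = 0

    def attempt(self, guess: str) -> bool:
        if guess == self._correct:
            self._failures = 0
            return True
        self._failures += 1
        if self._failures >= MAX_ATTEMPTS:
            raise SystemExit("Contents erased after 10 incorrect entries")
        # Artificial delay: the later the failure, the longer the wait.
        time.sleep(DELAYS.get(self._failures, 0))
        return False
```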
 
All these protections against brute-force passcode-guessing are baked directly into iOS, Apple’s iPhone operating system, and even though the iPhone in question was an older generation phone, it still had the latest software (yet another reason to admire Apple).
 
This also means that previous hacks that bypassed these protections have been patched.
 
Like I said, pretty secure.
 
To bypass these obstacles, the FBI is asking Apple to provide it with a special, tailor-made version of iOS that would eliminate these protections – specifically (a sketch of the resulting attack follows the list):
 

  • The removal of the Auto-Erasing feature, which would otherwise erase the contents of the phone after 10 incorrect passcode entries;
  • The removal of the artificial delays in attempting the passcodes, allowing the FBI to try passcodes at the fastest rate the hardware will allow;
  • The ability to submit passcodes electronically to the iPhone, eliminating the need for a person to enter passcodes by hand; and
  • The special software could be further customised to work on only this one iPhone, down to the serial number, etc.
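 
Strip those protections away, and cracking the phone reduces to a loop like the one below – `submit_passcode` is a hypothetical stand-in for whatever electronic interface the FBI would use, not real tooling:
 
```python
from itertools import product

# What the requested build would enable: walk the keyspace with no
# delays, no wipe, and electronic submission of guesses.
def crack(submit_passcode) -> str:
    for digits in product("0123456789", repeat=4):
        guess = "".join(digits)
        if submit_passcode(guess):  # sent over a cable, no fingers needed
            return guess
    raise RuntimeError("passcode is longer than four digits")
```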

In essence, the FBI is asking Apple to create a ‘special’ ATM that would allow it to try PIN codes for a specific ATM card without sucking in the card after three failed attempts. That ‘special’ ATM would also have the ability to attempt the PINs electronically rather than having someone manually enter them.
 
If a terrorist told you that the coordinates of a bomb were the PIN to his ATM card, and neither the bank nor the police had any way to determine the PIN other than by creating that ‘special’ ATM – do you think the bank should do it? Would it be considered a ‘backdoor’?
 
All in all, it seems like a pretty reasonable request, but what makes this so controversial (at least in cybersecurity circles) is whether the four items above are considered a ‘backdoor.’
 
Some experts say it is, others say it isn’t.
 
What isn’t controversial is Apple’s ability to do this. Experts all agree that this is possible, and even Tim Cook’s brilliant PR response didn’t deny the company’s ability to do so.
 
So it comes down to whether Apple should do this. That is the question.
 

‘Should’ is a strange word. This is, after all, a phone that belonged to a terrorist, and shouldn’t Apple do everything in its power to help law enforcement? What if the phone has contact details of other ISIS operatives in the United States, or what if it had more details of ISIS operations in Syria? Wouldn’t we want that, regardless of how ‘burdensome’ it might be to Apple?
 
This has nothing to do with the Fourth Amendment, which prohibits unreasonable searches and requires a warrant – after all, the owner of the iPhone is dead, and a court warrant is available.
 
So what’s stopping Apple?
 
To be fair, it’s a small – perhaps 1% – chance that these ‘self-radicalised’ lone wolves have anything of grave importance on their old iPhone, but when words like terrorism are bandied about, 1% is more than enough.
 
Which is why I feel it’s a bit snarky for the FBI to use this case. It seems the perfect case: it involves terrorism (which pushes everyone’s emotional buttons), it concerns an old iPhone that can be cracked, and it violates no Constitutional protections.
 
But you can bet that if it succeeds on a legal level, the precedent set by this case will be used for other cases that don’t involve terrorism and which affect more recent versions of iPhones.
 
And that’s what scares Apple.
 
If Apple sets a precedent by complying with this court order, then it will have to comply with all other court orders, potentially thousands more, requesting the very same thing.
 
The perception of iPhone security will take a severe beating – after all, there’s a reason drug lords and terrorists use iPhones.

 
Let’s also accept that if one judge rules that Apple must comply for this iPhone, explaining to future judges the technical intricacies of the Secure Enclave, RAM-disk implications, and digital signatures on newer iPhones would be so monumental a task that, from a legal standpoint, this iPhone 5C is all iPhones.
 
The question of ‘should’ must also take into account the precedent this might set, and look beyond the specifics of this one case.
 
Plus, it really would be burdensome for Apple to replicate what it might do in this one case thousands of times. Every time it did so, it would have to retrieve its super-secret iOS signing key and digitally sign the new operating system.
 
Apple’s entire iOS security is premised on keeping that key secret, and since iOS runs on every iPhone, and iPhones are big business, the signing key is quite possibly the most valuable secret in the world.
 
If you had it, you would have broken all of iOS’ security, plain and simple.
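 
To see why, here’s a minimal sketch of how signed firmware works in general (using the Python `cryptography` package; this illustrates the principle, not Apple’s actual boot chain):
 
```python
# Minimal sketch of firmware signature checking. The phone ships with
# only the PUBLIC key; the PRIVATE key -- the one Apple would have to
# take out of the vault for every court order -- is what makes an OS
# image "real" iOS. Requires the `cryptography` package.
from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey,
)

signing_key = Ed25519PrivateKey.generate()  # lives in Apple's vault
public_key = signing_key.public_key()       # baked into every iPhone

firmware = b"custom iOS image bytes..."
signature = signing_key.sign(firmware)      # the step every court order
                                            # would force Apple to repeat

# At boot, the phone verifies before running anything; a forged image
# raises InvalidSignature and is refused.
public_key.verify(signature, firmware)
```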
 
If you were a shareholder, would you want Apple to take out this secret key every time the FBI, the DEA (Drug Enforcement Administration) or the NYPD (New York Police Department) came knocking?
 
And what’s to stop this precedent from being enforced internationally? Why would China, Apple’s fastest-growing market, feel that its law enforcement isn’t entitled to the same privileges as the FBI?
 
How would you feel if Apple complied with this request from the FBI for the terrorist case, and then complied with similar requests from Israel, China, Australia … or Malaysia?
 

 
And finally, to circle back to the bank analogy: an ATM holds no personal data, and the data on this iPhone doesn’t represent a clear and imminent threat – it is the detail of an event that has already happened.
 
If you were the CEO of Apple, what would your stand be?
 
Postscript
 
The iPhone has a hardware limit of 80ms per passcode attempt, which means it would take about 15 minutes to crack a four-digit passcode, and about one day to crack a six-digit one.
 
However, a sufficiently long alphanumeric passcode (composed of both digits and letters) would still take years to crack, even if the FBI obtained its special iOS.
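 
A quick back-of-the-envelope check of those numbers, at 80ms per attempt:
 
```python
# Back-of-the-envelope check of the Postscript numbers, at 80ms/attempt.
ATTEMPT_TIME = 0.08  # seconds, the hardware floor mentioned above

for label, keyspace in [
    ("4-digit PIN", 10**4),
    ("6-digit PIN", 10**6),
    ("6-char lowercase+digits", 36**6),
]:
    worst_case = keyspace * ATTEMPT_TIME  # seconds to try every combination
    print(f"{label}: up to {worst_case / 3600:.1f} hours "
          f"({worst_case / 86400:.1f} days)")

# 4-digit PIN: up to 0.2 hours (0.0 days)                    -> ~13 minutes
# 6-digit PIN: up to 22.2 hours (0.9 days)                   -> ~1 day
# 6-char lowercase+digits: up to 48372.9 hours (2015.5 days) -> ~5.5 years
```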
 
Apple has never complied with similar requests before, even though you may have read that it acquiesced 70 times previously.
 
That’s just technically wrong. I love Shane Harris and his Rational Security podcast, but he doesn’t get the nuance of this issue.
 
First of all, the number ‘70’ is a government estimate, not something Apple agreed with. And while we don’t have the specifics of each case, some argue those requests involved much older devices running iOS 7 or earlier, which didn’t encrypt all data under the passcode – meaning data could be extracted without one.
 
But think about what that means – if Apple could extract the data from the phone without a passcode, what’s to stop the likes of Russian cybercriminals or Chinese state-sponsored hackers from doing the same?
 
Hence, Apple improved the security of its software to ensure that no one without the passcode could access the contents – and what the FBI is requesting now is a tailor-made iOS that intentionally circumvents those protections, undoing all of what Apple has engineered.
 
This is the first time the FBI has requested this, and not the 70th time.
 
Update: As it turns out, the San Bernardino shooter’s iPhone actually belonged to his employer, the county health department, and part of the reason we’re in this mess is that it reset the password for the associated Apple ID account shortly after the attack.
 
Had they not done that, Apple may have been able to provide the data from the cloud backup without all this kerfuffle.
 
Keith Rozario blogs at keithRozario.com covering technology and security issues from a Malaysian perspective. He also tweets from @keithrozario. This article first appeared on his blog and is reprinted here with his kind permission.
 
 
 
 
