Saturday, March 12, 2016

An Apple (R) a day keeps the DoJ away

Couple kills 14 people in San Bernardino; they are determined to have terrorist ties.
iPhone with potentially valuable law-enforcement data is recovered.
FBI asks Apple for help retrieving data; Apple declines.
FBI takes them to court; judge orders Apple to help.
Tech world gets behind Apple's appeal, centered on privacy and backdoors.

This is where we are now. It's been incredibly disappointing to me to see so many tech-involved people (Apple included) blatantly misrepresent what Apple has been asked to do. The narrative is that the FBI would get a backdoor that they (or hackers) could freely use to access any encrypted data in the future. This is not at all the case, and that is a very important point in the discussion:

 "Apple’s reasonable technical assistance shall accomplish the following three important functions: (1) it will bypass or disable the auto-erase function whether or not it has been enabled; (2) it will enable the FBI to submit passcodes to the SUBJECT DEVICE for testing electronically via the physical device port, Bluetooth, Wi-Fi, or other protocol available on the SUBJECT and (3) it will ensure that when the FBI submits passcodes to the SUBJECT DEVICE, software running on the device will not purposefully introduce any additional delay between passcode attempts beyond what is incurred by Apple hardware."

This would be a custom OS that disables the booby-trap of locking down the device after a handful of incorrect passcodes. The FBI's approach afterwards would be to brute-force their way in by guessing the PIN. This is also a very important distinction. The details of how the device is encrypted are fairly technical (Apple seems vague on details because their devices use magical fairy dust, but it's likely very similar, or identical, to Windows BitLocker). The key takeaways here are:
  1. The FBI is not actually touching the real encryption of the data at all.
  2. The FBI is exploiting the relatively weak PIN, which is what lets the underlying hardware decrypt the far more strongly protected data. In other words, your data is only as secure as your 4- or 6-digit PIN combined with the handful of tries the device normally allows (see the rough sketch below).
  3. There is no general alternate route to the data being handed to the FBI or to anyone else who gets hold of the device.
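
To put rough numbers on that, here's a minimal back-of-the-envelope sketch in Python of what the brute-force search looks like once the retry limit and artificial delays are removed. The ~80 ms per guess is my assumption for the hardware key-derivation cost of a single passcode attempt; the exact figure matters far less than the tiny keyspace. Note that nothing here touches the encryption itself, it only enumerates PINs:

    # Worst-case brute-force math for a numeric PIN once auto-erase and
    # retry delays are out of the way. The per-attempt cost is an assumed
    # hardware key-derivation time, not an Apple-published constant.
    SECONDS_PER_ATTEMPT = 0.08  # assumption: ~80 ms per passcode guess

    def worst_case_seconds(pin_digits):
        """Seconds to try every possible numeric PIN of the given length."""
        keyspace = 10 ** pin_digits  # 10,000 for 4 digits, 1,000,000 for 6
        return keyspace * SECONDS_PER_ATTEMPT

    for digits in (4, 6):
        hours = worst_case_seconds(digits) / 3600
        print("%d-digit PIN: %9d combinations, worst case ~%.1f hours"
              % (digits, 10 ** digits, hours))

    # Output:
    # 4-digit PIN:     10000 combinations, worst case ~0.2 hours
    # 6-digit PIN:   1000000 combinations, worst case ~22.2 hours

Even at several times that per-guess cost, a 6-digit numeric PIN falls in days. That's exactly why the retry limit and delays, the things the order asks Apple to remove, are doing the real security work here rather than the encryption itself.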
Apple, like other technology companies, has been asked by law enforcement to dig up data on customers before, and has complied. Why are they resisting in this case? There are numerous possibilities, some more rational than others. The following have been explicitly cited:

Apple is fearful the custom OS will get into the wrong hands
This is probably the most valid concern. But I can't imagine they couldn't negotiate some arrangement where they recover and hand over the unencrypted data, but never hand the custom OS itself over to the FBI. Essentially: guarantee that Apple maintains custody of said OS bits and move on.

It sets a bad precedent that the government can have Apple build a custom OS
This seems silly to me. This is an example of requiring technical assistance to gather data, which happens all the time. When the FBI collects phone records, cloud-stored info, identities behind an account, etc., they call the affiliated company, present a warrant, and the company gets the data for them. The FBI does not have the technical capability to get this info without help; in fact, doing so would require them to hack into the company's databases, which is not kosher.

People have a right to privacy
I'd scoff far louder if Google were making this claim ... but let's stick to the case at hand. Yes, people have a right to privacy. Law enforcement can't randomly peek into my house to see if something is amiss. That requires a warrant, just like the one the FBI has for the phone, just like the ones they need to call Verizon and get phone logs, etc. This, to me, is absolutely no different.

It compromises encryption for all
This is simply technically false and has been a completely erroneous part of the discussion. No, we should not compromise the quality of encryption. No, that's not happening in this case.

Compelling companies to write custom software is a slippery slope towards encryption backdoors
I don't think this is true. I think the road to encryption backdoors is independent of this. Companies are already 'compelled' to write certain code; for example, anyone working in healthcare or finance has to comply with very strict auditing requirements. No one is dictating particular lines of code, but some very specific features have to exist.

Apple wants to look like the technical leader of privacy
Bingo! Sell more iPhones! I can't imagine this isn't a/the top internal reason.

But the other tech companies all back Apple, so Apple must be right
Yep, cuz people are dumb and none of the other big players want to look like law-enforcement pawns to these technically illiterate people. Or to the technically semi-literate who have misconstrued all of this.

A couple of Facebook discussions yielded numerous other claims, but they fall into the technically inept bucket, so I'm not even going to address them. The bottom line is that this narrative has drifted far from reality and entered a world of fearmongering that most tech people seem to think exists only in the GOP, around Planned Parenthood and the persecution of Christianity. We make fun of them for being ridiculous; let's turn that same analysis on ourselves.
 
