Mon, 22nd Feb 2016

The internet was recently ablaze as a Californian judge ordered Apple to help the FBI unlock a suspected criminal's iPhone. The device in question belonged to Syed Rizwan Farook, one of the alleged perpetrators of the San Bernardino shooting massacre.

Irate internet users complained that the US government is trampling over their civil liberties. The most vocal complaints say that by pushing Apple to install a backdoor on iPhones, the government will gain unfettered access to the personal information of all iPhone users.

Trouble is, they're not quite correct.

Like most fraught situations, the facts appear to have taken a backseat to all the yelling. The devil is in the detail here: the court order contains a number of subtle yet important distinctions, and the situation is not nearly as black and white as the outrage would suggest.

For a start, the court hasn't mandated that a generic 'back door' be built into iPhones to make their info accessible to government agencies wanting to take a peek.

A brief examination of the court order reveals that while a worrying precedent has been set, the fears of most conspiracy theorists are probably unfounded.

It isn't so much that a back door is to be installed. Instead, the court is asking Apple to develop software that stops an iPhone from erasing its data after 10 unsuccessful passcode attempts.

That isn't exactly a master key for unlocking iPhone encryption, but it would allow the FBI to use brute-force techniques to get at the information contained on the suspect's iPhone.
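To see why that matters, consider the arithmetic. Apple's own iOS security documentation puts the hardware cost of each passcode guess at roughly 80 milliseconds, and under the court order that floor is the only delay left standing. A rough Python sketch (the figures and the loop are illustrative, not anything specified in the order) shows how quickly a passcode falls once the auto-erase limit and software delays are gone:

```python
# A back-of-the-envelope sketch, not Apple's code: how long exhaustive
# passcode guessing takes when the only remaining delay is the ~80 ms of
# hardware key derivation per attempt cited in Apple's iOS security guide.

HARDWARE_DELAY_S = 0.08  # ~80 ms per guess, enforced by key derivation

def worst_case_hours(passcode_space: int) -> float:
    """Hours needed to try every passcode at the hardware-limited rate."""
    return passcode_space * HARDWARE_DELAY_S / 3600

for digits in (4, 6):
    space = 10 ** digits
    print(f"{digits}-digit PIN: {space:,} guesses, "
          f"~{worst_case_hours(space):.1f} hours worst case")

# Prints roughly:
#   4-digit PIN: 10,000 guesses, ~0.2 hours worst case
#   6-digit PIN: 1,000,000 guesses, ~22.2 hours worst case
```

A four-digit PIN falls in about 13 minutes; even a six-digit PIN falls within a day. It's the auto-erase limit and the artificial delays, not the encryption itself, that stand between the FBI and the data.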

The actual encryption on iPhones isn't being touched at all. Here's what the court order requests of Apple:

Apple's reasonable technical assistance shall accomplish the following three important functions: (1) it will bypass or disable the auto-erase function whether or not it has been enabled; (2) it will enable the FBI to submit passcodes to the SUBJECT DEVICE for testing electronically via the physical device port, Bluetooth, Wi-Fi, or other protocol available on the SUBJECT DEVICE and (3) it will ensure that when the FBI submits passcodes to the SUBJECT DEVICE, software running on the device will not purposefully introduce any additional delay between passcode attempts beyond what is incurred by Apple hardware.

The order stipulates 'SUBJECT DEVICE' – meaning the government is not requiring the software to be baked into all iPhones as part of iOS, just that it is used on the specific devices of suspected criminals. This is a subtle but oh-so-very important distinction.

In effect it means that law enforcement agencies will need probable cause before they can confiscate a suspect's iPhone, and then a warrant before they can forensically examine it. This involves judicial oversight and should, in theory, make illegitimate snooping less likely.

Even more importantly, the court order also requires that Apple provide the FBI with software coded to run only on the iPhone of a specific criminal suspect:

Apple's reasonable technical assistance may include, but is not limited to: providing the FBI with a signed iPhone Software file, recovery bundle, or other Software Image File (SIF) that can be loaded onto the SUBJECT DEVICE. The SIF will load and run from Random Access Memory and will not modify the iOS on the actual phone, the user data partition or system partition on the device's flash memory. The SIF will be coded by Apple with a unique identifier of the phone so that the SIF would only load and execute on the SUBJECT DEVICE.
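That last clause is doing a lot of work. Here is a conceptual sketch of what such a device-bound image might look like; the identifier name (ECID) and the helper functions are illustrative assumptions, not Apple's actual code:

```python
# A conceptual sketch of the device-binding clause, not Apple's boot code.
# The identifier name (ECID) and helpers are illustrative assumptions.

TARGET_ECID = "0x1A2B3C4D5E6F"  # the SUBJECT DEVICE's unique chip ID, baked in at build time

def read_hardware_id() -> str:
    # Stand-in for querying the phone's chip ID; hardware-specific in reality.
    return "0x1A2B3C4D5E6F"

def relax_passcode_limits() -> None:
    # Stand-in for the behaviour the order actually requests: no auto-erase,
    # no added delays, electronic passcode submission.
    print("retry limits relaxed on this device only")

def run_signed_image() -> None:
    # The check runs inside the Apple-signed image itself.
    if read_hardware_id() != TARGET_ECID:
        raise SystemExit("image not authorised for this device")
    relax_passcode_limits()

run_signed_image()
```

Because the check would live inside the image Apple signs, stripping it out to repurpose the tool against another phone would invalidate the signature, and an iPhone will only boot software Apple has signed.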

So the order doesn't force Apple to weaken encryption across all iPhones; it requires Apple to switch off a protective feature so the FBI can access the information on one specific iPhone.

This said, people have good reasons to be distrustful, and a slippery-slope argument probably applies. Recent revelations by Edward Snowden have brought surveillance technologies such as StingRay to light. StingRay essentially mimics a cell tower, forcing all nearby mobile phones to connect to it so that their data can be intercepted and monitored. StingRay devices have been fitted to airplanes, helicopters, drones and vehicles.
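The trick StingRay exploits is simple: handsets attach to whichever base station advertises the strongest signal. A toy Python simulation (the signal figures and subscriber IDs are invented for illustration, and real cellular attachment is far more involved) captures the idea:

```python
# A toy simulation of the IMSI-catcher idea behind StingRay, not real
# cellular code. Handsets camp on whichever base station looks strongest,
# so a fake tower that out-broadcasts the real one sees every nearby phone.

towers = {"carrier_tower": -85, "stingray": -60}  # signal strength in dBm

phones = ["310150123456789", "310150987654321"]  # made-up subscriber IDs

def strongest_tower() -> str:
    return max(towers, key=lambda name: towers[name])

if strongest_tower() == "stingray":
    # Every phone in range attaches to the fake tower and reveals its ID.
    print("identifiers captured:", phones)
```

Out-broadcast the legitimate towers and every nearby phone volunteers its identifier, with no confiscation required.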

Because StingRay doesn't require law enforcement agencies to confiscate a suspect's smartphone, surveillance can be conducted in secret, without the suspect ever being aware they are being watched. Unlike the methods proposed in the Californian court order, it also appears that StingRay may already have been abused.

The American Civil Liberties Union found that in 2006, the company that manufactures StingRay equipment conducted wireless surveillance using StingRay units on behalf of the Palm Bay Police Department in response to a bomb threat made against a school. The search was conducted without any warrant or judicial oversight. Similarly, in 2014, Florida police revealed they had used StingRay at least 200 times since 2010 without disclosing it to the courts or obtaining a warrant.

It isn't just StingRay technologies either; there are other means of getting data off a smartphone, such as malware. Perhaps these indignant internet users should be getting worked up about those as well?

There is no debate that a little suspicion is a good thing, especially when it comes to protecting the rights of law-abiding citizens. But the reality is that the feds will need to apply for search warrants on a case-by-case basis, and the method stipulated by the court order is far less open to the kind of abuse that StingRay has already seen.

Ironically, a 2015 court case in the US revealed that Apple had already helped the government pull information off iPhones. Last year in New York, prosecutors revealed that Apple had unlocked phones for authorities at least 70 times since 2008.

So Apple's strongly worded response to the court order may have as much to do with public relations and commercial fallout as it does with customer protection.

This isn't all that surprising: none of this is likely to be terribly palatable for Apple, which will foot the bill for complying with the court order. While the costs of doing so are likely to be peanuts in the grand scheme of things, the episode will do Apple no favours in an ultra-competitive smartphone marketplace, where any perceived weakness, such as the feds being able to snoop through a phone, could be exploited by competitors to turn potential buyers away from iOS.