Explaining Apple's decision not to unlock a terrorist's iPhone.
The risk of allowing the government to create a "back door" into your phone.
February 17, 2016

ASU security expert explains Apple's thinking on fighting a court order to unlock a terrorist's iPhone

This week, a federal judge in California ordered Apple to allow investigators from the FBI access to the iPhone of an alleged terrorist.

Apple refused to comply.

The technology company said Wednesday that it would fight the order to create a workaround to the encryption software it builds into its popular smartphone. The judge ordered Apple to assist the federal government in gaining access to the cell phone of Syed Rizwan Farook, one of the shooters in last year’s San Bernardino terrorist attack. Farook was killed in a shootout with police.

According to reports, the FBI has been unable to gain access to Farook’s locked iPhone, and says that only Apple can get around its own encryption software.

Apple CEO Tim Cook argued that bypassing the iPhone’s security would affect all of its users.

“We oppose this order, which has implications far beyond the legal case at hand,” Cook wrote in a statement.


Jamie Winterton, Director of Strategic Research Initiatives at ASU’s Global Security Initiative, spoke with ASU Now about the implications of Apple’s refusal to help the FBI, and why what the bureau is asking for is more complicated than the modern-day equivalent of simply opening a file cabinet.

Question: In this latest scuffle between the government and tech companies, the government is asking for a "back door" to be able to override the 10-swings-you’re-out software built into the iPhone, which deletes all data after ten failed attempts to unlock the phone. Given how much private information we already give up on our phones to all sorts of commercial enterprises, is this really that much different?

Answer: It’s true; our phones are really an extension of ourselves. I’ll be the first to criticize commercial apps that bury privacy settings in hard-to-find menus, collect inappropriate amounts of data or change the default security settings without notice. But even though these settings can be difficult to find and use, the important thing is that they exist. Broken (“back doored”) encryption doesn’t allow the user a choice of what personal data can and can’t be accessed. That’s a big difference.

Q: Will building this software for use by the government really make tech more vulnerable for everyone to non-government actors?

A: Absolutely. Once encryption is compromised, any guarantees of security are lost. The “back door” analogy is a very appropriate one. If a door is improperly secured, it can be used by your grandmother, or it can be used by Kim Jong Un. There’s no way to determine the intent of the person coming through that door. The government seems to believe in the “golden key” argument — saying that if an encryption “back door” were created, a key could be created such that only authorized government users would be able to unlock it — but just as with physical doors, keys can be misplaced or locks can be picked, given enough time. 

I question the ability of the government to responsibly hold encryption keys. We’ve seen the significant data breaches that have resulted from bad security practices within the government: 22 million security-clearance profiles lost in the Office of Personnel Management incident, and 30,000 employee profiles exposed in the recent FBI and DHS breaches. The President recently allocated over $3 billion to shore up federal cybersecurity, much of which will go toward replacing egregiously outdated systems so deprecated that they simply can’t be secured. To take a more literal example of keys, CAD files of TSA master keys were created and uploaded to GitHub (an open-source code repository), so anyone with the desire and the right equipment could duplicate those keys and unlock any TSA-approved lock. Long story short, the government has a lot of trust to regain before we should trust it with the keys to all our personal information.

Q: Apple has, in the past, complied with other court orders. What makes this different?

A: The order in this case is not simply for Apple to hand over data. They are actually being asked to create entirely new software to break their encryption standard, which would violate Apple’s longstanding commitment to personal data security. If Apple were to comply, it would have drastic implications outside this particular case. The government argues that the request is reasonable because Apple “writes software as part of its regular business,” but the software in question could be used on any iOS device, by any government agency or possibly by malicious actors looking to exploit and collect personal data. Other software companies could be similarly pushed to create weakened products. Once the genie is out of the bottle, it’s very hard to put it back in.

Case in point: “export-grade” encryption from the 1990s. The US banned the sale of cryptographic software overseas unless it was purposefully weakened. But the Internet is a land without borders — many companies both inside and outside the U.S. built this weakened encryption into their products. In early 2015, security researchers discovered an exploit that relied on this weakened encryption scheme. About a third of websites were found to be vulnerable, including those of the FBI, the White House, and NASA. The choices made here will have far-reaching repercussions.
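The export-grade weakness persisted for two decades largely because clients quietly kept the old cipher suites enabled. As a minimal sketch (not part of the interview), assuming Python's standard `ssl` module and a modern OpenSSL build, a client can enumerate the cipher suites it is willing to negotiate and confirm that export-grade ("EXP") and null-encryption suites are no longer among them:

```python
import ssl

# Build a client-side TLS context with the library's default settings.
ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)

# List the cipher suites this context is willing to negotiate.
enabled = [c["name"] for c in ctx.get_ciphers()]

# Export-grade suites carry an "EXP" marker in their names;
# "NULL" suites perform no encryption at all.
weak = [name for name in enabled if "EXP" in name or "NULL" in name]

print(f"{len(enabled)} suites enabled, {len(weak)} weak")
```

On current OpenSSL builds the export-grade suites have been removed outright, so the `weak` list comes back empty — the eventual fix for FREAK-style downgrade attacks was to delete the weakened ciphers entirely rather than trust every client to disable them.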

Q: Is there a compromise that could be reached in this case?

A: No. Encryption is either broken, or it’s not. There’s no middle ground here.

Top photo: Image courtesy of Wikimedia Commons