The slippery definition of encryption 'back doors'


The technology community has a definition of an encryption back door. The government has its own. 

A judge’s order Tuesday night compelling Apple to help law enforcement officials unlock the iPhone used by one of the attackers at a holiday party in San Bernardino, Calif., last year turned an encryption debate over back doors that is most often conducted in the abstract into a high-profile case study. 

Encryption on phones and other digital services has given users a measure of security and protection from hackers. It has also been sold as a shield against intrusive government surveillance in the wake of Edward Snowden's revelations about NSA spying.

Privacy advocates and tech companies have said creating a path for law enforcement to bypass encryption in any form amounts to a security back door, which would eventually be exploited by criminals or abused by the government. 

The government, however, has used varying rhetoric over the past year to avoid the “back door” label, describing its request as a "front door" or something similar that would simply let tech companies comply with court orders.

Below is a rundown of the rhetoric used by tech companies, privacy advocates, law enforcement and the courts when it comes to getting around encryption. 

The court 

On Tuesday night, a magistrate judge ordered Apple to “assist in enabling the search of a cellular telephone.”

Apple’s phones have a feature that can auto-erase an iPhone’s data if the pass code is entered incorrectly too many times. The judge ordered Apple to help bypass that auto-erase system so the FBI can try to crack the pass code by guessing codes until one succeeds. To do that, the order suggests Apple create software, delivered through a system update, that could be used only on that phone. 
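The mechanics at issue can be illustrated with a toy example. The Python sketch below is not Apple's actual code; the `try_passcode` check and the 10-guess wipe limit are hypothetical stand-ins. It shows why the auto-erase feature is the crux: with a wipe after a handful of wrong guesses, a 4-digit pass code survives brute force, but with the limit removed, all 10,000 combinations can simply be tried in order.

```python
import itertools

SECRET = "7831"  # hypothetical pass code set on the device

def try_passcode(guess, secret=SECRET):
    """Stand-in for the device's pass-code check (not Apple's real API)."""
    return guess == secret

def brute_force(max_attempts=None):
    """Try every 4-digit code in order; stop if an attempt limit is hit."""
    combos = itertools.product("0123456789", repeat=4)
    for attempts, digits in enumerate(combos, start=1):
        if max_attempts is not None and attempts > max_attempts:
            return None  # the device would have wiped itself by now
        guess = "".join(digits)
        if try_passcode(guess):
            return guess
    return None

print(brute_force(max_attempts=10))  # auto-erase intact: None
print(brute_force())                 # limit bypassed: finds "7831"
```

With the attempt limit in place the search fails long before reaching the right code; without it, exhausting the 10,000-code space is trivial, which is why the order targets the auto-erase feature rather than the encryption itself.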

“If Apple determines that it can achieve the ... functions ... using an alternate technological means from that recommended by the government, and the government concurs, Apple may comply with this order in that way,” the order says. 

Apple

Apple CEO Tim Cook said the order would put the security of all the company's customers in danger and that Apple will appeal. Cook said the software requested does not exist and would amount to a government back door no matter how it is framed.

“The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control,” Cook said in a letter to customers. 

The White House

White House press secretary Josh Earnest on Wednesday said President Obama believes in “strong encryption” but believes the debate about the San Bernardino shooter’s phone is narrower than creating a back door, which he seemed to equate with redesigning its software for all users. 

“They are not asking Apple to redesign its product or to create a new back door to one of their products. They are simply asking for something that would have an impact on this one device,” he said. 

Privacy and technology advocates

Privacy and tech groups say the government cannot bypass encryption without creating a back door that others could exploit. This is from a 2014 issue brief released by the Center for Democracy and Technology:

“As a technical matter, creating a path through encryption to provide access that the user does not authorize is, by definition, a ‘backdoor’ security vulnerability into the device. It is impossible to build encryption that can be circumvented without creating a technical backdoor.”