The odds that you don’t already know about Apple’s latest attempt at market domination, the iPhone X, are about even with the possibility that President Trump will stop using hand gestures when explaining something. That said, it is quite possible that you have not yet heard why the iPhone X should worry you.
“In its continuing war on inconvenience,” Andy Greenberg of Wired wrote, Apple is poised to “give an unproven biometric security technology its biggest field test yet.”
The most notable change—never mind the iPhone X’s screen size, resolution and configuration—is its use of facial recognition as a security feature and the death of the home button. And it really isn’t an exaggeration to say this is a field test of sorts.
Now, I certainly don’t want to knock Apple, especially given noteworthy, pro-consumer privacy stances taken by the company. CEO Tim Cook bravely refused to help authorities access the San Bernardino shooter’s iPhone over privacy concerns—specifically taking the stance that deciding when to provide assistance and when not to help law enforcement is too slippery a slope for a publicly traded company vested with zero legal authority over such matters.
With the obvious caveat that no Silicon Valley company is ethically spotless in the land of data monetization, Apple is more privacy true-believer than not when compared to its cohorts in the corporate sector. To be clear: This isn’t to say that Apple isn’t in the information business, because it is. But in general, under Tim Cook’s leadership, the company has been sensitive to the issue of consumer privacy.
And this is precisely why the latest iPhone whizzbang—or privacy field test, if you will—is puzzling, because the use of facial recognition technology raises serious questions about security and privacy.
Granted, the particular technology driving Face ID, for the time being, seems difficult to spoof without a fair amount of expensive equipment and buckets of technical acumen, but in the world of hacking exploits, all things crack with the application of enough time and pressure.
In the meantime:
How will Face ID data be stored?
Apple has used the Secure Enclave to store biometric data in the past, and most cybersecurity experts agree that the safest place to store biometric data is locally, i.e., only on the device that it’s being used to access. If the data is instead stored on a server and is not encrypted, what safeguards are in place to prevent a third party from using the Face ID data for other purposes—whether those purposes are “enterprising” or outright illegal? Why not avoid the danger of data compromise altogether, and restrict facial recognition data to the device it unlocks?
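The device-local principle the experts describe can be sketched in a few lines: the enrolled face template never leaves the device (or, in Apple’s case, the Secure Enclave), and callers only ever get a yes/no answer from a match performed on the device itself. This is a toy illustration of that idea, not Apple’s implementation; the class name, feature vectors, and distance threshold are all hypothetical:

```python
import math

class OnDeviceFaceStore:
    """Toy model of device-local biometric matching.

    The enrolled template is kept inside this object (standing in
    for a phone's secure hardware); it is never returned to callers,
    who only receive a pass/fail result.
    """

    def __init__(self, threshold: float = 0.6):
        self._template = None        # stays on the "device"
        self._threshold = threshold  # maximum allowed distance for a match

    def enroll(self, features: list[float]) -> None:
        # Store a copy of the derived feature vector, not the raw image.
        self._template = list(features)

    def unlock(self, features: list[float]) -> bool:
        # Matching happens here, on-device; nothing is sent to a server.
        if self._template is None:
            return False
        return math.dist(self._template, features) <= self._threshold

store = OnDeviceFaceStore()
store.enroll([0.1, 0.4, 0.9])
print(store.unlock([0.12, 0.41, 0.88]))  # a close scan of the same face: True
print(store.unlock([0.9, 0.1, 0.2]))     # a different face: False
```

Because only the boolean verdict ever crosses the trust boundary, there is no server-side trove of face data for a third party to compromise or repurpose.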
Will Face ID reliably work on people of all ethnicities?
Anyone of a certain age will remember the public relations problem that Eastman Kodak had because its color film didn’t accurately capture people with darker skin. And not to be too macabre, but in the event of an accident or a face-swelling allergy attack, how will Face ID know you’re you? Get ready for viral stories about the failure of the Face ID feature, because they’re coming.
What about the Fourth Amendment?
The problem of theft or issues surrounding privacy rights of suspects and criminal defendants seem like immediate “real world” concerns as these devices are about to hit the streets. In theory, had the San Bernardino shooter been using the iPhone X, the phone could have been opened. What are the legal implications of that? Could police access your phone by pointing it at your face? Would they be able to use anything they found inside? What about a mugger? Could you be robbed with nothing more than your own phone pointed at your head?
What else will facial recognition be able to do?
In a perfect world, we’d have some assurances that Apple is not going to use facial recognition data to improve product design or even create services like FindFace (an app that allows Russian consumers to identify strangers who have profiles on a popular social networking site). And if Apple does plan to do these things, are lawmakers poised to make sure consumers have a way to opt out? If Sen. Al Franken’s (D-Minn.) comments are any indication (and I believe they are), the legislative branch is ready to meet this particular challenge.
The bigger issue from a security standpoint is the question of overall efficacy. Biometric authentication is flawed. It doesn’t matter what kind we’re talking about. Facial recognition can be tricked. Voice prints can be stolen. Fingerprints can be copied, and even retina scans have been defeated by hackers.
I know what you’re thinking: Body weight (alone, or in combination with shoe size yoked to any or all of the aforementioned metrics) is the answer. Go ahead and invent the device that makes it happen. (And yes, I realize you weren’t actually thinking that.)
As for Apple, the company is thinking the right thoughts, but we’re not where we need to be to make something like Face ID safe. For the time being, the increase in convenience comes with a parallel increase in our attackable surface.
As I’ve written elsewhere, good security should incorporate something you have (like an ID card or a token generator), something you know (like a password or a phrase), and something you are (biometric identifiers like iris scans and fingerprints). Strong authentication requires at least two different identifiers.
While the best solution is probably still a numeric passcode, an even more secure environment could be created using two-factor authentication that required both Face ID and a numeric code of six or more digits. But then that wouldn’t be convenient, would it?
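The two-factor setup suggested above—something you are plus something you know—can be sketched as follows. This is a minimal illustration under assumed names, not any real phone’s unlock flow: the passcode is kept only as a salted PBKDF2 hash, the face match is a simple distance check standing in for Face ID, and access requires both factors to pass:

```python
import hashlib
import hmac
import math

def verify_passcode(entered: str, stored_hash: bytes, salt: bytes) -> bool:
    # Something you know: compare a salted hash, never the raw passcode.
    candidate = hashlib.pbkdf2_hmac("sha256", entered.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, stored_hash)

def verify_face(scan: list[float], template: list[float],
                threshold: float = 0.6) -> bool:
    # Something you are: a toy stand-in for a Face ID template match.
    return math.dist(scan, template) <= threshold

def unlock(code: str, scan: list[float], stored_hash: bytes,
           salt: bytes, template: list[float]) -> bool:
    # Two-factor rule: failing either factor denies access.
    return verify_passcode(code, stored_hash, salt) and verify_face(scan, template)

# Enrollment (done once, on-device in a real system)
salt = b"demo-salt"
stored_hash = hashlib.pbkdf2_hmac("sha256", b"482913", salt, 100_000)
template = [0.1, 0.4, 0.9]

print(unlock("482913", [0.11, 0.39, 0.91], stored_hash, salt, template))  # True
print(unlock("000000", [0.11, 0.39, 0.91], stored_hash, salt, template))  # False
```

A stolen face scan alone is useless here, and so is a shoulder-surfed passcode alone—which is exactly the point of requiring two different kinds of identifier.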
Adam K. Levin is chairman and founder of CyberScout (formerly IDT911) and co-founder of Credit.com, and a former director of the New Jersey Division of Consumer Affairs. He is also the author of "Swiped," which debuted at #1 on the Amazon Bestsellers Hot New Releases List.