Apple erred on facial recognition

By Todd Mozer

CEO

Sensory

September 14, 2017

There are many theories about what happened with Apple's FaceID. Let's discuss what failure means and 2D vs. 3D cameras. There are three classes of failure: accuracy, spoofability, and user experience.

On the same day that Apple rolled out the iPhone X on the coolest stage of the coolest corporate campus in the world, Sensory gave a demo of an interactive talking and listening avatar that uses a biometric ID to know who's talking to it. In Trump metrics, the event I attended had a few more attendees than Apple's.

Interestingly, Sensory's face ID worked flawlessly, and Apple's failed. Sensory used a traditional 2D camera with convolutional neural networks and deep-learning anti-spoofing models; Apple used a 3D camera.

There are many theories about what happened with FaceID at Apple. Let's discuss what failure even means and the effects of 2D versus 3D cameras. There are basically three classes of failure: accuracy, spoofability, and user experience. It's important to understand the differences between them.

False biometrics

Biometric accuracy is usually measured in equal error rates, or in false accepts (FA) and false rejects (FR). This is where Apple says it went from 1 in 50,000 with fingerprint recognition to 1 in 1,000,000 with FaceID. Those are FA rates, and they move inversely with FR; Apple doesn't mention FR.

It's easy to reach one in a million or one in a billion FAs by making it FR all of the time. For example, a rock will never respond to the wrong person... it also won't respond to the right person! This is where Apple failed. They might have had an amazing false-accept rate, but they hit two false rejects on stage!
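To make that tradeoff concrete, here's a minimal sketch of how a single decision threshold trades FA against FR. The match scores and thresholds are made up for illustration; they aren't Apple's or Sensory's numbers.

```python
# Minimal sketch: one decision threshold trades false accepts (FA)
# against false rejects (FR). Scores and thresholds are hypothetical.

def fa_fr_rates(impostor_scores, genuine_scores, threshold):
    """Accept when score >= threshold; return (FA rate, FR rate)."""
    fa = sum(s >= threshold for s in impostor_scores) / len(impostor_scores)
    fr = sum(s < threshold for s in genuine_scores) / len(genuine_scores)
    return fa, fr

impostors = [0.10, 0.25, 0.40, 0.55, 0.70]   # wrong-person match scores
genuines  = [0.45, 0.60, 0.75, 0.85, 0.95]   # right-person match scores

for t in (0.3, 0.5, 0.8):
    fa, fr = fa_fr_rates(impostors, genuines, t)
    print(f"threshold={t:.1f}  FA={fa:.0%}  FR={fr:.0%}")
# Raising the threshold drives FA toward zero (the "rock")
# while FR climbs, and vice versa.
```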

I believe too much emphasis is placed on FA. The presumption is that random users are trying to break in, and for that, 1 in 50,000 seems fine. The real break-in issue is spoofability, which needs to be thought of differently: it's not a random face trying to get in, it's a fake version of your face.

Every biometric that gets introduced gets spoofed. Gummy bears, cameras, glue, and tape have all been used to spoof fingerprints. Photos, masks, and videos have been used to spoof faces.

To prevent this, Sensory built anti-spoofing models that reduce the probability of a successful spoof. 3D cameras also make spoofing harder, and Apple moved in the right direction here. But the real solution is to layer biometrics, adding layers when more security is needed.
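Here's a sketch of the layering idea: a convenient factor alone for low-stakes unlocks, a stricter bar or a second independent biometric as the stakes rise. The factor names, scores, and thresholds are illustrative assumptions, not Sensory's implementation.

```python
# Sketch of layered biometrics: add layers only when the transaction
# demands more security. All scores and thresholds are illustrative.

def authenticate(face_score, voice_score=None, security_level="low"):
    if security_level == "low":
        return face_score > 0.6                  # convenience: face alone
    if security_level == "medium":
        return face_score > 0.8                  # same factor, stricter bar
    # high security: require a second, independent biometric layer
    return face_score > 0.8 and voice_score is not None and voice_score > 0.8

print(authenticate(0.7))                              # unlock the phone
print(authenticate(0.7, security_level="medium"))     # open a banking app
print(authenticate(0.9, 0.85, security_level="high")) # approve a payment
```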

Apple misfires on UX?

Finally, there’s an inverse relationship between user experience and security. Amazingly, this is where Apple got it wrong.

Think about why people don't like fingerprint sensors. It's not because too many strangers get in; it's because we have to do unnatural motions, multiple times, and often get rejected when our hands are wet, greasy, or dirty.

Apple tuned FaceID for such a strict false-accept rate that it rejects too often, hurting the consumer experience, which is what we saw on stage. But there's more to the tradeoffs than that.

The easiest way to prevent spoofing is to get the user to do unnatural things, live and at random. Blink detection is a less intrusive version that Google and others have tried, but a photo with the eyes cut out could spoof it.

Having people turn their heads, widen their nostrils, or look in varying directions might help prevent spoofing, but it also hurts the user experience. The trick is to get more intrusive only when the security need demands it. Training the device is also part of the user experience.
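One way to picture "more intrusive only when needed" is challenge-response liveness that escalates with the security level, picking challenges at random so a replayed video can't predict them. The challenge lists below are illustrative assumptions, not any shipping product's design.

```python
import random

# Sketch: liveness challenges that get more intrusive only as the
# security need rises. Challenge lists are illustrative.

CHALLENGES = {
    "low":    [],                                   # passive anti-spoof only
    "medium": ["blink"],                            # mild liveness check
    "high":   ["blink", "turn head left",
               "turn head right", "look up"],       # unpredictable at random
}

def pick_challenges(security_level, count=2):
    """Randomly pick challenges so a prerecorded spoof can't anticipate them."""
    pool = CHALLENGES[security_level]
    return random.sample(pool, k=min(count, len(pool)))

print(pick_challenges("low"))     # []
print(pick_challenges("high"))    # e.g. ['look up', 'blink']
```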

Todd Mozer is the CEO of Sensory. He holds over a dozen patents in speech technology and has been involved in previous startups that reached IPO or were acquired by public companies. Todd holds an MBA from Stanford University, and has technical experience in machine learning, semiconductors, speech recognition, computer vision, and embedded software.
