What will I use Face ID for?
The same things you currently use Touch ID for: Apple Pay, App Store and iTunes purchases, and third-party apps that rely on it. Apple says that, as with Touch ID, third-party apps will be able to request Face ID authentication, and iOS tells the app only whether or not the attempt succeeded.
Interestingly, Apple says developers can use Face ID without a passcode fallback, if they want the biometric check to serve as a kind of second factor that can't be bypassed.
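In practice, this choice maps to Apple's LocalAuthentication framework, which offers two policies: one that lets iOS fall back to the device passcode, and a biometrics-only policy with no bypass. A minimal Swift sketch (the helper names and flow here are illustrative, not Apple sample code):

```swift
/// The two authentication behaviors the article describes.
enum AuthChoice {
    case biometricsThenPasscode  // maps to LAPolicy.deviceOwnerAuthentication
    case biometricsOnly          // maps to LAPolicy.deviceOwnerAuthenticationWithBiometrics
}

#if canImport(LocalAuthentication)
import Foundation
import LocalAuthentication

/// Illustrative helper: ask iOS to authenticate the user.
/// Note the app only ever learns success or failure; no face data is exposed.
func evaluate(_ choice: AuthChoice, reason: String,
              completion: @escaping (Bool) -> Void) {
    let context = LAContext()
    let policy: LAPolicy = (choice == .biometricsThenPasscode)
        ? .deviceOwnerAuthentication
        : .deviceOwnerAuthenticationWithBiometrics

    var error: NSError?
    guard context.canEvaluatePolicy(policy, error: &error) else {
        completion(false)  // e.g. biometry not enrolled or unavailable
        return
    }
    context.evaluatePolicy(policy, localizedReason: reason) { success, _ in
        completion(success)
    }
}
#endif
```

With `.biometricsOnly`, repeated failed face matches simply fail; iOS never offers the passcode sheet, which is what makes it usable as an un-bypassable second factor.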
Third parties will also have access to live depth maps, just as the rear dual-camera systems provide in iOS 11, but not to the raw data from the sensors sampling your face.
How do I set up Face ID?
Face ID uses an enrollment process just like Touch ID's. You'll go to Settings > Face ID & Passcode and tap Enroll Face; the iPhone will then use the front-facing camera to display your face within a circle surrounded by green tick marks. The enrollment software overlays quasi-3D markings onscreen to show your eye line and facial center, and you'll be prompted to move your head in a circle while your facial characteristics are captured.
Apple says the odds that someone else's fingerprint will unlock Touch ID are 1 in 50,000, a figure that's hard to verify independently since there's no practical way to test it. For Face ID, Apple puts the chance of another face matching at 1 in 1,000,000.
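Taken at face value, Apple's two published figures imply a 20-fold difference in false-match odds, a quick bit of arithmetic:

```swift
// Illustrative arithmetic using Apple's published false-match odds.
let touchIDFalseMatch = 1.0 / 50_000.0     // Touch ID: 1 in 50,000
let faceIDFalseMatch  = 1.0 / 1_000_000.0  // Face ID: 1 in 1,000,000

let improvement = touchIDFalseMatch / faceIDFalseMatch
print("Face ID's false-match rate is \(Int(improvement))x lower")  // 20x lower
```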
Apple’s senior vice president of worldwide marketing, Phil Schiller, did say during the iPhone X introduction that, “The statistics are lower if the person shares a close genetic relationship with you.” Apple clarified this in its white paper, noting that the accuracy is “different” for twins and siblings. If you have an evil twin, you should probably avoid Face ID.
It also said that children under 13 had a higher rate of false matches, though it didn’t provide a number, because distinct facial features “may not have fully developed.”
How does Face ID work?
Apple uses a combination of an infrared emitter and sensor (a system it calls TrueDepth) to paint 30,000 points of infrared light on and around your face, while also capturing flat 2D infrared snapshots. The reflection of each projected dot is measured, which lets the system calculate the depth and angle from the camera for every point and construct a depth map.
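Apple hasn't published the math it uses, but structured-light systems like this typically recover depth by triangulation: each projected dot shifts sideways ("disparity") in the camera's view by an amount proportional to how close the reflecting surface is. A hedged Swift sketch of that standard relationship, with entirely hypothetical numbers, not Apple's actual algorithm:

```swift
/// Classic structured-light triangulation: depth = focalLength * baseline / disparity.
/// Returns nil when the dot shows no displacement (no depth estimate possible).
func depthMillimeters(focalLengthPx: Double,
                      baselineMm: Double,
                      disparityPx: Double) -> Double? {
    guard disparityPx > 0 else { return nil }
    return focalLengthPx * baselineMm / disparityPx
}

// Hypothetical values: 1400 px focal length, 25 mm emitter-camera baseline,
// a dot displaced by 100 px -> a surface roughly 350 mm from the camera.
let depth = depthMillimeters(focalLengthPx: 1400, baselineMm: 25, disparityPx: 100)
print(depth ?? "no estimate")  // 350.0
```

Repeating this for all 30,000 dots yields the per-point depths that are then assembled into the face's depth map.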
The information collected for a Face ID depth map is used for live tracking in Apple's new Animoji feature. Credit: Apple