1 week with iPhone X, studying Face Tracking with ARKit

It’s been a week with my iPhone X, and I am drilling into the technical details of what iOS 11 does with it. One of the big features is facial recognition. Curious, I watched an Apple Developer video on “Face Tracking with ARKit”: https://developer.apple.com/fall17/601

 

[Image: FullSizeRender.jpg]

Everyone focuses on the first one: facial recognition and doing things like Animoji.

With expression tracking, you could use Face Tracking as UI input (there’s a sketch of that after the blend shape list below).

This is where ARFaceAnchor comes in: it gives you the position and orientation of the user’s face, its 3D topology, and the facial expression. Everyone is a bit different in how they react, but people are generally consistent in their facial expressions.
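Here is a minimal sketch of that in code, assuming a small controller object that owns its own ARSession (the class name and the print statements are just placeholders):

```swift
import ARKit

class FaceTrackingController: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Face tracking needs the TrueDepth camera, so check support first
        guard ARFaceTrackingConfiguration.isSupported else { return }
        let configuration = ARFaceTrackingConfiguration()
        session.delegate = self
        session.run(configuration)
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            // Position and orientation of the face
            let transform = faceAnchor.transform
            // 3D topology: a triangle mesh that matches the face
            let vertexCount = faceAnchor.geometry.vertices.count
            // Facial expression: named coefficients between 0.0 and 1.0
            let expressions = faceAnchor.blendShapes
            print(transform, vertexCount, expressions.count)
        }
    }
}
```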

 

[Image: FullSizeRender.jpg]

If you don’t think you can do this, check out this list of facial tracking blend shapes.
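Each of those shapes shows up in code as a key in the anchor’s blendShapes dictionary, with a value between 0.0 (neutral) and 1.0 (fully expressed). Here is a rough sketch of the UI-input idea from above, building on the controller sketched earlier; handleJawOpen() and the 0.7 threshold are made up for illustration:

```swift
import ARKit

extension FaceTrackingController {
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let face = frame.anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }

        // Log every expression that is strongly expressed in this frame
        for (shape, value) in face.blendShapes where value.floatValue > 0.5 {
            print(shape.rawValue, value)
        }

        // Treat an open jaw as a deliberate UI trigger
        if (face.blendShapes[.jawOpen]?.floatValue ?? 0) > 0.7 {
            handleJawOpen()
        }
    }

    func handleJawOpen() {
        // Placeholder for whatever the gesture should drive
        print("jaw open gesture")
    }
}
```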

 

[Image: FullSizeRender.jpg]

In operations, lighting has a direct impact on the quality, accuracy, and speed of work. And you could use Face Tracking to get a reading of the lighting in the environment.
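ARKit surfaces that through the frame’s light estimate; with a face-tracking configuration the estimate is directional, derived from how light falls across the face. A sketch, assuming light estimation is left enabled on the configuration (it is by default):

```swift
import ARKit

func logLighting(from frame: ARFrame) {
    // Face tracking provides a directional light estimate
    guard let estimate = frame.lightEstimate as? ARDirectionalLightEstimate else { return }

    print("Ambient intensity (lumens):", estimate.ambientIntensity)
    print("Color temperature (Kelvin):", estimate.ambientColorTemperature)
    print("Primary light direction:", estimate.primaryLightDirection)
    print("Primary light intensity:", estimate.primaryLightIntensity)
}
```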

 

[Image: FullSizeRender.jpg]

And last, with the microphone support for Animoji, you can use the same session to capture audio as well.
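That comes down to one flag on the configuration plus one delegate callback (you also need the microphone usage description in Info.plist). A sketch, again building on the controller from earlier:

```swift
import ARKit
import CoreMedia

extension FaceTrackingController {
    func startWithAudio() {
        let configuration = ARFaceTrackingConfiguration()
        // Ask ARKit to capture microphone audio alongside face tracking
        configuration.providesAudioData = true
        session.delegate = self
        session.run(configuration)
    }

    // ARSessionObserver callback delivering each captured audio buffer
    func session(_ session: ARSession, didOutputAudioSampleBuffer audioSampleBuffer: CMSampleBuffer) {
        // Hand the buffer off to a recorder, speech recognition, etc.
        print("audio buffer with", CMSampleBufferGetNumSamples(audioSampleBuffer), "samples")
    }
}
```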

 

[Image: FullSizeRender.jpg]