As Epic Games announced in its blog, alongside a new version of Unreal Engine 4 the company wanted to release something that would complement the engine from the outside. That "something" is the Live Link Face mobile app, which is already available on the App Store.
Live Link Face lets you capture and stream facial expressions from an iPhone, superimposing them onto a character in Unreal Engine in real time. The app relies on Apple's ARKit platform and the iPhone's TrueDepth camera, the same technologies used to create Animoji in iOS itself.
The app's creators say that thanks to its minimal streaming latency, Live Link Face achieves near-perfect synchronization with the other components of a production, such as cameras and body motion capture.
The application also adapts to the user, whether they wear a full motion-capture suit or simply sit at their desk. Live Link Face can track not only facial expressions but also head and neck rotation.
As Epic puts it, one of Unreal Engine's main goals is to make working with real-time tools easier. Here, Live Link Face makes facial capture more accessible to large teams and independent developers alike.
For more information on how Live Link Face works and how to use it, see the official documentation.