Live Link Face for effortless facial animation in Unreal Engine: capture performances for MetaHuman Animator to achieve the highest-fidelity results, or stream facial animation in real time from your iPhone or iPad for live performances.
- MetaHuman Animator uses Live Link Face to capture performances on iPhone, then applies its own processing to create high-fidelity facial animation for MetaHumans.
- The Live Link Face iOS app captures raw video and depth data, which are ingested directly from your device into Unreal Engine for use with the MetaHuman Plugin.
- Stream ARKit animation data in real time to an Unreal Engine instance via Live Link over a network (a streaming sketch follows this list).
- Visualize facial expressions in real time with live rendering in Unreal Engine.
- This workflow requires an iPhone 12 or later, a desktop PC running Windows 10 or 11, and the MetaHuman Plugin for Unreal Engine.
- Tune the capture data to the individual performer and improve facial animation quality with rest pose calibration.
- Facial animation created with MetaHuman Animator can be applied to any MetaHuman character in just a few clicks.
- Record the raw ARKit animation data and front-facing video reference footage.
- Delete takes within Live Link Face, or share them via AirDrop.
- Transfer takes directly over the network when using MetaHuman Animator.
- Select timecode from the iPhone system clock or an NTP server, or use Tentacle Sync to connect to a master clock on stage (a timecode sketch follows this list).
- Drive a 3D preview mesh, optionally overlaid on the video reference on the phone (a preview sketch follows this list).
- Video reference is frame-accurate, with embedded timecode for editorial.
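
To make the real-time streaming bullet concrete, here is a minimal Swift sketch that reads ARKit's per-frame blend shape weights from an `ARFaceAnchor` and sends them over UDP. The `BlendShapeStreamer` class, JSON payload, host, and port are illustrative assumptions; this does not reproduce the actual Live Link wire format.

```swift
import ARKit
import Foundation
import Network

// Hypothetical streamer: reads the 52 ARKit blend shape coefficients each
// frame and sends them over UDP as JSON. Illustrative only; Live Link's
// real packet format is not reproduced here.
final class BlendShapeStreamer: NSObject, ARSessionDelegate {
    private let session = ARSession()
    private let connection: NWConnection

    init(host: String, port: UInt16) {
        connection = NWConnection(host: NWEndpoint.Host(host),
                                  port: NWEndpoint.Port(rawValue: port)!,
                                  using: .udp)
        super.init()
        session.delegate = self
        connection.start(queue: .main)
    }

    func start() {
        // Face tracking needs a TrueDepth-capable device.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.run(ARFaceTrackingConfiguration())
    }

    // Called whenever tracked anchors update (up to 60 Hz).
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }
        // blendShapes maps coefficient names (e.g. jawOpen, eyeBlinkLeft)
        // to weights in the range 0.0 to 1.0.
        var payload: [String: Float] = [:]
        for (key, value) in face.blendShapes {
            payload[key.rawValue] = value.floatValue
        }
        if let data = try? JSONEncoder().encode(payload) {
            connection.send(content: data, completion: .contentProcessed { _ in })
        }
    }
}
```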
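The timecode options above all reduce to stamping frames against a chosen reference clock. Below is a minimal sketch assuming a fixed, non-drop frame rate; the `Timecode` type is a hypothetical helper, not the app's implementation, and drop-frame rates such as 29.97 fps would need extra handling.

```swift
import Foundation

// Hypothetical helper: derives an SMPTE-style HH:MM:SS:FF timecode from
// seconds elapsed since midnight, assuming a fixed non-drop frame rate.
struct Timecode: CustomStringConvertible {
    let hours, minutes, seconds, frames: Int

    init(secondsSinceMidnight: TimeInterval, frameRate: Int) {
        let totalFrames = Int(secondsSinceMidnight * Double(frameRate))
        frames = totalFrames % frameRate
        let totalSeconds = totalFrames / frameRate
        seconds = totalSeconds % 60
        minutes = (totalSeconds / 60) % 60
        hours = (totalSeconds / 3600) % 24
    }

    var description: String {
        String(format: "%02d:%02d:%02d:%02d", hours, minutes, seconds, frames)
    }
}

// Example: stamp "now" at 30 fps against the system clock. An NTP server
// or Tentacle Sync would substitute a different reference time here.
let elapsed = Date().timeIntervalSince(Calendar.current.startOfDay(for: Date()))
print(Timecode(secondsSinceMidnight: elapsed, frameRate: 30))
```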
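For the on-device preview, ARKit can render its own tracked face geometry over the camera feed. This sketch uses the real `ARSCNFaceGeometry` API, but the view-controller structure and wireframe styling are assumptions about how such an overlay might be wired up, not the app's actual implementation.

```swift
import ARKit
import SceneKit
import UIKit

// Sketch of a face-mesh preview: ARSCNView renders the camera feed and
// we overlay ARKit's tracked face geometry as a wireframe.
final class FacePreviewController: UIViewController, ARSCNViewDelegate {
    private let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        sceneView.delegate = self
        view.addSubview(sceneView)
        sceneView.session.run(ARFaceTrackingConfiguration())
    }

    // Provide a node when ARKit first detects the face anchor.
    func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
        guard anchor is ARFaceAnchor,
              let device = sceneView.device,
              let geometry = ARSCNFaceGeometry(device: device) else { return nil }
        geometry.firstMaterial?.fillMode = .lines  // wireframe overlay
        return SCNNode(geometry: geometry)
    }

    // Keep the mesh in sync with the tracked expression each frame.
    func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
        guard let faceAnchor = anchor as? ARFaceAnchor,
              let geometry = node.geometry as? ARSCNFaceGeometry else { return }
        geometry.update(from: faceAnchor.geometry)
    }
}
```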