The Live Link Face app is required to generate animation in real time or offline from depth data using an iOS device (iPhone or iPad). See the Capture Device Requirements page for more information about the required hardware specification.
Follow these steps to configure the iOS app so it is ready to generate real-time animation for your MetaHuman, or to capture depth data for offline processing.
Although the emphasis here is on capturing depth data, the Live Link Face app records video, depth, and audio for offline processing, all of which can be used to generate animation in Unreal Engine in different ways. For example, you could use only the audio data for audio-driven animation.
Download the latest version of the Live Link Face iOS app (version 1.5.0 or later) from the Apple App Store.
Connect the device to the same local network as the PC running Unreal Engine.
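If automatic discovery later fails, it helps to confirm that the PC and the iOS device are on the same subnet. The short sketch below (an illustration, not part of the Live Link Face workflow; the `local_ip` helper name is ours) prints the PC's LAN IP address, which you can compare against the IP shown on the iOS device in Settings > Wi-Fi.

```python
import socket

def local_ip() -> str:
    # Open a UDP socket toward a routable address; no packets are sent,
    # the OS just selects the outbound interface, whose address we read back.
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        s.connect(("8.8.8.8", 80))
        return s.getsockname()[0]
    except OSError:
        # No route available; fall back to resolving the local hostname.
        return socket.gethostbyname(socket.gethostname())
    finally:
        s.close()

print(local_ip())
```

If the first three octets of the two addresses differ (for example, `192.168.1.x` versus `192.168.4.x`), the devices are likely on different networks or VLANs and discovery will not work.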
Launch the Live Link Face app. Accept the EULA and, when prompted, allow the app to access the camera, microphone, and the local network.
You must accept the local network permission to use automatic network discovery in Unreal Engine, which makes setup easier. If you accidentally deny the app network access, you can resolve this by opening the iOS Settings app, navigating to Privacy & Security > Local Network, and enabling the option for Live Link Face.
In the Settings screen, ensure that the Capture Mode is set to MetaHuman Animator.
(Optional) For real-time animation, enable the Realtime Animation option. This will cause the app to start generating animation data.
Tap Done to exit the Settings screen.
Once the Live Link Face app is running, it is ready to connect to Unreal Engine, either through a Live Link Face source in Live Link or through Capture Manager in Live Link Hub.