Use Live Link on a mobile device to capture facial animation and apply it to your MetaHumans in Unreal Engine in real time. In addition to capturing facial performance, Live Link also sends the head's rotation to Unreal Engine, which allows for a more natural range of movement.
Required Setup
You can only use the Live Link Face app on an iOS mobile device (iPhone or iPad) that supports ARKit. Your device must have a TrueDepth camera, which is available on these models:
iPhone: iPhone X or newer.
iPad: iPad Pro (3rd generation) or newer.
Before you can follow the steps in this guide, you need to complete the following required setup:
On your mobile device, download and install the Live Link Face for Unreal Engine app from the Apple App Store.
Create a new Unreal Engine project. You can use any template you want, but for the best results, start with a blank Level inside your project.
Create a MetaHuman in MetaHuman Creator.
Download your MetaHuman and export them to Unreal Engine. Refer to the Downloading and Exporting MetaHumans section if you need additional information on how to complete this step.
From the Unreal Editor main menu, go to Edit > Plugins and make sure the following plugins are enabled for your project:
Live Link
Live Link Control Rig
Apple ARKit
Apple ARKit Face Support
These plugins should be enabled by default after you import at least one MetaHuman into your project. If you want to verify this programmatically, see the sketch after this list.
(Optional) Add your MetaHuman to the Level. This makes it easier to enable and preview Live Link.
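As a quick alternative to checking the Plugins window by hand, you can log which of these plugins are enabled from editor or game code. The sketch below is a minimal, assumption-level example: the plugin identifiers ("LiveLink", "LiveLinkControlRig", "AppleARKit", "AppleARKitFaceSupport") are inferred from the display names above, so confirm them against the plugin folders in your engine installation before relying on this check.

```cpp
// Minimal sketch: warn about any required Live Link / ARKit plugin that is not enabled.
// Plugin identifiers are assumptions based on their display names; verify them against
// your engine's Plugins folders. Requires the "Projects" module in your Build.cs.
#include "CoreMinimal.h"
#include "Interfaces/IPluginManager.h"

void CheckLiveLinkPluginsEnabled()
{
    const TCHAR* RequiredPlugins[] = {
        TEXT("LiveLink"),
        TEXT("LiveLinkControlRig"),
        TEXT("AppleARKit"),
        TEXT("AppleARKitFaceSupport")
    };

    for (const TCHAR* PluginName : RequiredPlugins)
    {
        TSharedPtr<IPlugin> Plugin = IPluginManager::Get().FindPlugin(PluginName);
        if (!Plugin.IsValid() || !Plugin->IsEnabled())
        {
            UE_LOG(LogTemp, Warning, TEXT("Plugin '%s' is not enabled for this project."), PluginName);
        }
    }
}
```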
Connect Live Link to Unreal Engine
Follow the steps below:
1. Find Your Computer's IP Address
You need this to configure the link between your iOS device and the instance of Unreal Engine running on your computer.
On Windows, follow these steps:
Right-click your connection icon in the taskbar. Then, from the context menu, select Open Network & Internet Settings.
Scroll down to the Properties section (or click the Properties button, depending on your OS version), then write down or copy the IPv4 Address value.
On macOS, follow these steps:
Open System Preferences.
Click the Network icon.
Select the Network you are currently connected to.
Click the Advanced button.
In the window that opens, click the TCP/IP tab, then write down or copy the IPv4 Address value.
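Alternatively, you can query the address from inside the engine itself. The sketch below is an assumption-level example that uses Unreal's socket subsystem (the "Sockets" module) to log the machine's primary local address; on computers with several network adapters it may not return the adapter your iOS device can reach, so prefer the OS steps above if the values differ.

```cpp
// Minimal sketch: log this machine's primary local IP address from inside Unreal Engine.
// Requires the "Sockets" and "Networking" modules in your Build.cs.
#include "CoreMinimal.h"
#include "SocketSubsystem.h"
#include "IPAddress.h"

void LogLocalIPAddress()
{
    if (ISocketSubsystem* SocketSubsystem = ISocketSubsystem::Get(PLATFORM_SOCKETSUBSYSTEM))
    {
        bool bCanBindAll = false;
        // GetLocalHostAddr returns the address the engine considers its local host.
        TSharedRef<FInternetAddr> Addr = SocketSubsystem->GetLocalHostAddr(*GLog, bCanBindAll);
        UE_LOG(LogTemp, Log, TEXT("Local IP address: %s"), *Addr->ToString(/*bAppendPort=*/ false));
    }
}
```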
2. Configure the Live Link Face App
Configure the Live Link Face app on your iPhone or iPad to start using it with Unreal Editor.
Although the Live Link Face app requires an iOS device, you can work on your Unreal Engine project on either Windows or macOS.
On your iOS device, open the Live Link Face app.
Tap Settings (gear icon) in the upper-left corner.
Select Live Link, then Add Target.
In the Add Target screen, enter the IPv4 address you noted earlier.
Tap Add in the upper-right corner.
(Optional) In the Subject Name field, give your Live Link connection a name that’s easy to recognize.
To confirm that Live Link is connected properly, in Unreal Engine, from the main menu, go to Window > Virtual Production > Live Link. This opens the Live Link configuration settings window. You should see your iPhone or iPad listed as a Source.
In this example, the phone "iPhoneChris" is recognized as a Live Link source.
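You can also confirm the connection from code. The sketch below is written against the ILiveLinkClient interface (the "LiveLinkInterface" module) and lists the subject names the engine currently sees; with the app connected, your device's subject should appear. Treat the exact API as version-dependent and check it against your engine release.

```cpp
// Minimal sketch: log every Live Link subject the engine currently knows about.
// Requires the "LiveLinkInterface" module in your Build.cs; API details may vary by engine version.
#include "CoreMinimal.h"
#include "ILiveLinkClient.h"
#include "LiveLinkTypes.h"
#include "Features/IModularFeatures.h"

void LogLiveLinkSubjects()
{
    IModularFeatures& ModularFeatures = IModularFeatures::Get();
    if (!ModularFeatures.IsModularFeatureAvailable(ILiveLinkClient::ModularFeatureName))
    {
        UE_LOG(LogTemp, Warning, TEXT("Live Link client is not available. Is the Live Link plugin enabled?"));
        return;
    }

    ILiveLinkClient& LiveLinkClient =
        ModularFeatures.GetModularFeature<ILiveLinkClient>(ILiveLinkClient::ModularFeatureName);

    const TArray<FLiveLinkSubjectKey> Subjects =
        LiveLinkClient.GetSubjects(/*bIncludeDisabledSubject=*/ false, /*bIncludeVirtualSubject=*/ false);

    for (const FLiveLinkSubjectKey& Subject : Subjects)
    {
        UE_LOG(LogTemp, Log, TEXT("Live Link subject: %s"), *Subject.SubjectName.Name.ToString());
    }
}
```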
Configure the MetaHuman Blueprint
Next, you need to configure the MetaHuman's Blueprint to accept data from Live Link. You can do this in two ways:
From the Level Viewport
From the MetaHuman's Blueprint
From the Level Viewport
If you’ve added your MetaHuman to the Level Viewport, follow these steps:
In the Level Viewport, click your MetaHuman to select them.
With the MetaHuman Blueprint selected, in the Details panel, configure the following settings in the Live Link section:
ARKit Face Subj: Select your device from the drop-down.
Use ARKit Face: Enable this option.
If you’re also capturing body motion using more advanced MoCap technology, repeat these steps for the Live Link Body Subj and Use Live Link Body settings.
From the MetaHuman Blueprint
In the Content Browser, search for and open the BP_(MetaHumanName) Blueprint. For this tutorial, we are using the Taro preset, whose Blueprint is named BP_Taro.
In the Components panel, select the root Component. This is named BP_(MetaHumanName) (Self).
In the BP_(MetaHumanName) (Self) Component's Details panel, configure the following properties:
ARKit Face Subj: Select your device from the drop-down.
Use ARKit Face: Enable this option.
Compile and Save the Blueprint.
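If you configure many MetaHumans, you may prefer to set these values from editor code instead of clicking through each Blueprint. The sketch below is a heavily hedged example that uses Unreal's reflection system on a MetaHuman actor; the variable names "UseARKitFace" and "ARKitFaceSubj" are hypothetical guesses derived from the Details panel labels, so open your MetaHuman's Blueprint and check the actual variable names and types before using anything like this.

```cpp
// Assumption-level sketch: set Live Link related Blueprint variables on a MetaHuman actor
// through reflection. The property names below are HYPOTHETICAL, derived from the labels
// shown in the Details panel; look up the real variable names in BP_(MetaHumanName) first.
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "UObject/UnrealType.h"
#include "LiveLinkTypes.h"

void EnableARKitFaceOnMetaHuman(AActor* MetaHumanActor, FName FaceSubjectName)
{
    if (!MetaHumanActor)
    {
        return;
    }

    // Assumed bool variable behind the "Use ARKit Face" checkbox.
    if (FBoolProperty* UseFaceProp = FindFProperty<FBoolProperty>(MetaHumanActor->GetClass(), TEXT("UseARKitFace")))
    {
        UseFaceProp->SetPropertyValue_InContainer(MetaHumanActor, true);
    }
    else
    {
        UE_LOG(LogTemp, Warning, TEXT("Property 'UseARKitFace' not found; check the Blueprint variable name."));
    }

    // Assumed FLiveLinkSubjectName variable behind the "ARKit Face Subj" drop-down.
    if (FStructProperty* SubjProp = FindFProperty<FStructProperty>(MetaHumanActor->GetClass(), TEXT("ARKitFaceSubj")))
    {
        if (FLiveLinkSubjectName* Value = SubjProp->ContainerPtrToValuePtr<FLiveLinkSubjectName>(MetaHumanActor))
        {
            Value->Name = FaceSubjectName;
        }
    }
}
```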
Test the Live Link Connection
You can now test your Live Link setup to see if your MetaHuman reacts correctly to incoming data. Follow these steps:
If you haven't done so already, from the Content Browser, drag the MetaHuman Blueprint into the Level.
Move around the Viewport until you can clearly see the MetaHuman's face.
If you need more information about how to move around the viewport, refer to the Viewport Controls page in the Unreal Engine documentation.
On your iOS device, open the Live Link Face app and point the front camera at your face. You should see a tracking mesh overlaid on your face, reacting to your expressions and head movements.
This screenshot shows the tracking mesh in action. You can disable the mesh from the Live Link Face app's settings.
For best results, make sure that your facial features aren't obscured by any hair or accessories, such as glasses, and that your face is well-lit.
You should see the MetaHuman in your Viewport start reacting to your facial expressions and head movements.
The video below shows a MetaHuman in the Viewport whose head movement and expressions are driven in real time by a human's performance through Live Link.
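If the MetaHuman does not respond, it can help to check whether the face subject is actually delivering frames. The sketch below is a minimal example that evaluates the subject through the basic Live Link role and logs how many curve values (ARKit blend shapes) arrived in the latest frame. Pass the subject name shown in the Live Link window for your device; as with the earlier sketch, the exact API may differ between engine versions.

```cpp
// Minimal sketch: check whether a Live Link subject is delivering frame data and log
// how many curve values (ARKit blend shapes) the latest frame contains.
// Requires the "LiveLinkInterface" module; API details may vary by engine version.
#include "CoreMinimal.h"
#include "ILiveLinkClient.h"
#include "LiveLinkTypes.h"
#include "Features/IModularFeatures.h"
#include "Roles/LiveLinkBasicRole.h"

void LogFaceSubjectFrame(FName SubjectName) // pass the subject name shown in the Live Link window
{
    IModularFeatures& ModularFeatures = IModularFeatures::Get();
    if (!ModularFeatures.IsModularFeatureAvailable(ILiveLinkClient::ModularFeatureName))
    {
        return;
    }

    ILiveLinkClient& LiveLinkClient =
        ModularFeatures.GetModularFeature<ILiveLinkClient>(ILiveLinkClient::ModularFeatureName);

    FLiveLinkSubjectFrameData FrameData;
    const bool bHasFrame =
        LiveLinkClient.EvaluateFrame_AnyThread(SubjectName, ULiveLinkBasicRole::StaticClass(), FrameData);

    if (bHasFrame)
    {
        if (const FLiveLinkBaseFrameData* BaseFrame = FrameData.FrameData.GetBaseData())
        {
            UE_LOG(LogTemp, Log, TEXT("Subject '%s' is streaming %d curve values."),
                *SubjectName.ToString(), BaseFrame->PropertyValues.Num());
        }
    }
    else
    {
        UE_LOG(LogTemp, Warning, TEXT("No frame data received for subject '%s'."), *SubjectName.ToString());
    }
}
```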
Next Steps
With the Live Link connection now established, you can start recording facial animations for your MetaHuman using Sequencer and Take Recorder. For more information on how to get started, refer to the pages below in the Unreal Engine documentation: