MetaHuman Animator can be used to generate animation for your MetaHuman character from facial performances captured on a variety of video and audio devices. Animation can be generated in three ways:
- In real time, from any mono video camera (including a webcam), a mobile device (using the Live Link Face app), or an audio source.
- Offline, from depth data captured with a TrueDepth camera on an iOS device, or with a stereo head-mounted camera (HMC).
- Offline, from mono video or audio.
Realtime Animation Quickstart
To animate an assembled MetaHuman character in real time from a webcam:
1. Enable the MetaHuman Live Link plugin.
2. Open the Live Link window and add a new MetaHuman Video Source and Subject.
3. Add the character blueprint for an assembled MetaHuman Character to a level.
4. In the Details panel for the character in the level, navigate to the Live Link section and select the Live Link Subject created in step 2.
Your character should begin to animate based on your facial performance. To confirm that the subject from step 2 is visible to the editor, see the Python sketch below.
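The setup can be spot-checked from the editor's Python console. The sketch below mirrors the Get Live Link Enabled Subject Names Blueprint node from the Live Link plugin to list the subjects Live Link currently sees; the exact Python exposure and signature of this call are assumptions and can vary between engine versions.

```python
import unreal

# Mirrors the "Get Live Link Enabled Subject Names" Blueprint node; the
# Python exposure and signature are assumptions and may differ per version.
subject_names = unreal.LiveLinkBlueprintLibrary.get_live_link_enabled_subject_names(
    include_virtual_subject=False
)

# Log each subject so you can confirm the one created in step 2 appears
# before selecting it in the character's Live Link section.
for name in subject_names:
    unreal.log(str(name))
```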
Audio Driven Animation Quickstart
To generate animation offline from audio:
1. Enable the MetaHuman Animator plugin.
2. Create a new SoundWave asset using Take Recorder, or by importing an existing audio file.
3. Create a new MetaHuman Performance asset.
4. Double-click the asset to open the MetaHuman Performance asset editor.
5. Select Audio as the Input Type, and pick the SoundWave asset in the Audio field.
6. Click Process.
The same steps can also be scripted, as sketched below.
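What follows is a minimal, hedged sketch of driving this flow from the Unreal Editor's Python API (see Python Scripting below), modeled on the example scripts that ship with the MetaHuman plugin. The MetaHumanPerformance class, the MetaHumanPerformanceFactoryNew factory, the input_type and audio properties, the DataInputType.AUDIO enum value, and the set_blocking_processing/start_pipeline calls are assumptions to verify against the scripts bundled with your engine version; the asset names and paths are hypothetical.

```python
import unreal

# Create a MetaHuman Performance asset (step 3 above, scripted). Class and
# factory names are assumptions based on the plugin's example scripts.
asset_tools = unreal.AssetToolsHelpers.get_asset_tools()
performance = asset_tools.create_asset(
    asset_name="AudioPerformance",       # hypothetical asset name
    package_path="/Game/Performances",   # hypothetical content path
    asset_class=unreal.MetaHumanPerformance,
    factory=unreal.MetaHumanPerformanceFactoryNew(),
)

# Load an existing SoundWave asset (step 2 above); the path is hypothetical.
audio = unreal.load_asset("/Game/Audio/MyLine")

# Steps 5 and 6: select Audio as the input type, set the SoundWave,
# then run processing synchronously.
performance.set_editor_property("input_type", unreal.DataInputType.AUDIO)
performance.set_editor_property("audio", audio)
performance.set_blocking_processing(True)  # block until processing completes
performance.start_pipeline()
```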
Getting Started in MHA
Enable one or more plugins to use MetaHuman Animator in Unreal Engine.
Hardware Requirements
Recommended hardware requirements for MetaHuman Animator.
Recommended Unreal Engine Project Settings
These are the recommended project settings for using MetaHuman Animator.
Facial Performance Capture Guidelines
Guidelines and best practices for capturing facial performance to use with MetaHuman Animator.
Realtime Animation
Animate your MetaHuman in real time from any mono camera (including webcam), supported mobile device, or audio source.
Animation from Depth Data
Animate your MetaHuman from depth data captured with a TrueDepth camera on an iPhone, or a stereo head-mounted camera (HMC).
Animation from Mono Video
Animate your MetaHuman from mono video.
Audio Driven Animation
Learn how to process audio into realistic facial animations for your MetaHuman.
Python Scripting
Use the Python API and example scripts to automate MetaHuman Animator.
Mesh to MetaHuman
Create a MetaHuman Identity from a mesh or video footage.
Asset Reference
A more detailed reference for the assets used by MetaHuman Animator.