Before you can create a custom MetaHuman using the MetaHuman plugin, you must bring the data into the Mesh to MetaHuman pipeline.
There are two types of data you can use:
Meshes (Static or Skeletal)
Footage from an iPhone or a stereo head-mounted camera (HMC).
Both of these data types must meet specific requirements in order to be processed correctly.
Importing and Preparing a Mesh
You can use any static or skeletal mesh that depicts a character, whether it’s a scan, a sculpt, or an existing game asset.
For best results, make sure that your mesh meets the following requirements:
FBX or OBJ format.
Has a texture or material. Ideally, the mesh should be textured with the skin's albedo texture. Otherwise, it needs to have materials that separate the whites of the eyes (sclera) from the skin tone.
The whites of the eyes (sclera) are showing. Your mesh doesn't need to have separate eye sockets and eyeballs, but it must have open eyes and visible separation between the eye and eyelid.
Empty eye sockets are very likely to track poorly and produce fitting artifacts.
If your mesh isn't textured with the skin's albedo texture, it needs to satisfy the following lighting requirements:
Lighting is flat and front-facing.
Features are lit uniformly and symmetrically.
There’s minimal contrast between significant facial features (such as nasolabial folds and lips).
Importing a mesh in FBX format can take significantly longer than importing an OBJ mesh. As a best practice, we recommend using an OBJ file if the mesh you are importing has over 200,000 vertices. This is a known limitation.
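If you're working with OBJ files, you can check a mesh against the 200,000-vertex guideline above before importing it. The following is a minimal sketch that counts geometric vertices in a Wavefront OBJ file; the function names and threshold default are illustrative, not part of the MetaHuman tooling.

```python
def obj_vertex_count(path):
    """Count geometric vertex lines ("v x y z") in a Wavefront OBJ file."""
    count = 0
    with open(path, "r", encoding="utf-8", errors="ignore") as f:
        for line in f:
            # "v " marks a geometric vertex; "vt"/"vn" lines are UVs and normals.
            if line.startswith("v "):
                count += 1
    return count

def dense_mesh(path, threshold=200_000):
    """True if the mesh exceeds the vertex-count guideline mentioned above."""
    return obj_vertex_count(path) > threshold
```

For example, running `dense_mesh("MyScan.obj")` on a typical raw photogrammetry scan will often return True, in which case keeping the mesh in OBJ format avoids the slower FBX import path.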
Import your mesh using one of the workflows described on the Importing Assets Directly page in the Unreal Engine documentation, such as drag-and-drop.
In the import options window, make sure to enable the Combine Meshes option, if it isn’t already enabled. Failure to do so will result in an unusable imported mesh.
Preparing the Mesh
After the import completes, double-click the resulting Static Mesh Asset in the Content Browser to open it in the Static Mesh Editor.
If the mesh doesn’t have a texture, follow these steps to fix the issue:
Find the mesh's Material in the Content Browser and double-click it to open it in the Material Editor.
Drag the mesh's Texture from the Content Browser into the Material Editor. This creates a new node for the Texture.
Drag from the Texture node's RGB pin to the Material node's Base Color pin. If another node is already connected to the pin, this replaces the existing connection.
Save the Material and close the Material Editor.
If the mesh isn't oriented correctly (for example, if it’s facing down), follow these steps to fix the issue:
In the Content Browser, double-click the mesh to open it in the Mesh Editor.
In the Details panel, adjust the mesh’s Import Translation and/or Import Rotation properties.
(Optional) Depending on the orientation of your imported mesh, you may also need to enable the Force Front XAxis option in the Details panel.
In the Main Toolbar (above the Viewport), click the Reimport Base Mesh button.
You can disable the grid in the Mesh Editor from the Viewport menu: click Show, then uncheck the Grid option.
If you see visual artifacts in the viewport, it is highly likely that the mesh won't track well. Most artifacts on scanned meshes come from vertex/triangle splitting, usually caused by multiple UV sets. To fix this, make sure your mesh only has the UV set necessary to map the albedo texture.
Once you’re happy with the resulting mesh, save it, then close the Static Mesh Editor.
Importing Footage
Capture Source Configuration
Capture data comes from a Capture Source Asset, which tells Unreal Engine where the captured data is stored.
To create and configure a new Capture Source Asset, follow these steps:
Create a new Capture Source Asset. Right-click in the Content Browser, then select MetaHuman Animator > Capture Source.
Double-click the Capture Source to open it and select the Capture Source Type. You can choose from the following options:
LiveLink Face Connection: Connect to the LiveLink Face app to capture data in real time.
LiveLink Face Archives or HMC Archives: Connect to a local folder that contains capture data from an iOS or HMC device.
The steps to configure the Asset depend on the Capture Source Type you selected:
LiveLink Face Connection: Configure these fields:
Device Address: Enter the iOS device’s IP Address.
Device Control Port: Enter a port number on the computer that’s not currently in use.
Typically, a number higher than 15000 should be safe to use.
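Before entering a Device Control Port value, you can verify that the port isn't already in use on your machine. The following is a minimal sketch using Python's standard socket module; the function name is illustrative and not part of the MetaHuman tooling.

```python
import socket

def port_is_free(port, host="0.0.0.0"):
    """Return True if nothing on this machine is already bound to the port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        # SO_REUSEADDR lets the check succeed even if the port is merely
        # lingering in TIME_WAIT; an actively used port still fails to bind.
        s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        try:
            s.bind((host, port))
            return True
        except OSError:
            return False
```

For example, `port_is_free(15010)` returns True on most machines, making a port in that range a reasonable candidate for the Device Control Port field.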
LiveLink Face Archives or HMC Archives: Click the ellipsis (...) button next to the Storage Path field, then browse to and select the folder that contains your takes.
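Before pointing a Capture Source at a Storage Path, it can help to sanity-check that the folder actually contains take data. The following sketch assumes each take lives in its own subfolder, as the Storage Path step above implies; since the exact files inside each take vary by capture app and version, it only lists non-empty subfolders as candidate takes.

```python
from pathlib import Path

def candidate_takes(storage_path):
    """Return names of non-empty subfolders, one per likely take."""
    root = Path(storage_path)
    return sorted(
        d.name for d in root.iterdir()
        if d.is_dir() and any(d.iterdir())  # skip empty folders
    )
```

If this returns an empty list, you've probably selected a folder one level too deep (a single take) or too shallow (the parent of your archive folder).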
Import Footage to Unreal Engine
After you configure your Capture Source, you can start importing takes to Unreal Engine. Follow these steps:
Open the Capture Manager window. From Unreal Engine’s main menu, go to Tools > Capture Manager.
If you don’t see this menu option, make sure the MetaHuman plugin is enabled.
In the Capture Sources panel, select a Capture Source, then select the takes you want to import and click Add to Queue. The takes appear in the queue panel on the right side of the Capture Manager window.
You can only import footage from active sources, which show a green dot next to their name in the Capture Sources panel. If you’re transferring takes from an iOS device, make sure the LiveLink Face app stays connected to Unreal Engine and remains in the foreground until the transfer completes. Don’t lock the device or switch to another app while the transfer is in progress.
Once you’ve selected all the takes you want to import, click the Import All button, then wait for the import to complete.
In the Content Browser, click Save All to save the newly imported Assets.