This page outlines the Mesh to MetaHuman workflow for creating a MetaHuman Identity from an existing Static or Skeletal Mesh. The mesh can use any topology.
As of Unreal Engine 5.6, the Mesh to MetaHuman workflow no longer creates a new character in the MetaHuman Creator web application. Instead, after creating the Identity, use the Conform from Identity option in the MetaHuman Character Asset editor to create the character directly in MetaHuman Creator inside Unreal Engine.
Workflow
A video tutorial for this workflow is available on YouTube.
The video linked above was made using an older version of MetaHuman Animator. Steps and menus may be different if you are using a more recent version.
The Mesh to MetaHuman workflow consists of the following steps:
Import and prepare the character Mesh.
Create and populate a MetaHuman Identity Asset.
Create and track a Neutral Pose.
Run the Identity Solve.
Submit the Template Mesh to the MetaHuman Backend.
These steps are covered in detail below.
Import and Prepare the Character Mesh
Importing the Mesh
You can use any static or skeletal mesh that depicts a character, whether it’s a scan, a sculpt, or an existing game asset. For best results, make sure that your mesh meets the following requirements:
FBX or OBJ format.
Has a texture or material. Ideally, the mesh should be textured with the skin's albedo texture. Otherwise, it needs to have materials that separate the whites of the eyes (sclera) from the skin tone.
The whites of the eyes (sclera) are showing. Your mesh doesn't need to have separate eye sockets and eyeballs, but it must have open eyes and visible separation between the eye and eyelid.
Empty eye sockets are very likely to track poorly and generate fitting artifacts and other tracking imperfections.
If your mesh isn't textured with the skin's albedo texture, it needs to satisfy the following lighting requirements:
Lighting is flat and front-facing.
Features are lit uniformly and symmetrically.
There’s minimum contrast between significant facial features (nasolabial folds and lips).
Importing a mesh in FBX format can take significantly longer than importing an OBJ mesh. As a best practice, we recommend using an OBJ file if the mesh you are importing has over 200,000 vertices. This is a known limitation.
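The format recommendation above can be expressed as a small helper. This is an illustrative sketch, not part of any Unreal Engine API; the function name and the exact 200,000-vertex threshold (taken from the guidance above) are assumptions for demonstration.

```python
def recommended_import_format(vertex_count, source_format):
    """Suggest an import format per the Mesh to MetaHuman guidance.

    FBX import can take significantly longer for dense meshes, so OBJ
    is recommended above roughly 200,000 vertices. `source_format` is
    the format the mesh is currently in ("FBX" or "OBJ").
    """
    OBJ_THRESHOLD = 200_000  # vertex count above which OBJ is recommended
    if vertex_count > OBJ_THRESHOLD:
        return "OBJ"
    return source_format.upper()
```

For example, a 250,000-vertex scan delivered as FBX would be converted to OBJ before import, while a 50,000-vertex mesh can be imported in either format.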
Import your mesh using one of the workflows described on the Importing Assets Directly page in the Unreal Engine documentation, such as drag-and-drop.
In the import options window, make sure to enable the Combine Meshes option, if it isn’t already enabled. Failure to do so will result in an unusable imported mesh.
Preparing the Mesh
After the import completes, double-click the resulting Static Mesh Asset in the Content Browser to open it in the Static Mesh Editor.
If the mesh appears untextured, follow these steps to connect its texture to its material:
Find the mesh's Material in the Content Browser and double-click it to open it in the Material Editor.
Drag the mesh's Texture from the Content Browser into the Material Editor. This creates a new node for the Texture.
Drag from the Texture node's RGB pin to the Material node's Base Color pin. If another node is already connected to the pin, this overrides the old connection.
Save the Material and close the Material Editor.
Also, ensure that the mesh is correctly oriented in the viewport.
If you see visual artifacts in the viewport, the mesh is unlikely to track well. Most artifacts on scanned meshes come from vertex/triangle splitting, usually caused by multiple UV sets. To fix this, make sure your mesh has only the UV set needed to map the albedo texture.
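The UV-set rule above can be captured as a pre-flight check. This is a hypothetical helper sketched for illustration; it only encodes the "one UV set" guidance and does not inspect real mesh data.

```python
def uv_set_warning(uv_set_count):
    """Warn if a scanned mesh has extra UV sets.

    Multiple UV sets commonly cause vertex/triangle splitting, which
    shows up as viewport artifacts and hurts tracking. Only the UV set
    that maps the albedo texture should remain.
    """
    if uv_set_count > 1:
        return (f"Mesh has {uv_set_count} UV sets; remove all but the one "
                "mapping the albedo texture to avoid vertex splitting.")
    return None  # a single UV set is fine
```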
Once you’re happy with the resulting mesh, save it, then close the Static Mesh Editor.
Create and Populate a MetaHuman Identity Asset
Next, you will create and populate a MetaHuman Identity Asset. This Asset holds the MetaHuman's Face Mesh (Template Mesh), body type information, and pose information.
Follow the steps below:
Create a new MetaHuman Identity Asset. Right-click in the Content Browser. From the context menu, select MetaHuman > MetaHuman Identity.
If you create a MetaHuman from this Asset later, the filename of the Identity Asset you create will be your MetaHuman’s name in MetaHuman Creator.
Double-click the MetaHuman Identity Asset you created to open it in the Identity Asset Editor. Initially, this Asset is empty. You will populate it with data in the next steps.
In the Main Toolbar, click Create Components, then select From Mesh. Search for and select the mesh you imported.
After you complete this step, the Components panel on the left side of the MetaHuman Identity Asset Editor is populated with the Face and Poses Components, leaving you with an Identity Asset that contains all the required Components.
Create and Track a Neutral Pose
Before the MetaHuman backend can create a MetaHuman-compatible Template Mesh from the mesh you uploaded, you need to create and track a Neutral Pose.
To learn more about Neutral Poses, refer to the Key Concepts section on the Mesh to MetaHuman landing page.
Adjust Viewport Lighting
Changing the lighting in the viewport can make it easier to work with your mesh. You can switch between different lighting modes from the Viewport Toolbar, as shown below:
Lit
Unlit
Lighting Only
For meshes that are textured with the skin's albedo texture, Unlit mode works best. For other meshes, you may have to choose Lit mode and rotate the light around until the entire face is lit evenly. To rotate the light, hold down the L key (or Ctrl + L keys, depending on how your engine is configured), then hold down the left mouse button and drag.
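The lighting-mode choice described above is a simple decision rule. The helper below is an illustrative sketch (the function name is invented for demonstration), not an Unreal Engine API call.

```python
def viewport_lighting_mode(textured_with_albedo):
    """Pick a viewport lighting mode per the guidance above.

    Albedo-textured meshes read best in Unlit mode; other meshes need
    Lit mode, with the light rotated until the face is evenly lit.
    """
    return "Unlit" if textured_with_albedo else "Lit"
```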
Create and Track a Neutral Pose
To create and track a neutral pose, follow these steps:
In the Components panel, under Poses, select the Neutral Pose Component.
Click the Viewport Camera menu and change the Field of View to 20 degrees or less.
In the Viewport, position the camera so that the mesh is facing forward and the head is completely visible.
With the Neutral Pose Component still selected, click Promote Frame in the Main Toolbar or the + button in the bottom toolbar.
After you promote a frame, it will appear in the Frame Promotion Timeline that runs along the bottom of the Viewport. The first frame you promote will be set as the frontal view and marked with an additional (F) marker.
As this is the first Promoted Frame, its name is Frame 0. You can double-click the frame to rename it.
(Optional) Promote additional frames. In the Frame Promotion Timeline, click the Free Roaming Camera Mode button, then repeat the process outlined in step 4.
Only do this step if you need to manually fit details like ears or nostrils (for example, from a side profile view), and activate the relevant markers for those frames in the Markers Outliner. A frontal frame with a full view of the nasolabial folds, lips, and eyelids is always necessary. Additional frames should contain only Markers not present in the frontal frame, and should be added only if a feature (like the ears or nostrils) fits poorly after the Identity Solve step.
If you have multiple frames containing the same markers (for example, two different frontal frames), make sure that only one of them is used for the Identity Solve. The Used to Solve flag is a per-frame attribute that can be set in the Details panel with the Poses > Neutral Pose Component selected.
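The "one solve frame per marker" rule above can be checked programmatically. This is a hypothetical validation sketch: the function, and the way frames are represented as a name-to-(flag, marker set) mapping, are assumptions for illustration only.

```python
def duplicate_solve_frames(frames):
    """Find Promoted Frames whose active marker sets overlap.

    `frames` maps a frame name to a (used_to_solve, markers) pair.
    Each marker should contribute to the Identity Solve from exactly
    one frame, so any marker claimed by two solve frames is a conflict.
    """
    conflicts = []
    seen = {}  # marker -> name of the frame that already solves it
    for name, (used_to_solve, markers) in frames.items():
        if not used_to_solve:
            continue  # frames without the Used to Solve flag are ignored
        for marker in markers:
            if marker in seen:
                conflicts.append((seen[marker], name, marker))
            else:
                seen[marker] = name
    return conflicts
```

For example, two frontal frames that both solve the lips markers would be reported as a conflict, and you would clear the Used to Solve flag on one of them.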
With the frame you created still selected, in the main toolbar, click the Track Markers (Active Frame) button. This creates a series of markers along the facial features of your mesh.
If auto-tracking is enabled from the Promoted Frame's context menu, you can unlock the camera and move it around to see how well the tracker responds.
Lock the current frame. In the Frame Promotion Timeline, right-click the frame and make sure that the Lock Camera option is enabled.
Once a frame is locked, you can still navigate by setting the camera in free roaming mode. To do this, click the Camera button on the left of the Frame Promotion Timeline.
(Optional) Zoom in, then click and drag to adjust any markers that don't align correctly with facial features.
Run the Identity Solve
Still in the Identity Asset Editor, from the Main Toolbar, click the MetaHuman Identity Solve button. This button will only be enabled once at least one Promoted Frame exists in the timeline, and will produce good results only when the markers are active and well-tracked.
Identity Solve fits the Template Mesh vertices to the volume of the Neutral Pose Mesh you tracked in previous steps. This part of the process happens in the cloud.
After the Identity Solve completes, you can toggle between your original Mesh and the Template Mesh in the Viewport by clicking the A / B tabs in the Viewport Toolbar. The A and B buttons activate two separate frame buffers for the same Viewport camera, and promoting a frame creates a snapshot of that camera. You can configure Viewport settings, such as lighting, individually for each of these two frame buffers.
In addition, you can toggle the display of both the original Mesh and the Template Mesh in the same frame buffer from the viewport toolbar, by enabling or disabling the Neutral Pose and Template Mesh options. Do this to see if your two meshes "z-fight" for screen occupancy.
Submit the Template Mesh for Auto-Rigging
You can now submit the Template Mesh to the MetaHuman backend. In the Main Toolbar, click the Auto-Rig MetaHuman Identity button.
This creates an auto-rigged Skeletal Mesh with embedded MetaHuman DNA. The mesh is automatically downloaded and added to your Unreal Engine project in the same folder as the MetaHuman Identity Asset and has the same name as the Identity Asset, with the prefix SK_.
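The naming convention above (Identity Asset name plus an SK_ prefix) can be expressed as a one-line helper; the function itself is illustrative, not part of any MetaHuman API.

```python
def autorig_asset_name(identity_asset_name):
    """Name of the auto-rigged Skeletal Mesh created for an Identity.

    The downloaded Skeletal Mesh reuses the Identity Asset's name with
    an SK_ prefix, per the MetaHuman auto-rigging convention.
    """
    return f"SK_{identity_asset_name}"
```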
Unreal Engine will show a confirmation dialog when the process has finished. You will then be able to see the new Skeletal Mesh in the Content Browser.
Next Up
From Template Mesh
Create a MetaHuman Identity from template mesh data.