MetaHuman Creator
Conform Head From Identity Not Available on Linux/macOS
The Head > Conform > From Identity option is disabled on Linux and macOS as it is not possible to create a MetaHuman Identity asset on these platforms.
Workaround
A workaround is not available at this time.
Crash When Using UEFN Export and UEFN Island is Loaded
Unreal Engine will crash while running the UEFN Export assembly process if the UEFN island is loaded.
Workaround
Ensure UEFN is closed whilst running the UEFN Export assembly process.
UEFN Export Assembly Cannot Reference Assets From Other Plugins
The UEFN Export assembly process will fail if the MetaHuman Character references assets (for example, as part of the wardrobe) that are part of another plugin.
Workaround
Only reference assets from the current project or that are part of the MetaHuman Creator plugin.
Invalid Arguments Error When Autorigging
Using the conform workflow with a mesh that contains zero-area polygons will lead to an “Invalid Arguments” error when autorigging.
Workaround
Find and fix any zero-area polygons in the mesh by inspecting it for degenerate geometry. In Maya, for example, you can use the “Faces with zero geometry area” checkbox under Cleanup Options. Various other mesh-inspection tools can also detect degenerate geometry.
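Outside of Maya, a quick degenerate-face check can be scripted directly against a triangle mesh's raw data before conforming. A minimal sketch, assuming a triangulated mesh given as vertex coordinates plus vertex-index triples (the function name and data layout are illustrative, not part of any MetaHuman API):

```python
def zero_area_triangles(vertices, faces, eps=1e-12):
    """Return indices of triangles whose area is (near) zero.

    vertices: list of (x, y, z) tuples
    faces: list of (i, j, k) vertex-index triples
    """
    bad = []
    for idx, (i, j, k) in enumerate(faces):
        ax, ay, az = vertices[i]
        bx, by, bz = vertices[j]
        cx, cy, cz = vertices[k]
        # Edge vectors AB and AC
        ux, uy, uz = bx - ax, by - ay, bz - az
        vx, vy, vz = cx - ax, cy - ay, cz - az
        # |AB x AC| is twice the triangle's area
        crx = uy * vz - uz * vy
        cry = uz * vx - ux * vz
        crz = ux * vy - uy * vx
        double_area = (crx * crx + cry * cry + crz * crz) ** 0.5
        if double_area * 0.5 <= eps:
            bad.append(idx)
    return bad

# Example: the second face is degenerate (two identical corners)
verts = [(0, 0, 0), (1, 0, 0), (0, 1, 0)]
tris = [(0, 1, 2), (0, 1, 1)]
print(zero_area_triangles(verts, tris))  # → [1]
```

Faces reported by a check like this are the ones Maya's “Faces with zero geometry area” cleanup would flag.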
Grooms on macOS only support Shader Model 6 (SM6)
MetaHuman grooms on macOS only support SM6.
Workaround
Disable Shader Model 5 (SM5) and enable SM6 in your project.
Crash when overwriting an existing Actor Blueprint during assembly
Unreal Engine may crash when assembling a MetaHuman character with support for UAF enabled that overwrites an existing Actor Blueprint previously assembled for AnimBP use.
Workaround
Use one of the following methods before assembling your UAF MetaHuman:
Rename the MetaHuman character asset before export.
Change the target root directory in the Assembly Options to a new location.
Manually delete the existing Actor Blueprint asset exported for AnimBP usage before exporting the new UAF version.
Unable to Control Texture Resolution in DCC Export Assembly Pipeline
Textures created by the DCC Export assembly pipeline do not respect texture resolution settings made in MetaHuman Creator.
Workaround
The underlying texture graph assets for all skin textures (/MetaHumanCharacter/TextureGraphs/TGI_SkinDCC) and eyes (/MetaHumanCharacter/TextureGraphs/TGI_Eye_Sclera_sRGB) can be modified manually to achieve a similar result. As these assets ship with Unreal Engine, care must be taken to avoid conflicts when updating engine versions.
Incorrect Textures when assembling for DCC
When assembling a character using the DCC Export pipeline, the textures will appear incorrectly.
Workaround
Before assembling for DCC, disable ‘Substrate materials’ in the Project Settings, and disable ‘Virtual Textures’ in the Project Settings under the MetaHuman Character plugin.
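If you prefer to keep this in source-controlled config rather than toggling the editor UI, the Substrate side of the workaround can be expressed in DefaultEngine.ini. A sketch, assuming Substrate is controlled by the `r.Substrate` renderer variable; the MetaHuman Character plugin's ‘Virtual Textures’ option is easiest to change through the Project Settings UI, as its config key is not documented here:

```ini
; DefaultEngine.ini (project config) - a sketch, not a complete settings file
[/Script/Engine.RendererSettings]
; Disable Substrate materials before running the DCC Export assembly
r.Substrate=0
```

Remember to re-enable the setting afterwards if the rest of your project relies on Substrate.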
MetaHuman Modules are Rebuilt
MetaHuman modules are rebuilt in the installed engine from the Epic Games Launcher when creating new C++ projects, or when switching between repositories.
Workaround
No workaround other than to let the engine rebuild.
MetaHuman Clothing Assets Don’t Resize Immediately on Body Edit
When a user modifies a MetaHuman Body that has clothing garments applied, the clothing asset does not resize immediately.
Workaround
Navigate to a different panel within MetaHuman Creator, or save the asset.
Unable to Remove Rig Using Python API
It is currently not possible to remove a rig using the Python API once it has been created.
Workaround
A workaround is not available at this time.
Unable to Conform Body Using Python API
It is currently not possible to conform the body from a DNA file or a skeletal mesh via the Python API, even though an example for this is provided.
Workaround
A workaround is not available at this time.
MetaHuman Animator
Realtime Live Link Face Source Won’t Connect to the Live Link Face iOS App if ARKit Mode is Active
When using the Live Link Face iOS app, the Live Link Face Source within UE/Live Link Hub will be unable to successfully connect if the iOS app is configured to use the ‘ARKit’ mode as opposed to the ‘MetaHuman Animator’ mode.
Workaround
Configure the Live Link Face iOS app to use the ‘MetaHuman Animator’ mode. To do this, go to the settings screen within the app and, under the ‘Capture’ section, select ‘Mode’. Select ‘MetaHuman Animator’ and tap ‘Continue’. Note that this mode requires a device with a TrueDepth camera (i.e. a device which supports Face ID).
Artifacts Visible on the Depth Map
Rendering artifacts may be visible when viewing the depth map for a MetaHuman Identity with a large depth range selected in the camera settings of the Viewport. This is a visual issue only and does not impact the actual depth data (stored in the EXR format) or quality of the MetaHuman Identity and final animation results.
Workaround
In the viewport camera settings, set the depth range to the minimum possible value.
MetaHuman Capture
Calibration Processing
Grayscale Images Appear Red in the Image Viewer
Grayscale images will appear red within the image viewer for calibration processing. This does not affect the calibration process or its results.
Workaround
A workaround is not available at this time.
Unable to Cancel Auto Frame Selection
If board settings are incorrect for the footage, users will be unable to cancel the auto frame selection.
Workaround
Wait for the process to complete.
Auto Frame Selection Selects Blurry Frames
If footage contains blurry frames, automated frame selection may pick them for the process, but the calibration algorithm will discard them.
Workaround
A workaround is not available at this time.
Failure Warning Message Does not Persist
If the calibration process fails, the error toast message disappears after a few seconds and may be missed.
Workaround
A workaround is not available at this time.
Poor Quality Depth Generation Due to Mismatched Image Sequences and Calibrations
When generating depth for a Capture Data asset, the resulting depth image sequence may be low quality if the order of the Image Sequences and the Lens Files (within the Camera Calibration) do not match. For example, the first Lens File in the Camera Calibration needs to correspond to the first Image Sequence in the Capture Data.
Workaround
Image Sequences in the Capture Data and/or Lens Files in the Camera Calibration can be manually reordered to prevent this problem.
Poor Quality Depth Generation for Footage With Misaligned Start Timecodes
When generating depth from a Capture Data asset, the resulting depth image sequence may be low quality if the Image Sequences in the Capture Data do not have matching start timecodes. Depth generation requires that the Image Sequences start on the same frame and have matching frame rates.
Workaround
Ensure your stereo HMC footage is aligned with respect to start timecode and frame rate.
Incorrect video track duration after capturing high frame rate takes in Live Link Face 1.6.0
Version 1.6.0 of the Live Link Face app for iOS includes more flexible capture options, including the ability to capture footage at much higher frame rates than previously offered. On certain iOS devices, system performance constraints may cause video frames to be dropped from the recorded .mov files. Whilst these takes will be ingested into UE without error, the video tracks will appear to have a shorter duration than expected when viewed within the MetaHuman Animator identity or performance editors. This results in accelerated playback, unexpected animation results and audio misalignment.
Workaround
The simplest workaround is to re-record the take using a less demanding target frame rate. If that is not possible, the take can be repaired outside of UE using ffmpeg. Note that the take will need to be re-ingested once the following modifications are made:
1. Ensure ffmpeg is installed on your system and available as part of the PATH environment variable.
2. Ensure you have a copy of the problematic Live Link Face take data on your PC.
3. Navigate to the location of the take and open the directory with PowerShell.
4. In PowerShell, prepare the following ffmpeg command: ffmpeg -noautorotate -i input.mov -q:v 3 -filter:v fps=90 -c:v mjpeg output.mov. Be sure to specify the correct input.mov name for the given take; there should be a single .mov file within the take directory. Specify the relevant target frame rate for the fps= portion of the command.
5. Execute the command.
6. On completion, make a note of the final frame=(number) value output by ffmpeg. This is used to update the take metadata, which is required for successful ingest into UE.
7. Open the take.json file within the take directory and update the value of the "frames" JSON property to match the final frame count produced by ffmpeg.
8. Rename the new output.mov file to match the name of the original .mov file within the take. You will need to either delete or rename the original .mov file first; it will no longer be used for ingest.
9. Ingest the take with the updated .mov file using the target archive ingest approach.
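The manual repair steps above can also be scripted. A sketch, assuming ffmpeg is on PATH, the take directory contains a single .mov plus a take.json with a "frames" property, and the target frame rate is passed in by the caller; the function names are illustrative, not part of any shipped tooling:

```python
import json
import re
import subprocess
from pathlib import Path

def final_frame_count(ffmpeg_stderr):
    """Extract the last 'frame=N' progress value from ffmpeg's stderr log."""
    frames = re.findall(r"frame=\s*(\d+)", ffmpeg_stderr)
    if not frames:
        raise ValueError("no frame count found in ffmpeg output")
    return int(frames[-1])

def repair_take(take_dir, fps=60):
    """Re-encode the take's .mov at a fixed frame rate and sync take.json."""
    take_dir = Path(take_dir)
    movs = list(take_dir.glob("*.mov"))
    if len(movs) != 1:
        raise RuntimeError("expected a single .mov in the take directory")
    src, fixed = movs[0], take_dir / "output.mov"

    # Steps 4-5 of the manual workaround: re-encode at the target frame rate
    result = subprocess.run(
        ["ffmpeg", "-noautorotate", "-i", str(src), "-q:v", "3",
         "-filter:v", f"fps={fps}", "-c:v", "mjpeg", str(fixed)],
        capture_output=True, text=True, check=True)

    # Steps 6-7: take the final frame count from ffmpeg's progress log
    # (written to stderr) and store it in the "frames" property of take.json
    frames = final_frame_count(result.stderr)
    meta_path = take_dir / "take.json"
    meta = json.loads(meta_path.read_text())
    meta["frames"] = frames
    meta_path.write_text(json.dumps(meta, indent=2))

    # Step 8: keep the original under a .bak name and swap in the new file
    src.rename(src.with_suffix(".mov.bak"))
    fixed.rename(src)
    return frames
```

After running this against a copy of the take, re-ingest it using the target archive ingest approach as in step 9.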
Crash When Attempting to Run Calibration Generation on Monocular Footage
When opening the Generate Calibration window for Capture Data assets which contain only a single Image Sequence, users will encounter a crash when trying to run Automatic Frame Selection. A crash will also occur if any of the calibration board config parameters are modified.
Workaround
No workaround other than to avoid generating calibration for monocular Capture Data assets.
Incorrect Timecode Set on Soundwave Assets Ingested Using the Stereo Video Ingest Device
During ingest using the Stereo Video Ingest device, if the supplied audio .wav file is processed before the video files, the timecode of the resulting Soundwave asset can be set incorrectly. This results in the audio and video data being misaligned in the downstream processing steps which could affect output animation quality.
Workaround
This issue can be mitigated by naming the audio file in a way that places it alphabetically after the video file names in the directory.
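The mitigation relies on plain lexicographic (alphabetical) ordering of file names, so you can check how a candidate audio name sorts relative to the video files before ingesting. A minimal sketch with hypothetical file names:

```python
# Files are processed in lexicographic (alphabetical) order, so the audio
# file name must sort after every video file name in the take directory.
take_files = ["audio.wav", "bot_cam.mov", "top_cam.mov"]
print(sorted(take_files))   # audio.wav sorts first: processed before video

renamed = ["zz_audio.wav", "bot_cam.mov", "top_cam.mov"]
print(sorted(renamed))      # both videos now precede the audio file
```

Any rename that pushes the .wav after the video names (a trailing "zz_" prefix is just one option) avoids the incorrect timecode.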
MetaHumans in Unreal Engine
Faceboard Look At Feature Control doesn't work with Head Movement
When animating a MetaHuman face with the Look At controls on the faceboard while also using head movement, the Look At controls do not function as expected.
Workaround
Add an empty Face AnimBP to the Face that just feeds a Copy Pose from Parent to the output result.
Rig Mapper / RigMapper Op Plugins
RM_MHL_MHH and RM_MHH_MHL RigMapper Definitions are Missing Controls
The RM_MHL_MHH and RM_MHH_MHL RigMappers are missing outputs for the following controls: CTRL_L_mouth_thicknessInwardU.ty, CTRL_R_mouth_thicknessInwardD.ty, CTRL_R_mouth_thicknessInwardU.ty, and CTRL_L_mouth_thicknessInwardD.ty. In general use, this does not affect users as these controls are not solved for by MetaHuman Animator.
However, if a user uses the RM_MHL_MHH definition to retarget an Animation Sequence that contains activations of any of the CTRL_expressions_mouthLipsThickInward curves, as part of a chain of RigMapper definitions in an IK Retargeter asset's RigMapper Op stack operation, these curves will not be retargeted correctly, losing subtle animation on the inner lips.
Workaround
A possible workaround for animators using RigMapper who want to include the missing definitions is to copy the original RM_MHL_MHH and RM_MHH_MHL assets and add the missing definitions manually.
MetaHuman for Maya
Unreal Engine Crash When Importing DNA Without Geometry
The Expression Editor in MetaHuman for Maya supports exporting a DNA and FBX file that can be imported directly into Unreal Engine as a Skeletal Mesh (without using MetaHuman Creator). This DNA file does not contain geometry information, and using it with the Skeletal Mesh asset will cause Unreal Engine to crash.
Workaround
Use a DNA file that includes geometry; this can be the DNA file saved at any other point in the Expression Editor workflow.
Alternatively, import the DNA file directly into MetaHuman Creator using the Conform > Import DNA option.