Processing animation using MetaHuman Animator can be automated as part of a performance capture workflow with the Python API. The MetaHuman Animator plugin includes a number of example scripts that can be used as a reference and modified to suit your specific requirements. Python scripts should be executed using the UnrealEditor executable.
Use forward slashes (/) instead of backslashes (\) in any path that appears in a command, to avoid character-parsing problems.
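As a quick illustration (not part of the plugin), Python's pathlib can convert a Windows-style path to the forward-slash form before it is embedded in a command. The install location shown is an assumption; substitute your own engine path.

```python
from pathlib import PureWindowsPath

# Hypothetical install location; substitute your own Unreal Engine path.
script = PureWindowsPath(
    r"C:\Program Files\Epic Games\UE_5.5\Engine\Plugins\MetaHuman"
    r"\MetaHumanAnimator\Content\Python\process_performance.py"
)

# as_posix() yields the forward-slash form expected on the command line.
print(script.as_posix())
```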
Audio Driven Animation
An example script for solving animation from audio is provided in the plugin. This can be used as a reference, and modified to suit your specific requirements. It can be found in the following location:
\Engine\Plugins\MetaHuman\MetaHumanAnimator\Content\Python\process_audio_performance.py

Consult the ReadMe.txt file that accompanies this Python script for additional information about exporting Animation Sequences or Level Sequences using the export_performance.py script.
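As a sketch, the script can be launched with the UnrealEditor executable using the same -ExecutePythonScript pattern shown later on this page. The engine install path and project name below are assumptions, and any script-specific arguments should be taken from the accompanying ReadMe.txt.

```python
import subprocess

# Assumed locations; adjust for your installation and project.
editor = "C:/Program Files/Epic Games/UE_5.5/Engine/Binaries/Win64/UnrealEditor.exe"
script = ("C:/Program Files/Epic Games/UE_5.5/Engine/Plugins/MetaHuman/"
          "MetaHumanAnimator/Content/Python/process_audio_performance.py")

# Forward slashes throughout, per the note above.
cmd = [editor, "MyProject.uproject", f"-ExecutePythonScript={script}"]
print(" ".join(cmd))

# Uncomment to actually launch the editor and run the script:
# subprocess.run(cmd, check=True)
```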
Animation from Depth Data
An example script for solving animation from depth data is provided in the plugin. This can be used as a reference, and modified to suit your specific requirements. It can be found in the following location:
\Engine\Plugins\MetaHuman\MetaHumanAnimator\Content\Python\process_performance.py

Consult the ReadMe.txt file that accompanies this Python script for additional information about creating an identity using create_identity_for_performance.py, exporting Animation Sequences or Level Sequences using the export_performance.py script, and rendering.
Generate Calibration
Calibration can be generated from a calibration take captured using a stereo camera pair. It requires a Capture Data asset (ingested using Capture Manager) and an instance of MetaHumanCalibrationGeneratorOptions. For example:
import unreal
calibration_generator = unreal.MetaHumanCalibrationGenerator()
calibration_generator.process(capture_data_asset, calibration_generator_options)

An instance of MetaHumanCalibrationGeneratorOptions can be created as follows. Most of the settings already have default values.
calibration_generator_options = unreal.MetaHumanCalibrationGeneratorOptions()
calibration_generator_options.package_path.path = '/Game/GenerateCalibration/'
calibration_generator_options.auto_save_assets = True # Set by default
calibration_generator_options.sample_rate = 30 # Set by default
calibration_generator_options.board_pattern_width = 15 # Set by default
calibration_generator_options.board_pattern_height = 10 # Set by default
calibration_generator_options.board_square_size = 0.75 # Set by default
calibration_generator_options.sharpness_threshold = 5.0 # Set by default
Generate Depth
An example script for generating depth data from a stereo camera pair is provided. This can be used as a reference, and modified to suit your specific requirements. It can be found in the following location:
\Engine\Plugins\MetaHuman\MetaHumanAnimator\Content\DepthGenerator\depth_generator_example.py

The script can be run from a Windows terminal (such as PowerShell) using the following command, with the parameters updated as needed for your project:
.\UnrealEditor.exe "MyProject.uproject" -ExecutePythonScript="<path-to-ue-installation>/Engine/Plugins/MetaHuman/MetaHumanAnimator/Content/DepthGenerator/depth_generator_example.py --cd-package-path /Game/CaptureManager/Imports/ExampleTake/CD_Example"

Next Up
Mesh to MetaHuman
Create a MetaHuman Identity from a mesh or video footage.