Expression Pose editing is arguably the most significant part of Expression Editor, and the most intricate editing operation.
Entering expression editing is done similarly to other modes. Ensure you have the correct DNA loaded, then press the button found in the editing modes section of the Home screen.
Expressions Graph
In this graph each node represents an expression, and the dependencies illustrate how expressions combine, or how correctives are applied on top of one or more expressions.
The names of the nodes and the topology of the graph are fixed and represent the Rig Definition.
The Expressions Graph is a tabbed view. At different points in the workflow new tabs can be added, or tab navigation locked, in support of the workflow. The overall, whole-rig view tab is always available and always leftmost.
The graph performs two functions. First, it displays information about expression dependencies and their state, which makes it a great debugging and learning tool as you familiarize yourself with the MetaHuman rig. Second, it sets the context for key operations, such as propagating (or not) expression edits to dependent expressions.
Scene Assembly
As in other modes, it's first necessary to assemble the scene before poses can be edited.
This will create a fully functioning rig in the Maya scene. Unlike in other modes, only LOD0 editing is supported, and LOD0 is automatically selected on assembly.
Loading Animation
We strongly encourage using animation (captured, hand-keyed, or artificially driving the rig into specific expressions) as a starting point to inspect behavior and identify whether intervention is necessary.
FBX files containing animation can be loaded either for the Faceboard rig or directly onto the RigLogic channels. Such animation can be loaded from the buttons at the top; each command configures the scene to hook or unhook the faceboard and load animation on the appropriate channels.
Loading animation imposes some requirements on what's contained in the animation file:
To load animation on the Faceboard, the FBX file must contain animation on the appropriate channels of the Faceboard controls as they exist in the scene assembled in Expression Poses Editing Mode.
To load RigLogic raw animation, the FBX file must contain an object called CTRL_expressions, containing all raw control attributes, which can be keyed (e.g. CTRL_expressions.browDownL and similarly named attributes).
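As a pre-flight check, the naming convention above can be validated before attempting a load. This is a hypothetical helper, not part of Expression Editor; the regex and the sample joint channel name are assumptions for illustration:

```python
import re

# Hypothetical pre-flight check: verify that keyed channel paths follow the
# RigLogic raw-control convention, i.e. camelCase attributes on a node named
# CTRL_expressions, such as CTRL_expressions.browDownL.
RAW_CHANNEL = re.compile(r"^CTRL_expressions\.[A-Za-z][A-Za-z0-9]*$")

def invalid_raw_channels(channel_paths):
    """Return the channel paths that do not live on CTRL_expressions."""
    return [p for p in channel_paths if not RAW_CHANNEL.match(p)]

channels = [
    "CTRL_expressions.browDownL",
    "CTRL_expressions.jawOpen",
    "FACIAL_C_FacialRoot.rotateX",  # assumed joint channel, not a raw control
]
print(invalid_raw_channels(channels))  # -> ['FACIAL_C_FacialRoot.rotateX']
```

A check like this can save a round trip through the loader when an exported FBX was keyed on the wrong node.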
Frame Analysis
Whether the controls have been posed manually or animation is loaded on the channels, the rig can now be analyzed to show which nodes are active and contributing to the current frame's pose.
Click the Analyze Frame button, and the graph will update to show a heat map of the rig's activity in terms of poses (warmer, more intense node swatch colors indicate higher activity).
Once the most significant contributions to the frame have been spotted, it's possible to isolate their results by turning on Preview Selected Node from the RMB click menu. This will set the viewport to the full contribution of the last node in the selection. This setting is saved in the user preferences.
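The heat-map idea can be sketched as a simple mapping from a node's activity value to a swatch color. The ramp below is an illustrative assumption, not the tool's actual color scheme:

```python
# Illustrative sketch (assumed, not Expression Editor's actual code): map a
# node's activity in [0, 1] onto a cold-to-warm swatch color, as a frame
# analysis heat map might.
def activity_to_rgb(activity):
    """0.0 -> blue (inactive), 1.0 -> red (highly active)."""
    a = max(0.0, min(1.0, activity))  # clamp out-of-range activity values
    return (int(255 * a), 0, int(255 * (1.0 - a)))

print(activity_to_rgb(0.0))  # -> (0, 0, 255)
print(activity_to_rgb(1.0))  # -> (255, 0, 0)
```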
Once an expression has been identified as in need of editing, you can work from any existing graph view, or select dependencies and use Isolate Selected Node to create a new tab with a reduced node population.
Active Expression Editing
Most of Expression Editor’s toolkit serves the calibration of target expressions, both meshes and joints.
Because joints can be matched to meshes, and LOD0 can be seamlessly reduced to all other LODs, Expression editing mode ONLY works with LOD0 changes. While other LODs can be displayed and brought to par automatically, editing is intentionally constrained to the highest-resolution LOD.
To edit an expression, a node needs to be selected, and the button with green circles pressed. Alternatively, double-clicking a node will enable the user to edit that node’s expression.
If a valid selection exists, the button will change state and appearance, and the rest of the top toolbar will be disabled.
All tabs except the currently active tab will also be disabled, and the edit mode button will turn red.
In the current tab you will be able to set node locks to choose which expressions are affected by an edit (delta propagation). This can be done through the padlock icon on each node downstream of the currently edited node.
The lock state of a node is a global state: no matter which graph tab you change it in, it affects all tabs. The lock state of a node is also conditional: if all nodes upstream of another node are locked, the downstream node won't receive a delta.
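The lock rules can be sketched as a small graph predicate. The node names and the logic below are assumptions for illustration, not the tool's implementation:

```python
# Sketch of the delta-propagation eligibility rule (assumed logic with
# hypothetical node names). `upstream` maps each node to the expressions it
# depends on; a combination depends on its component expressions.
upstream = {
    "browDownL": [],
    "browDownR": [],
    "browsDown": ["browDownL", "browDownR"],  # hypothetical combination node
}
locked = {"browDownL": True, "browDownR": True, "browsDown": False}

def receives_delta(node):
    """A node receives a delta only if it is unlocked itself AND at least
    one of its upstream nodes is unlocked (otherwise it is deactivated)."""
    if locked[node]:
        return False
    ups = upstream[node]
    return any(not locked[u] for u in ups) if ups else True

# Both upstream nodes are locked, so the combination is deactivated:
print(receives_delta("browsDown"))  # -> False
```

Unlocking either upstream node would make the combination eligible again, matching the conditional behavior described above.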
The geometry and joints outliners should become visible, and a vertical toolbar appears and becomes active.
You can find a full description of each command under the Utility Toolbar heading of the UX Overview section.
Previewing Upstream Expressions
Another key element when actively editing an expression is the set of preview sliders.
These sliders allow you to preview how upstream expressions would blend out of a combination. If an expression has fewer than two upstream expressions, only the global slider will be available.
The global slider at the top of this set blends all other sliders at the same time. This is effectively like blending between the neutral pose and the currently edited expression.
The successive sliders (if present) will instead each control the blend of an upstream expression. This is one of the most important artistic/quality inspection steps after the calibration of an expression, as it allows previewing if simpler (upstream) expressions combine well.
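The blend behavior described above can be sketched as simple linear blending. The function and sample data are hypothetical, not the tool's API; the assumption is that each slider linearly weights an upstream expression's per-vertex delta, with the global slider scaling all of them at once:

```python
# Minimal sketch (assumed linear blend math): each vertex position is
# neutral + sum(weight_i * delta_i), where delta_i is the displacement
# contributed by upstream expression i. The global slider scales every
# per-expression weight simultaneously.
def blended_position(neutral, deltas, weights, global_weight=1.0):
    pos = list(neutral)
    for delta, w in zip(deltas, weights):
        for axis in range(3):
            pos[axis] += global_weight * w * delta[axis]
    return tuple(pos)

neutral = (0.0, 0.0, 0.0)
deltas = [(1.0, 0.0, 0.0), (0.0, 2.0, 0.0)]  # two upstream expressions
print(blended_position(neutral, deltas, [1.0, 1.0]))       # -> (1.0, 2.0, 0.0)
print(blended_position(neutral, deltas, [1.0, 1.0], 0.5))  # -> (0.5, 1.0, 0.0)
```

With the global weight at 0.0 the result is the neutral pose, which is why the global slider effectively blends between neutral and the edited expression.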
Previewing Downstream Expressions
Preview sliders are excellent for previewing upstream expressions and verifying that they combine nicely into an actively edited combination, but they don't do anything for downstream expressions you might be propagating changes to.
Inspecting downstream nodes before committing to a change is left to preview meshes.
Three buttons are available to create preview meshes.
The three buttons will, respectively, do the following:
Create preview meshes for all selected nodes in the expression graph
Create preview meshes for all unlocked nodes downstream of the actively edited expression
Delete the meshes created by either or both the above
Expressions with Phases
When editing an expression with phases, the expression slider will have points at which individual phases can be edited. Clicking left or right steps through the phases; when the joints turn red, that phase can be edited. Once done editing, use the sliders to preview how the result will look.
Committing Changes Downstream (Delta Propagation)
We do not recommend the use of Advanced Delta Propagation to anybody other than a very experienced technical artist who’s grown very comfortable with what expressions do and how they combine. It’s for all intents and purposes the “Expert Mode” of Expression Editor.
Two distinct delta propagation modes exist. This behavior is controlled by an option in the Settings and Preferences menu.
Default delta propagation — Edits are propagated in full, to all unlocked and non-deactivated nodes (non-deactivated nodes being those with at least one upstream node unlocked), only when active editing is concluded (via the round button in the top toolbar, or by double-clicking another node to change editing activity).
Advanced delta propagation — Edits are propagated partially, to all unlocked and non-deactivated nodes, at the time of each change; changing the lock state of nodes and then proceeding to further edits propagates to the new selection without the need to leave active editing mode. This allows propagating small adjustments to complex selections while editing, without changing editing modality. The greater “flow”, however, produces changes that are more difficult to inspect and validate.
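The practical difference between the two modes can be sketched for a single downstream node. The semantics below are an assumption based on the descriptions above, not the tool's actual code: default mode honors only the lock state at commit time, while advanced mode honors the lock state at the moment of each edit.

```python
# Each edit is recorded with the downstream node's lock state at the moment
# the edit was made (assumed model, hypothetical values).
edits = [
    (1.0, True),   # delta of 1.0 made while the node was still locked
    (2.0, False),  # delta of 2.0 made after the node was unlocked
]

def default_mode(edits, locked_at_commit):
    """Default: the full accumulated delta is applied once, at commit,
    honoring only the lock state at that moment."""
    return 0.0 if locked_at_commit else sum(d for d, _ in edits)

def advanced_mode(edits):
    """Advanced: each partial delta is applied immediately, honoring the
    lock state at the time of that edit."""
    return sum(d for d, locked in edits if not locked)

print(default_mode(edits, locked_at_commit=False))  # -> 3.0
print(advanced_mode(edits))                         # -> 2.0
```

This is why the advanced mode's results can be harder to validate: the delta a node ends up with depends on when it was locked or unlocked during the session, not just on its final lock state.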
Once you're happy with the changes made to an expression, how it blends in from its upstream dependencies, and how the changes have been propagated downstream, you can leave active editing mode by clicking the round button in the top toolbar or by double-clicking the currently active node. This finalizes delta propagation to the vertices.
After exiting active editing of an expression, changes are committed to the scene, but only to the vertices. The rig is perfectly functional for previewing animation, but the information is entirely stored in “morph targets”; joints haven't been repositioned, and morph targets are only available at LOD 0. Joint Matching is a separate step by design that should follow vertex edits.
Double-clicking another node while actively editing a first one will also exit the current activity for the first, commit its changes, and start active editing for the other expression that's been double-clicked.
Joint Matching is described in far greater detail in the following section, but we want to make note here that, during expression editing, it's possible to do at least some Joint Matching at the expression level, albeit limited to the Machine Learning Based Joint Matching (MLJM) offered in the toolbar.
The very first time MLJM is run from the Expression Editing Toolbar, it will take a few minutes (depending on machine specs) because it needs to load the ML Model. Subsequent uses in the session will be instantaneous.
Scene Level Joint Matching
We recommend tackling Joint Matching methodically, as an optimization process, and not “artistically”. Creative contributions and likeness matching are best performed on vertices in the expression editing phase; Joint Matching is then responsible for minimizing the displacement contributed by the morph targets, so that as much information and accuracy as possible is represented by joint transformation alone.
For the most part this is done automatically by the Joint Matching passes you can run, but inspection is highly recommended, and some corrections in the first few stages are common.
Minimizing the difference between LOD 0 (joint AND vertex animation) and the joints-only animation doesn't require minimizing the distance between joints and vertices; in expressions this is, in fact, often counterproductive.
The minimization should instead aim to align the volume of the joints-only LODs (or LOD 0 without vertex animation) as closely as possible to the full-detail LOD 0.
In a good result joints influencing more than one vertex, and vertices influenced by more than one joint, hardly ever line up with vertices that were near them in the neutral pose.
Surface joints being close to vertices is only really possible in the Neutral Pose, and applies poorly to expressions.
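The objective can be illustrated with a toy case. Assuming a single joint that rigidly translates the vertices it drives, the least-squares position that minimizes the remaining morph-target displacement is simply the mean residual; this is a deliberate simplification of what the real NLS/ML solvers do, with hypothetical data:

```python
# Toy sketch of the objective: move a joint so that the vertices it drives
# (joints-only, no morph targets) best match the full-detail LOD 0 vertices.
# For a single translating joint rigidly driving its vertices, the
# least-squares optimum is the mean of the per-vertex residuals.
def optimal_translation(joints_only, full_detail):
    n = len(joints_only)
    return tuple(
        sum(full_detail[i][axis] - joints_only[i][axis] for i in range(n)) / n
        for axis in range(3)
    )

joints_only = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]   # joints-only positions
full_detail = [(0.0, 1.0, 0.0), (1.0, 3.0, 0.0)]   # joints + morph targets
print(optimal_translation(joints_only, full_detail))  # -> (0.0, 2.0, 0.0)
```

Note that the optimal joint offset (2.0 on Y) doesn't place the joint on either vertex; it balances the residuals, which is why matched joints rarely line up with nearby vertices in expressions.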
Joint Matching is more than just a frame-rate optimization. Joint animation interpolates in a way linear shape interpolation can't. Even if you were to use only LOD0 in a cinematic setting, we still recommend performing Joint Matching. Morph targets can greatly enrich motion, but should target subtleties and minimal displacement.
There are two types of joint matching: NLS, and ML.
NLS Joint Matching (NLSJM) is a well-established, more analytical process that can operate on variable sets of joints, while ML Joint Matching (MLJM) is an experimental, data-driven Machine Learning based approach that was trained on our database.
ML Joint Matching (MLJM) works quickly and in a single pass for humans with relatively normal proportions. It will however degrade when the target meshes deviate from normal human morphology. It’s often a good “I’m feeling lucky” thing to run and inspect the results of.
After running MLJM you might still want to run some of the NLS presets that take care of joints that MLJM won't position, such as those related to head turns. More details can be found in the relevant Known Issue.
NLSJM instead is run in multiple passes that affect specific sets of joints.
NLS Joint Matching
NLSJM is run in “passes”. Passes are sets of joints that the algorithm will be run on so they can be repositioned. Each pass positions joints to match the volume found in the meshes representing the expressions. We offer a complete list of the passes we found to work best, named sequentially as presets for convenience and guidance.
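As a toy illustration of the kind of least-squares fit a pass performs (not the shipped NLS solver), consider fitting the rotation of a single 2D joint so its rest-pose vertices best match the expression's vertices. The closed-form least-squares angle is the atan2 of the summed cross and dot products; the vertex data below is hypothetical:

```python
import math

# Fit the rotation of one 2D joint that best maps its rest-pose vertices
# onto the target (expression) vertices, in the least-squares sense.
def best_rotation(rest, target):
    cross = sum(x * ty - y * tx for (x, y), (tx, ty) in zip(rest, target))
    dot = sum(x * tx + y * ty for (x, y), (tx, ty) in zip(rest, target))
    return math.atan2(cross, dot)  # optimal angle in radians

rest = [(1.0, 0.0), (0.0, 1.0)]
target = [(0.0, 1.0), (-1.0, 0.0)]  # rest rotated by 90 degrees
print(math.degrees(best_rotation(rest, target)))  # -> 90.0 (up to precision)
```

A real pass solves many joints with rotations, translations, and skin weights coupled together, which is why it iterates rather than using a closed form, and why larger presets take longer to run.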
Running a Joint Matching pass is as simple as selecting which pass to run, and clicking the Run button.
The preset selection only affects NLSJM, and won’t do anything for MLJM.
Joint Matching can be interleaved with manual corrections; we'll offer guidance on a specific, usually successful sequence in the following section on Editing Recommendations.
Running a preset can take from several seconds to a handful of minutes, depending on the preset and how many joints are in it.
At least some NLSJM passes are always needed and worth running, even when MLJM has been run (see recommendations in the MLJM section.)
ML Joint Matching
MLJM is an experimental feature.
MLJM is a Machine Learning joint placement system that was trained on characters in our Database (the same database MetaHuman Creator uses).
It can operate on individual expressions (as seen in the expression editing toolbar), or on most expressions in the graph all at once.
ML Joint Matching Joint Exclusions
Expression Editor explicitly prevents MLJM from affecting the following joints:
(list of excluded joints)
This is generally not relevant for an end user, but might matter if you make direct use of the libraries and ignore Expression Editor specific implementation details.
MLJM will not affect some joints (listed above), which is all the more reason to run the NLSJM passes that will.
Update DNA
The DNA as it is in memory will be overwritten by the contents of the scene.
Be mindful of interactions between Joint Matching (which won't update the scene) and this update.
We recommend you always reload the DNA after Joint Matching, before starting or returning to Expression editing work.
This command analyzes the state of expressions in the scene, meshes and joints, then re-encodes all of them into the DNA that's in memory. The command saves a temporary file, which we don't recommend using; it doesn't constitute a save operation, which is a separate command (to its right).
Save DNA
This saves the DNA as it exists in memory to a file. No other calculations are performed, and it's the user's responsibility to ensure that the DNA in memory reflects all the changes they wish to save.
Until the user is well accustomed to the workflow and to which effects are visible at which point, we recommend reloading DNA files after they have been saved, before further operations.
Next Up
Saving and Exporting Data
Preparing the modified MetaHuman head DNA for use in MetaHuman Creator and Unreal Engine.