unreal.MetaHumanVideoBaseLiveLinkSubjectSettings

class unreal.MetaHumanVideoBaseLiveLinkSubjectSettings(outer: Object | None = None, name: Name | str = 'None')

Bases: MetaHumanLocalLiveLinkSubjectSettings

MetaHuman Video Base Live Link Subject Settings

C++ Source:

  • Plugin: MetaHumanLiveLink

  • Module: MetaHumanLocalLiveLinkSource

  • File: MetaHumanVideoBaseLiveLinkSubjectSettings.h

Editor Properties: (see get_editor_property/set_editor_property)

  • alpha (float): [Read-Write]

  • capture_neutral_frame_countdown (int32): [Read-Write]

  • capture_neutral_head_pose_countdown (int32): [Read-Write]

  • capture_neutral_head_translation_countdown (int32): [Read-Write] deprecated: CaptureNeutralHeadTranslationCountdown is deprecated. Use CaptureNeutralHeadPoseCountdown instead.

  • capture_neutrals_property (int32): [Read-Write]

  • dropping (str): [Read-Only] Whether video frames are being dropped because they cannot be processed fast enough.

  • focal_length (double): [Read-Only] The focal length of the video being processed.

  • fps (str): [Read-Only] Processing frame rate.

  • frame (str): [Read-Only] Frame number being processed.

  • frame_rate (FrameRate): [Read-Only] Last FrameRate estimated by the subject. If in Timecode mode, this will come directly from the QualifiedFrameTime.

  • head_orientation (bool): [Read-Write] When enabled, the rotational orientation of the head is output. You may want to disable this option if the head is being tracked by other means (e.g. mocap) or if you wish to analyze the facial animation on a static head.

  • head_stabilization (bool): [Read-Write] Reduces noise in head position and orientation.

  • head_translation (bool): [Read-Write] When enabled, and a neutral head position has been set, the position of the head is output. You may want to disable this option if the head is being tracked by other means (e.g. mocap) or if you wish to analyze the facial animation on a static head.

  • interpolation_processor (LiveLinkFrameInterpolationProcessor): [Read-Write] The interpolation processor the subject will use.

  • monitor_image (HyprsenseRealtimeNodeDebugImage): [Read-Write] Shows the video being processed. Options are None (no image), Input Video (the raw video), or Trackers (the video with tracking markers overlaid, which can be useful in analyzing the stability of the animation solve). Note that this monitoring takes up resources, so you may want to use it sparingly, especially at high webcam frame rates or in heavily loaded scenes.

  • neutral_frame (Array[float]): [Read-Write]

  • neutral_head_orientation (Rotator): [Read-Write] Head orientation

  • neutral_head_translation (Vector): [Read-Write] Head translation

  • outbound_name (str): [Read-Write] Name override that will be transmitted to clients instead of the subject name.

  • parameters (MetaHumanRealtimeSmoothingParams): [Read-Write] Smoothing

  • pre_processors (Array[LiveLinkFramePreProcessor]): [Read-Write] List of available pre-processors the subject will use.

  • properties (Array[Name]): [Read-Write] The properties to calibrate.

  • rebroadcast_subject (bool): [Read-Write] If enabled, rebroadcast this subject

  • remapper (LiveLinkSubjectRemapper): [Read-Write] Remapper used to modify incoming static and frame data for a subject.

  • remove (str): [Read-Only]

  • resolution (str): [Read-Only] The resolution of the video being processed.

  • rotation (MetaHumanVideoRotation): [Read-Write] Allows for the input video to be rotated by 90, 180, or 270 degrees prior to processing. This can be used to account for different camera mountings.

  • source (str): [Read-Only] Source that contains the subject.

  • state (str): [Read-Only] The state of the processing.

  • state_led (Color): [Read-Only]

  • subject_name (str): [Read-Only] Name of this subject.

  • timecode (str): [Read-Only]

  • translators (Array[LiveLinkFrameTranslator]): [Read-Write] List of available translators the subject can use.

  • translators_proxy (LiveLinkFrameTranslator): [Read-Write] Proxy property used to edit the translators.
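The editor properties above are accessed through the generic get_editor_property/set_editor_property methods, as on any Unreal object. A minimal sketch, runnable only inside the Unreal Editor's Python console with the MetaHumanLiveLink plugin enabled (in practice the settings object would come from a configured Live Link subject; it is constructed directly here purely to illustrate the access pattern):

```python
import unreal

# Illustration only: a real settings object would normally be obtained from
# an existing Live Link subject rather than constructed directly.
settings = unreal.MetaHumanVideoBaseLiveLinkSubjectSettings()

# Read-Write properties can be set by their snake_case names.
settings.set_editor_property("head_stabilization", True)
settings.set_editor_property("head_orientation", False)

# Read-Only properties can still be inspected.
print(settings.get_editor_property("state"))
print(settings.get_editor_property("resolution"))
```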

get_head_orientation() → bool

Get Head Orientation

Returns:

head_orientation (bool):

Return type:

bool

get_head_stabilization() → bool

Get Head Stabilization

Returns:

head_stabilization (bool):

Return type:

bool

get_head_translation() → bool

Get Head Translation

Returns:

head_translation (bool):

Return type:

bool

get_monitor_image() → HyprsenseRealtimeNodeDebugImage

Get Monitor Image

Returns:

out_monitor_image (HyprsenseRealtimeNodeDebugImage):

Return type:

HyprsenseRealtimeNodeDebugImage

get_rotation() → MetaHumanVideoRotation

Get Rotation

Returns:

out_rotation (MetaHumanVideoRotation):

Return type:

MetaHumanVideoRotation

set_head_orientation(head_orientation) → None

Set Head Orientation

Parameters:

head_orientation (bool)

set_head_stabilization(head_stabilization) → None

Set Head Stabilization

Parameters:

head_stabilization (bool)

set_head_translation(head_translation) → None

Set Head Translation

Parameters:

head_translation (bool)

set_monitor_image(monitor_image) → None

Set Monitor Image

Parameters:

monitor_image (HyprsenseRealtimeNodeDebugImage)

set_rotation(rotation) → None

Set Rotation

Parameters:

rotation (MetaHumanVideoRotation)
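The typed accessors listed above cover the common case of toggling head transform output, for example when the head is already tracked by other means such as mocap. A minimal sketch, again editor-only (the settings object is constructed directly here for illustration; in practice it would belong to an existing subject):

```python
import unreal

# Illustration only: construct a settings object to show the typed accessors.
settings = unreal.MetaHumanVideoBaseLiveLinkSubjectSettings()

# Disable head transform output so only facial animation is solved,
# leaving head tracking to an external system.
settings.set_head_orientation(False)
settings.set_head_translation(False)
settings.set_head_stabilization(True)

print(settings.get_head_orientation())
print(settings.get_head_translation())
```

The typed getters/setters and the generic get_editor_property/set_editor_property calls address the same underlying properties; the typed forms simply add compile-free name and type checking in scripts.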