unreal.MetaHumanVideoLiveLinkSubjectSettings
- class unreal.MetaHumanVideoLiveLinkSubjectSettings(outer: Object | None = None, name: Name | str = 'None')
Bases:
MetaHumanVideoBaseLiveLinkSubjectSettings

MetaHuman Video Live Link Subject Settings
C++ Source:
Plugin: MetaHumanLiveLink
Module: MetaHumanLocalLiveLinkSource
File: MetaHumanVideoLiveLinkSubjectSettings.h
Editor Properties: (see get_editor_property/set_editor_property)
alpha (float): [Read-Write]
capture_neutral_frame_countdown (int32): [Read-Write]
capture_neutral_head_pose_countdown (int32): [Read-Write]
capture_neutral_head_translation_countdown (int32): [Read-Write] deprecated: CaptureNeutralHeadTranslationCountdown is deprecated. Use CaptureNeutralHeadPoseCountdown instead.
capture_neutrals_property (int32): [Read-Write]
dropping (str): [Read-Only] Whether video frames are being dropped because they cannot be processed fast enough.
focal_length (double): [Read-Only] The focal length of the video being processed.
fps (str): [Read-Only] Processing frame rate.
frame (str): [Read-Only] Frame number being processed.
frame_rate (FrameRate): [Read-Only] Last FrameRate estimated by the subject. If in Timecode mode, this will come directly from the QualifiedFrameTime.
head_orientation (bool): [Read-Write] When enabled, the rotational orientation of the head is output. You may want to disable this option if the head is being tracked by other means (e.g. mocap) or if you wish to analyze the facial animation on a static head.
head_stabilization (bool): [Read-Write] Reduces noise in head position and orientation.
head_translation (bool): [Read-Write] When enabled, and a neutral head position has been set, the position of the head is output. You may want to disable this option if the head is being tracked by other means (e.g. mocap) or if you wish to analyze the facial animation on a static head.
interpolation_processor (LiveLinkFrameInterpolationProcessor): [Read-Write] The interpolation processor the subject will use.
monitor_image (HyprsenseRealtimeNodeDebugImage): [Read-Write] Shows the video being processed. Options are None (no image), Input Video (the raw video), or Trackers (the video with tracking markers overlaid, which can be useful in analysing the stability of the animation solve). Note that this monitoring takes up resources, so you may want to use it sparingly, especially at high webcam frame rates or in heavily loaded scenes.
neutral_frame (Array[float]): [Read-Write]
neutral_head_orientation (Rotator): [Read-Write] Head orientation
neutral_head_translation (Vector): [Read-Write] Head translation
outbound_name (str): [Read-Write] Name override that will be transmitted to clients instead of the subject name.
parameters (MetaHumanRealtimeSmoothingParams): [Read-Write] Smoothing
pre_processors (Array[LiveLinkFramePreProcessor]): [Read-Write] List of available preprocessors the subject will use.
properties (Array[Name]): [Read-Write] The properties to calibrate.
rebroadcast_subject (bool): [Read-Write] If enabled, rebroadcast this subject.
remapper (LiveLinkSubjectRemapper): [Read-Write] Remapper used to modify incoming static and frame data for a subject.
remove (str): [Read-Only]
resolution (str): [Read-Only] The resolution of the video being processed.
rotation (MetaHumanVideoRotation): [Read-Write] Allows the input video to be rotated by 90, 180, or 270 degrees prior to processing. This can be used to account for different camera mountings.
source (str): [Read-Only] Source that contains the subject.
state (str): [Read-Only] The state of the processing.
state_led (Color): [Read-Only]
subject_name (str): [Read-Only] Name of this subject.
timecode (str): [Read-Only]
translators (Array[LiveLinkFrameTranslator]): [Read-Write] List of available translators the subject can use.
translators_proxy (LiveLinkFrameTranslator): [Read-Write] Proxy property used to edit the translators.
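
Since all of the properties above are exposed through get_editor_property/set_editor_property, a minimal sketch of that workflow follows. It assumes a settings instance is in hand; in a running editor this would normally come from an active Live Link subject rather than being constructed directly, and the property values used here are illustrative only.

```python
import unreal

# Illustrative only: settings for a real subject would normally be obtained
# from the Live Link UI or an existing subject, not constructed directly.
settings = unreal.MetaHumanVideoLiveLinkSubjectSettings()

# Disable head pose output, e.g. when the head is already tracked via mocap
# or when analyzing facial animation on a static head.
settings.set_editor_property("head_orientation", False)
settings.set_editor_property("head_translation", False)

# Reduce noise in head position and orientation.
settings.set_editor_property("head_stabilization", True)

# Transmit a different subject name to clients (example value).
settings.set_editor_property("outbound_name", "FaceSubject")

# Read-only diagnostics can be polled with get_editor_property.
print(settings.get_editor_property("state"))
print(settings.get_editor_property("fps"))
print(settings.get_editor_property("dropping"))
```

Properties marked [Read-Only] can only be read; attempting to set them via set_editor_property is expected to fail.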