LiveLink Hub is a standalone executable that launches as a separate process on your workstation. Through LiveLink Hub, you can stream the data from your motion capture recording system to UEFN.
Use LiveLink Hub to capture data from your motion capture recording system to use as animations for your FN Mannequin characters, then use these animations to create movements that match your NPC's personality and theme.
You can then use the Take Recorder to record the live animation captured in LiveLink Hub.
To launch LiveLink Hub, navigate to Tools > LiveLink Hub.
See the LiveLink Hub tutorial to go through the steps of using this tool.
Using LiveLink Hub
Below is an explanation of LiveLink Hub's features and tools. LiveLink Hub has a main toolbar for recording your animation, along with three panels, each described below.
Record Button
Use this button to record and save your live animations. Click Record to start recording your animation, and click it again to stop and save the recording.
Timecode Bar
This bar shows the timecode for the data captured from your motion capture recording system.
Select the dropdown box to change the system timecode rate to 24, 30, or 60 fps.
Sources Panel
This panel shows the devices on your computer or network that will stream animation into UEFN, along with each source's type, source machine, and status.
In the Sources Panel, click Add Source to choose between the following LiveLink Sources:
Apple ARKit Source
LiveLinkInputDevice Source
Message Bus Source
Mocopi LiveLink
Pose AI App
Rokoko Studio Source
Vicon Data Stream Source
Subjects Panel
This panel holds animation subjects, which are the individual characters or objects being streamed.
Clients Panel
This panel displays the UE/UEFN sessions open on your computer or accessible on your network. By default, LiveLink Hub will automatically connect to and start streaming animation data to your UEFN session.
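The sources, subjects, and clients that these panels show can also be inspected from engine code in a standard Unreal Engine C++ project. The sketch below is only an illustration under that assumption: the Live Link plugin must be enabled, and the helper function LogLiveLinkSubjects is hypothetical. UEFN creators normally work entirely through the panels described above.

```cpp
// Minimal sketch (assumes a C++ Unreal Engine project with the Live Link
// plugin enabled). Lists every Live Link subject currently streamed by the
// connected sources, similar to what the Subjects panel shows.
#include "CoreMinimal.h"
#include "Features/IModularFeatures.h"
#include "ILiveLinkClient.h"
#include "LiveLinkTypes.h"

void LogLiveLinkSubjects() // hypothetical helper, not part of LiveLink Hub
{
    IModularFeatures& Features = IModularFeatures::Get();
    if (!Features.IsModularFeatureAvailable(ILiveLinkClient::ModularFeatureName))
    {
        return; // Live Link is not loaded, so there is nothing to enumerate.
    }

    ILiveLinkClient& Client =
        Features.GetModularFeature<ILiveLinkClient>(ILiveLinkClient::ModularFeatureName);

    // Each subject is an individual character or object being streamed.
    const TArray<FLiveLinkSubjectKey> Subjects =
        Client.GetSubjects(/*bIncludeDisabledSubject*/ false, /*bIncludeVirtualSubjects*/ true);

    for (const FLiveLinkSubjectKey& Subject : Subjects)
    {
        UE_LOG(LogTemp, Log, TEXT("Live Link subject: %s"),
            *Subject.SubjectName.Name.ToString());
    }
}
```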
This tutorial walks you through setting up your project to capture and record animations on an FN Mannequin character. You can then use these animations on a Fortnite character for the Character device.
You will first create an Inverse Kinematics (IK) Rig from an imported skeletal mesh asset for FN Mannequin. IK Rigs are used for manipulating areas on skeletal meshes to create animations.
Then, stream your motion capture data into LiveLink Hub and record it for use in your gameplay and cinematics. LiveLink Hub is a tool that provides a common interface for streaming animation data into UEFN.
Before starting this tutorial, create a project from the Animation template located in the Feature Examples section of the Project Browser.
Create an IK Rig
Follow the steps below to create an IK Rig from an imported skeletal mesh asset.
In your project's Content Browser, navigate to your project's folder.
Right-click your project folder, select New Folder, and name the new folder "Mocopi" to hold the mocopi import.
Visit the Epic Games Box site to download the mocopi skeletal mesh asset.
Once downloaded, select Import from the Content Browser toolbar, and select your downloaded mocopi asset.
Make sure the Skeleton field is empty when importing the .fbx file, then select Import All.
Take note of the assets created.
In the Content Browser, locate your Mocopi assets and right-click the "MocopiMannequin" skeletal mesh. Navigate to Create > IK Rig.
Double-click on the new IK Rig asset "IK_Mocopi Mannequin" to open the asset window.
In the new window, click Auto Create Retarget Chains, then Auto Create IK, and save the asset.
These two actions will create the FK Chains and IK Effectors needed for a full-body solve.
Note that the Mocopi skeleton will be the only asset recognized by UEFN.
Next, in the Content Browser, right-click on the new IK Rig asset, "IK_Mocopi Mannequin", and select Create IK Retargeter.
Double-click on the IK Retargeter asset you just created.
In the Details panel of the new window, open the Target IKRig Asset dropdown then select "IK_FN_Mannequin".
In the Viewport toolbar, click Running Retarget to switch the retargeter into Edit Retarget Pose mode.
Navigate to the Auto Align dropdown and select Align All Bones.
Save your asset.
You should now see the two sets of bones line up with each other. You can take this a step further by retargeting the source and target skeletons' base rotations.
Next, bring the live animation data into UEFN through LiveLink Hub.
Streaming into LiveLink Hub
Follow the steps below to stream your motion capture data into UEFN as an animation. This example uses a Sony mocopi motion capture system.
Navigate to Tools > LiveLink Hub to access the LiveLink Hub.
Add the mocopi source into LiveLink Hub by selecting Add Source > Mocopi LiveLink > Create Mocopi Source.
After the mocopi source has been added, you will see a new subject added in the Subjects panel. A green icon beside the subject indicates a healthy subject. A yellow icon indicates stale data that is not currently active.
In your UEFN session, drag the MocopiMannequin skeletal mesh into your level.
Take note of the LiveLink Hub connected message in the bottom toolbar of UEFN.
Next, add the FN_Mannequin to the level.
Select the MocopiMannequin in the Outliner or Viewport.
In the Details panel, click Add, then type "Performer Component".
Select the new Performer Component. In the Subject Name field, select your MocopiSkeleton subject.
In the Viewport, you will now see your motion capture applied to the MocopiMannequin.
Next, select the FN_Mannequin in the Viewport or Outliner.
In the Details panel, click Add then type "Retarget Component".
Select the new retarget component then open the Retarget Asset dropdown, and select the RTG_MocopiMannequin you previously created.
In the Details panel, open the Source Skeleton Mesh Component, then choose the SkeletalMesh under MocopiMannequin from the drop-down menu.
In the Viewport, you will now see animation on both the MocopiMannequin and FN_Mannequin meshes.
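At this point the MocopiSkeleton subject is driving both meshes. For reference only, the same streamed data can be read from engine code in a standard Unreal Engine C++ project. The sketch below assumes the Live Link plugin is enabled, reuses the "MocopiSkeleton" subject name from this example, and uses a hypothetical helper function; no code is required for this step in UEFN.

```cpp
// Minimal sketch (assumes a C++ Unreal Engine project with the Live Link
// plugin enabled and a streaming subject named "MocopiSkeleton", as in this
// example). Reads the latest animation frame for the subject and logs how
// many bone transforms it contains.
#include "CoreMinimal.h"
#include "Features/IModularFeatures.h"
#include "ILiveLinkClient.h"
#include "Roles/LiveLinkAnimationRole.h"
#include "Roles/LiveLinkAnimationTypes.h"

void LogMocopiBoneCount() // hypothetical helper for illustration
{
    IModularFeatures& Features = IModularFeatures::Get();
    if (!Features.IsModularFeatureAvailable(ILiveLinkClient::ModularFeatureName))
    {
        return; // Live Link is not loaded.
    }

    ILiveLinkClient& Client =
        Features.GetModularFeature<ILiveLinkClient>(ILiveLinkClient::ModularFeatureName);

    FLiveLinkSubjectFrameData FrameData;
    const bool bHasFrame = Client.EvaluateFrame_AnyThread(
        FLiveLinkSubjectName(FName(TEXT("MocopiSkeleton"))),
        ULiveLinkAnimationRole::StaticClass(),
        FrameData);

    if (!bHasFrame)
    {
        return; // Subject is stale or not streaming (yellow icon in LiveLink Hub).
    }

    if (const FLiveLinkAnimationFrameData* AnimFrame =
            FrameData.FrameData.Cast<FLiveLinkAnimationFrameData>())
    {
        UE_LOG(LogTemp, Log, TEXT("MocopiSkeleton streamed %d bone transforms"),
            AnimFrame->Transforms.Num());
    }
}
```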
Synchronizing Timecodes
In the steps below, you will set the UEFN timecode rate from LiveLink Hub. This ensures your animations are recorded at the correct frame rate.
In the LiveLink Hub, click on the timecode dropdown.
Check Enable Timecode Source and set the Timecode Provider to SystemTime at 30 fps.
LiveLink Hub will display the timecode at 30 fps, the current time, and a green icon to indicate that the timecode is being sent to UEFN.
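If you want to confirm the rate on the engine side, the current timecode and timecode frame rate are exposed through FApp in a standard Unreal Engine C++ project. The sketch below is an illustration under that assumption, with a hypothetical helper function; it is not required for the LiveLink Hub workflow.

```cpp
// Minimal sketch (assumes a standard Unreal Engine C++ project). Logs the
// engine's current timecode and timecode frame rate, which should report
// 30 fps once LiveLink Hub is providing the timecode as described above.
#include "CoreMinimal.h"
#include "Misc/App.h"
#include "Misc/Timecode.h"
#include "Misc/FrameRate.h"

void LogEngineTimecode() // hypothetical helper for illustration
{
    const FTimecode Timecode = FApp::GetTimecode();            // e.g. 16:04:12:07
    const FFrameRate FrameRate = FApp::GetTimecodeFrameRate(); // e.g. 30/1

    UE_LOG(LogTemp, Log, TEXT("Timecode %s at %.2f fps"),
        *Timecode.ToString(), FrameRate.AsDecimal());
}
```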
Recording Animation
In the steps below, you will capture an animation sequence on an FN Mannequin character to apply to a Fortnite character for the Character device.
In the UEFN toolbar, navigate to Window > Cinematics > Take Recorder.
From the Outliner, drag the FN Mannequin into the Take Recorder.
Press the record icon to start recording your animation.
Press the record icon again to end the recording.
Next, press the film icon to review the last recording and see your animation.
Your recorded level sequence will now load for you to review.
Your character will continue to animate from the live data, as if nothing had been recorded. The end result is an animation sequence on the Fortnite mannequin character, which you can apply to the Character device.