UE5.0 | VR Mocap: Vive Mocap Study
Megasteakman Vive Motion Capture Tool: Experiment, Try-Out, and Review
The video below is my first test after implementing Megasteakman's Vive tracking tool. A webcam view on the side shows the full motion-capture result in real time. The setup: 5 Vive Trackers + 2 Valve Index Controllers for the limbs and head, with no trackers to drive the elbow pole vectors. Quick and easy to set up.
Important Notice
As you can see in the video above, the elbows behave strangely, though that isn't a problem with the plugin. Megasteakman pointed out my setup error in the video comments: I had not cleared the tracker names in the elbow fields on the player pawn. Here's a second video addressing the issue.
Frame Rate Test for Live Event Scenario
With a complex environment and mostly dynamic lighting (no baked lighting in the set), this test ran at around 60 FPS. The mocap has very little delay, and my pose was captured mostly accurately. The Vive Tracker setup took me around 10 minutes. At this point I think this method is a better solution for live events than Rokoko or Neuron. The only downside is that the fingers are not very responsive, because the current VR controller finger detection isn't designed for finger mocap.
VRMocap Study - AnimBP
In the animation blueprint, the calibration pose is first saved in the animation instance. I assume this step stores the calibration data so you don't need to recalibrate every time you hit Play. It then calls the MocapActor component class, which contains most of the functions for the Vive Trackers.
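To make the idea concrete, here is a minimal stand-alone sketch of caching a calibration pose so it can be reused across runs. The struct and names are my own illustration, not the plugin's actual types:

```cpp
#include <map>
#include <string>
#include <array>

// Minimal stand-in for an engine transform: position only.
struct Transform {
    std::array<double, 3> position{0, 0, 0};
};

// Hypothetical calibration cache, mirroring the idea of storing the
// calibration pose on the animation instance so recalibration isn't
// needed every time you hit Play.
struct CalibrationCache {
    std::map<std::string, Transform> trackerPose; // tracker name -> pose at calibration
    bool calibrated = false;

    void Capture(const std::map<std::string, Transform>& liveTrackers) {
        trackerPose = liveTrackers;
        calibrated  = true;
    }
};
```

Once captured, the cached pose can be read by any later session instead of asking the performer to T-pose again.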
Next, it gets the player pawn. Calibrating by scaling the trackers' relative positions at runtime is quite an inspiring approach, though I don't think it's fully accurate: uniformly scaling the relative positions to match the virtual character ignores that limb proportions vary from person to person.
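The uniform-scaling idea reduces to simple arithmetic. This sketch (my own reading of the approach, with illustrative names) scales each tracker's offset from the pelvis by the ratio of character height to player height, which is exactly why mismatched limb proportions leak through:

```cpp
#include <array>

// Uniform-scale calibration: scale a tracker's offset from the pelvis by
// characterHeight / playerHeight. A single scalar cannot account for a
// player whose arms or legs are proportionally longer than the character's.
std::array<double, 3> ScaleOffset(const std::array<double, 3>& offsetFromPelvis,
                                  double playerHeight, double characterHeight) {
    const double s = characterHeight / playerHeight;
    return { offsetFromPelvis[0] * s,
             offsetFromPelvis[1] * s,
             offsetFromPelvis[2] * s };
}
```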
There are two macros used in this animation blueprint. GetGoalTransformsFromPlayer extracts the user's mocap setup and determines whether a tracker is in use for each goal; if not, the animation blueprint follows a transform relative to the pelvis instead of the goal (tracker).
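The per-goal fallback logic can be sketched like this (a standalone illustration of the idea, not the macro's actual implementation):

```cpp
#include <optional>
#include <array>

struct Xform { std::array<double, 3> pos{0, 0, 0}; };

// If a tracker is bound for this goal, use its transform directly;
// otherwise derive the goal from a stored rest offset relative to the
// pelvis, so untracked limbs still follow the body.
Xform ResolveGoal(const std::optional<Xform>& tracker,
                  const Xform& pelvis,
                  const std::array<double, 3>& restOffsetFromPelvis) {
    if (tracker) return *tracker;
    Xform goal;
    for (int i = 0; i < 3; ++i)
        goal.pos[i] = pelvis.pos[i] + restOffsetFromPelvis[i];
    return goal;
}
```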
The other macro, UpdateSteamVRFingerCurls/Splays, updates the finger data from SteamVR in real time.
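SteamVR's skeletal input reports a curl value per finger in the [0, 1] range. A simple way to consume it, shown here as my own sketch (the 90-degree maximum is an illustrative assumption, not a value from the plugin), is to map curl linearly onto a joint rotation range:

```cpp
#include <algorithm>

// Map a SteamVR finger curl in [0, 1] onto a joint pitch in degrees.
// Out-of-range input is clamped; maxPitch is an assumed full-curl angle.
double CurlToPitchDegrees(double curl, double maxPitch = 90.0) {
    curl = std::clamp(curl, 0.0, 1.0);
    return curl * maxPitch;
}
```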
In the AnimGraph there are many nodes and a fairly complex setup, so I'll only show the key part of this animation blueprint: the Control Rig implementation. The main purpose of the animation blueprint is to drive a Control Rig at runtime. It receives SteamVR data and sends it into the Control Rig so the body IK can be driven by the trackers.
The rest of the animation blueprint is the finger setup; he uses Transform (Modify) Bone nodes to control each finger's rotation separately. This approach is limited by the fact that VR controllers provide almost no interpolated data for finger movements.
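Because the controllers deliver coarse, stepped finger values, one workaround (my own suggestion here, not necessarily the plugin's) is to ease the applied rotation toward the latest target each frame instead of snapping to it:

```cpp
#include <algorithm>

// Frame-rate-aware easing: move `current` a fraction of the way toward
// `target` each tick, smoothing out stepped finger-curl input.
double SmoothTowards(double current, double target,
                     double deltaSeconds, double speed) {
    const double alpha = std::min(1.0, speed * deltaSeconds);
    return current + (target - current) * alpha;
}
```

In Unreal this corresponds to interpolating the curl value before feeding it into the Transform (Modify) Bone nodes.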
The final part of the animation blueprint implements an iPhone ARKit setup, so the user can decide whether the head rotation should blend with the iPhone ARKit head rotation.
VRMocap Study - MocapActor_Component
Megasteakman supports many different skeletons, multiple players, ragdoll, physics animation, and different types of mocap setup (full body and upper body). For this study I focus on how he connects the overall functions. MocapActor_Component contains the cast to VRMocap_PlayerPawn (the player pawn determines which actor belongs to which character), the remapping, and all the name lists and bone lists. Any character blueprint only needs to add this component to function as a mocap system.
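The "add one component, get mocap" design can be sketched as a reusable component that owns the tracker-to-bone remap, so each character only supplies its own bone names. All names below are illustrative, not the plugin's actual identifiers:

```cpp
#include <map>
#include <string>

// A reusable mocap component: the remap table lives here, so any character
// that adds the component and fills in its bone names works immediately.
struct MocapComponent {
    std::map<std::string, std::string> trackerToBone;

    void Remap(const std::string& tracker, const std::string& bone) {
        trackerToBone[tracker] = bone;
    }

    // Returns the mapped bone name, or an empty string if unmapped.
    std::string BoneFor(const std::string& tracker) const {
        auto it = trackerToBone.find(tracker);
        return it == trackerToBone.end() ? std::string{} : it->second;
    }
};
```

Keeping the mapping in the component rather than in each character blueprint is what makes the setup portable across skeletons.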
Last Update: April 2, 2022
Software: Unreal Engine 5.0
OS: Windows 10
Specs: RTX 3080, AMD 3900x, 64GB RAM
Reference:
VR Mocap for Unreal Engine 5 - Complete Set-Up Guide - https://youtu.be/crv6AIbadSo