Python + UE4.27: Hand Tracking Animation - 1

Webcam video, an AI hand-tracking model powered by MediaPipe, and real-time automatic animation in Unreal Engine 4.27.

The video below shows the result of using MediaPipe's AI hand-tracking model and transferring its output via OSC to animate a skeletal mesh in Unreal Engine 4.27, taking advantage of an existing hand-gesture model and its libraries. Extracting data from the camera image to control animation is the main focus of this test. The original goal is to drive more natural-looking animation by tracking a human performer.

The eagle's wing flapping is controlled by the orientation of my finger knuckles in real time. The way it works follows a pretty simple logic: image to data to animation. On the technical side I used OpenCV, and benefited hugely from Google's open-source MediaPipe, which provides a well-trained hand-tracking model for public use. Rather than recompiling OpenCV for Unreal Engine, I used Python to call and run the tracking functions, and the two programs communicate by sending the data over the OSC protocol. In Unreal Engine, I first created the eagle's Animation Blueprint as an OSC server to receive the data, then created the eagle's Control Rig to take control of its wings, and finally, in the Animation Blueprint, plugged the OSC data into the Control Rig as reference variables.
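In practice, the only thing crossing the boundary between the two programs each frame is a single OSC message carrying the five knuckle angles. As a minimal sketch of that message (using the same address, port, and payload shape as the full script in STEP 1), the Python side essentially does this:

from pythonosc import udp_client

client = udp_client.SimpleUDPClient("127.0.0.1", 5005)  # the UE4 OSC server address and port

client.send_message("/handOSC", [180.0, 180.0, 180.0, 180.0, 180.0])  # index, middle, ring, pinky, thumb angles in degrees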

Here's the diagram:

STEP 1: OpenCV and Mediapipe

The Python code is on my GitHub at the link below, and it is also listed in full here. It is not elegant, but it works for me at the moment: https://github.com/easylife122/MPhandTrackingOSCout/blob/main/main.py

Environment and libraries used in this test: Python 3.9, OpenCV (cv2), MediaPipe, NumPy, argparse, and python-osc (udp_client).
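For reference, the packages can be installed with pip under their usual PyPI names (argparse ships with the Python standard library):

pip install mediapipe opencv-python numpy python-osc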

import cv2

import mediapipe as mp

import numpy as np

import argparse

from pythonosc import udp_client


mp_drawing = mp.solutions.drawing_utils

mp_hands = mp.solutions.hands



# Calculate and draw finger joint angles

# Each entry is [fingertip, first joint, second joint] landmark indices for the index, middle, ring, pinky, and thumb


joint_list = [[8, 7, 6], [12, 11, 10], [16, 15, 14], [20, 19, 18], [4, 3, 2]]


def draw_finger_angles(image, results, joint_list):

   # Loop through hands

   for hand in results.multi_hand_landmarks:

       # Loop through joint sets

       for joint in joint_list:

           a = np.array([hand.landmark[joint[0]].x, hand.landmark[joint[0]].y])  # First coord

           b = np.array([hand.landmark[joint[1]].x, hand.landmark[joint[1]].y])  # Second coord

           c = np.array([hand.landmark[joint[2]].x, hand.landmark[joint[2]].y])  # Third coord


           radians = np.arctan2(c[1] - b[1], c[0] - b[0]) - np.arctan2(a[1] - b[1], a[0] - b[0])

           angle = np.abs(radians * 180.0 / np.pi)


           if angle > 180.0:

               angle = 360 - angle


           cv2.putText(image, str(round(angle, 2)), tuple(np.multiply(b, [640, 480]).astype(int)),

                       cv2.FONT_HERSHEY_SIMPLEX, 0.5, (15, 15, 15), 1, cv2.LINE_AA)

   return image



# Save finger angle data to joint_angle (index, middle, ring, pinky, thumb, in degrees)


joint_angle = [180, 180, 180, 180, 180]


def finger_angles(results, joint_list, joint_angle):


   for hand in results.multi_hand_landmarks:

       for joint in joint_list:

           a = np.array([hand.landmark[joint[0]].x, hand.landmark[joint[0]].y])  # First coord

           b = np.array([hand.landmark[joint[1]].x, hand.landmark[joint[1]].y])  # Second coord

           c = np.array([hand.landmark[joint[2]].x, hand.landmark[joint[2]].y])  # Third coord


           radians = np.arctan2(c[1] - b[1], c[0] - b[0]) - np.arctan2(a[1] - b[1], a[0] - b[0])

           angle = np.abs(radians * 180.0 / np.pi)


           if angle > 180.0:

               angle = 360 - angle

           if joint == [8, 7, 6]:

               joint_angle[0] = angle

           if joint == [12, 11, 10]:

               joint_angle[1] = angle

           if joint == [16, 15, 14]:

               joint_angle[2] = angle

           if joint == [20, 19, 18]:

               joint_angle[3] = angle

           if joint == [4, 3, 2]:

               joint_angle[4] = angle


   return joint_angle



# Parse the OSC target address once and create the UDP client before the capture loop

parser = argparse.ArgumentParser()

parser.add_argument("--ip", default="127.0.0.1", help="The ip of the OSC server")

parser.add_argument("--port", type=int, default=5005, help="The port the OSC server is listening on")

args = parser.parse_args()

client = udp_client.SimpleUDPClient(args.ip, args.port)


# Detect Hands and Draw Data

cap = cv2.VideoCapture(1)  # webcam index; use 0 if only one camera is attached


with mp_hands.Hands(min_detection_confidence=0.8, min_tracking_confidence=0.5) as hands:

   while cap.isOpened():

       ret, frame = cap.read()

       # Skip frames that fail to read from the webcam

       if not ret:

           continue


       # BGR 2 RGB

       image = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)


       # Flip on horizontal

       image = cv2.flip(image, 1)


       # Set flag

       image.flags.writeable = False


       # Detections

       results = hands.process(image)


       # Set flag to true

       image.flags.writeable = True


       # RGB 2 BGR

       image = cv2.cvtColor(image, cv2.COLOR_RGB2BGR)


       # Print the raw detection results (debug)

       print(results)


       # Rendering results

       if results.multi_hand_landmarks:

           for num, hand in enumerate(results.multi_hand_landmarks):

               mp_drawing.draw_landmarks(image, hand, mp_hands.HAND_CONNECTIONS,

                                         mp_drawing.DrawingSpec(color=(50, 50, 50), thickness=2, circle_radius=4),

                                         mp_drawing.DrawingSpec(color=(250, 50, 50), thickness=2, circle_radius=2),

                                         )



           # Draw angles to image from joint list

           draw_finger_angles(image, results, joint_list)

           finger_angles(results, joint_list, joint_angle)

           print(joint_angle)


       # Optionally save each frame to disk (would also need the os and uuid imports)

       # cv2.imwrite(os.path.join('Output Images', '{}.jpg'.format(uuid.uuid1())), image)


       # Show the annotated image

       cv2.imshow('Hand Tracking', image)


       # Send the five joint angles via OSC (the client was created once, above the loop)

       client.send_message("/handOSC", joint_angle)


       if cv2.waitKey(10) & 0xFF == ord('q'):

           break


cap.release()

cv2.destroyAllWindows()
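To run the script, launch it from a terminal; the OSC target defaults to 127.0.0.1:5005 and can be overridden with the flags defined above:

python main.py --ip 127.0.0.1 --port 5005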


STEP 2: Eagle Animation Blueprint (OSC Receiver & Animation Control)

Here's the Event Graph of the Animation Blueprint, which runs as the OSC server and the animation controller. For the Control Rig side you will need to create a few reference variables and make them editable. In this graph there are two variables used to transport the data: Index and Thumb. They control the wing flapping angles.
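Before wiring the data into the Blueprint, it can be handy to confirm what the stream looks like with a small stand-alone listener. This is only a test sketch using python-osc; it is not part of the Unreal setup, and it can only bind to the port while UE is not listening on it:

from pythonosc import dispatcher, osc_server


def print_angles(address, *angles):

    # The payload arrives as five floats: index, middle, ring, pinky, thumb

    print(address, [round(a, 1) for a in angles])


d = dispatcher.Dispatcher()

d.map("/handOSC", print_angles)


server = osc_server.BlockingOSCUDPServer(("127.0.0.1", 5005), d)

print("Listening on 127.0.0.1:5005 ...")

server.serve_forever()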

Here's the Animation Graph. Place the Control Rig between the Input Pose and the Output Pose, then plug the two parameters, Index and Thumb, into the Eagle Control Rig. (You need to check the Control Rig pins for WRotation_L and WRotation_R in the Details panel.)

STEP 3: Eagle Control Rig

Here I only show the Forward Solve, which is the main part. I created two reference float parameters, WRotation_R and WRotation_L, and two controls initially placed at the shoulder joints' positions and orientations: WingR and WingL. Get WRotation_R is plugged in to drive Rotation Y of WingR, which controls r_shoulder, and the left side is set up the same way.
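In the rig the OSC angle is plugged straight into the rotation, but how the knuckle angle gets scaled into a wing rotation is up to the rig. As an illustration of the mapping only (the -45 to 60 output range below is an assumed example, not the values in my Control Rig), it boils down to a linear remap:

# Hypothetical remap: knuckle angle in degrees (about 180 when the finger is straight)

# to a wing Rotation Y value; the output range is an assumption for illustration.

def angle_to_wing_rotation(knuckle_angle, out_min=-45.0, out_max=60.0):

    t = max(0.0, min(1.0, knuckle_angle / 180.0))       # normalize to 0..1

    return out_min + (1.0 - t) * (out_max - out_min)    # straight finger -> out_min


print(angle_to_wing_rotation(180.0))  # -45.0

print(angle_to_wing_rotation(90.0))   # 7.5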

Last Update:  February 4, 2022

Software: Unreal Engine 4.27, Python 3.9, Visual Studio Code

OS: Windows 10

Specs: RTX 3080, AMD 3900x, 64GB RAM

