R&D

Shader-Based Volumetric Clouds

Leveraging the volumetric cloud material nodes based on UE's built-in ray-march functions, we no longer need extra code to build heavy volume clouds and can instead use an extremely lightweight, optimizable workflow for realistic skies. The following video demonstrates what this technique can do. You can further decorate the sky with custom volume textures or 2D textures and fully control the shader inside the material.

Quick Tutorial on How to Set Up a Python Script and UE 5.1 for a MetaHuman Virtual AI

The first part of this tutorial sets up a Python script that connects to OpenAI's GPT-3.5 Turbo via Google speech recognition, so the script can turn your voice into an OpenAI prompt in real time. The second part uses the Oculus LipSync plugin for Unreal Engine and sets up a MetaHuman as the virtual AI. The final result is shown in the video below.
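As a reference, here is a minimal sketch of the voice-to-prompt part, assuming the SpeechRecognition and openai packages; the OpenAI client style shown is the 1.x API and the model name is assumed, so it may differ from the exact setup in the tutorial.

```python
# Minimal voice-to-prompt sketch (assumes the `SpeechRecognition` and
# `openai` packages; client style and model name are assumptions).
import speech_recognition as sr
from openai import OpenAI

client = OpenAI()            # reads OPENAI_API_KEY from the environment
recognizer = sr.Recognizer()

with sr.Microphone() as mic:
    print("Listening...")
    audio = recognizer.listen(mic)

# Google's speech recognition endpoint via the SpeechRecognition package
prompt = recognizer.recognize_google(audio)
print("You said:", prompt)

reply = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": prompt}],
)
print(reply.choices[0].message.content)
```

The returned text can then be passed to text-to-speech and the Oculus LipSync plugin to drive the MetaHuman's mouth shapes.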

Control Rig in Unreal Engine Enables a Seamless Workflow from Mocap to Animation

The Unreal Engine Control Rig plugin opens up a new way of editing animation for animators inside Unreal Engine. An animator can capture facial mocap with ARKit, record the data, and then edit the recorded facial data directly in Unreal Engine, whether that means re-recording it or adding a layer on top of the existing animation. The whole workflow is easy to set up and easy to edit. This saves a lot of time on lip sync and facial expressions while staying flexible enough to exaggerate the recorded data.

Webcam Video, an AI Hand-Tracking Model Powered by MediaPipe, and Real-Time Automatic Animation in Unreal Engine 4.27

The video below shows the result of using MediaPipe's hand-tracking model and transferring the data via OSC to a skeletal mesh in Unreal Engine 4.27 for animation. Extracting tracking data from camera images to control animation is the main focus of this test; the longer-term goal is to drive more natural-looking animation by tracking a human performer.
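A minimal sketch of the tracking-to-OSC bridge looks like the following, assuming the mediapipe, opencv-python, and python-osc packages; the OSC address pattern and Unreal's listening port are illustrative, not the exact values used in the test.

```python
# Webcam -> MediaPipe Hands -> OSC bridge (address and port are assumptions).
import cv2
import mediapipe as mp
from pythonosc.udp_client import SimpleUDPClient

osc = SimpleUDPClient("127.0.0.1", 8000)          # Unreal's OSC server (assumed)
hands = mp.solutions.hands.Hands(max_num_hands=1)
cap = cv2.VideoCapture(0)

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_hand_landmarks:
        for i, lm in enumerate(results.multi_hand_landmarks[0].landmark):
            # One OSC message per landmark: normalized x, y, z
            osc.send_message(f"/hand/landmark/{i}", [lm.x, lm.y, lm.z])

cap.release()
```

On the Unreal side, an OSC server receives these messages and maps the landmark values onto the skeletal mesh's bone transforms.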

TouchDesigner Real-Time audioAnalysis and OSC DAT Output | Unreal Engine 5.0 Automatic Animation

TouchDesigner provides the audioAnalysis component, which is very easy to set up and works with any kind of audio input. This operator lets me drive the character and set the animation to the music.
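As a rough illustration, a CHOP Execute DAT can forward the analysis channels to Unreal over an OSC Out DAT; the operator names and the /audio/ address pattern below are assumptions, not the exact network used in the video.

```python
# CHOP Execute DAT sketch inside TouchDesigner (operator names assumed):
# watches the audioAnalysis output and forwards each changed channel
# (e.g. low / mid / high bands) through an OSC Out DAT named 'oscout1'.
def onValueChange(channel, sampleIndex, val, prev):
    op('oscout1').sendOSC('/audio/' + channel.name, [val])
    return
```

In Unreal Engine 5.0, an OSC server then routes each /audio/* address to the animation parameters it should drive.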

Feather Groom Guides Procedural Tool for Unreal Engine

The purpose of this procedural feather tool is to create dynamic feather curves as groom guides for Unreal Engine's Groom system. The demo video below shows how to control each feather's shape (curve, density, width, length, tip size, root size), the number of feathers, the randomization of the feathers (scale, rotation, shape, normal), and the areas where the feathers grow.
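To give a sense of the idea, here is a hypothetical Python SOP sketch of generating randomized guide curves, assuming the guides are authored in Houdini and exported to Unreal's Groom system; every parameter name and range below is illustrative, not the tool's actual interface.

```python
# Hypothetical guide-curve generator (Houdini Python SOP context assumed).
import random
import hou

node = hou.pwd()
geo = node.geometry()

num_feathers = 20
segments = 8
width_attrib = geo.addAttrib(hou.attribType.Point, "width", 0.01)

for i in range(num_feathers):
    length = random.uniform(0.8, 1.2)    # randomized feather length
    bend = random.uniform(0.1, 0.3)      # how much the guide curves back
    root_x = random.uniform(-0.5, 0.5)   # scatter roots along a strip

    poly = geo.createPolygon()
    poly.setIsClosed(False)
    for s in range(segments):
        t = s / float(segments - 1)
        pt = geo.createPoint()
        # Simple arc: straight up with an increasing backward bend
        pt.setPosition(hou.Vector3(root_x, t * length, bend * t * t))
        # Taper the guide width from root to tip
        pt.setAttribValue(width_attrib, 0.02 * (1.0 - t) + 0.002)
        poly.addVertex(pt)
```

The real tool layers the shape, randomization, and area controls listed above on top of this kind of per-feather curve generation.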

Quick and Easy Project Setup Tool: Query ShotGrid and Create an Unreal Engine Project

In order to quickly synchronize project information, convert it into the team's folder structure, and comply with resource specifications, the project creation tool uProjectStarter was created to help project managers create projects quickly and reduce manual typing errors.
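A minimal sketch of the query-then-scaffold idea behind uProjectStarter, assuming the shotgun_api3 package; the site URL, script credentials, folder layout, and engine version are placeholders rather than the studio's real specification.

```python
# Query a ShotGrid project and scaffold an Unreal project folder from it.
from pathlib import Path
import shotgun_api3

sg = shotgun_api3.Shotgun(
    "https://yourstudio.shotgrid.autodesk.com",   # placeholder site
    script_name="uProjectStarter",
    api_key="YOUR_SCRIPT_KEY",
)

# Pull the project record the manager selected
project = sg.find_one("Project", [["name", "is", "MyShow"]], ["name", "sg_status"])

# Scaffold an Unreal-style folder structure from the ShotGrid data
root = Path("D:/Projects") / project["name"]
for sub in ("Content", "Config", "Plugins", "Source"):
    (root / sub).mkdir(parents=True, exist_ok=True)

# Write a bare-bones .uproject descriptor
(root / f"{project['name']}.uproject").write_text(
    '{ "FileVersion": 3, "EngineAssociation": "5.1", "Modules": [] }'
)
```

Because the folder names and project metadata come straight from ShotGrid, the tool removes the manual typing that usually causes naming mistakes.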

Megasteakman Vive Motion Capture Tool: Try-Out and Review

The video below is my first test implementing Megasteakman's Vive tracking tool. A webcam view on the side shows the real-time motion capture result alongside the footage. The setup uses five Vive Trackers plus two Valve Index controllers for the limbs and head, with no extra trackers driving the elbow pole vectors. It was quick and easy to set up.

Crowd cheering system in Unreal Engine 4.27, demonstrating random animation montages and cheering triggers

The video demonstrates random animations playing across the crowd, along with triggers that activate cheering for the blue team or the purple team. The trigger can be keyboard, mouse, OSC, or any other protocol supported in Unreal Engine, which makes the crowd very easy to control during a production shoot.
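For the OSC case, firing a cheer trigger can be as small as the sketch below, assuming the python-osc package; the address pattern and Unreal's listening port are illustrative, not the project's actual mapping.

```python
# Send crowd-cheer triggers over OSC (address and port are assumptions).
from pythonosc.udp_client import SimpleUDPClient

osc = SimpleUDPClient("127.0.0.1", 8000)    # Unreal's OSC server (assumed)

osc.send_message("/crowd/cheer", "blue")    # blue team cheers
osc.send_message("/crowd/cheer", "purple")  # purple team cheers
```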

Sending OSC data over nDisplay to control the frustum camera | Unreal Engine 4.26

This test has been on my mind for a long time, since I wanted to know whether an iPhone is steady enough to do camera tracking for nDisplay. Without further ado, here's the test video below.

Receiving live streaming video, removing the background in real time, and sending it to Unreal Engine 4.27

The idea of this test is to capture live streaming video in real time and remove the background, so the input video can be used in live-event scenarios. The main focus here is the background removal itself and evaluating the overall results from the cvzone AI model.
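The background-removal step itself is only a few lines with cvzone; the sketch below assumes the cvzone and opencv-python packages and a plain webcam input, and the green replacement color is just a placeholder (the hand-off to Unreal is a separate step).

```python
# Real-time background removal with cvzone's SelfiSegmentation wrapper.
import cv2
from cvzone.SelfiSegmentationModule import SelfiSegmentation

segmentor = SelfiSegmentation()
cap = cv2.VideoCapture(0)   # webcam stand-in for the live stream input

while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Replace the background with flat green so it can be keyed later
    keyed = segmentor.removeBG(frame, (0, 255, 0))
    cv2.imshow("keyed", keyed)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```

The keyed frames can then be forwarded to Unreal Engine 4.27 as a video texture for compositing in the live scene.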

Crowd simulation: triggering a baked fire simulation and RBD fracture into a specific state

The video below on the right is the final result: the Monarch starts in a flying state, transitions to an on-fire state, and then breaks into pieces, which is yet another state. Three simulations and caches are used in this shot. Unfortunately, I won't provide any Houdini file or show any node graphs here, but I'll try to explain the operation in as much detail as possible.

Real-time butterfly Niagara system using static mesh rendering and vertex animation textures (VAT)

A Niagara system combined with vertex animation textures enables a large number of animated particles while staying highly optimized for high FPS in the game engine. It works for both previz and final render purposes.

Using the Mantra Renderer and Pyro Shader for Realistic Fire

I noticed that the preset pyro fire simulation targets larger-scale fire; at smaller scales like a campfire, the default shader does not reach VFX-standard quality. This study therefore focuses on creating a smaller-scale, realistic fire. The simulation and fire movement themselves are not the main focus here.

Pyro Simulation with collision and custom field force

Creating a simulation that collides with meshes requires adding a collider in the pyro solver, and the same goes for the custom field force. In these three fire simulations, the shader is customized for a blue-greenish fire.