You can see in the video how the characters' lips sync with what they are saying and how their gaze always follows the player. The gesturing animation (obtained from Mixamo) also plays only while the characters are talking.
Tools Used
- Unreal Engine 5 (Blueprint and C++)
- MetaHuman framework
- Mixamo
- ICT's RIDE framework

Personal Contribution
- Researching and Engineering Lip Sync and Gaze Following Algorithms in Blueprint and C++
- Integrating MetaHumans with the aforementioned algorithms
- Animating MetaHumans with FaceRig and BodyRig
- Integrating Animations from Mixamo into MetaHumans to emulate gesturing
- Scene Design
The Story Behind this Project
The Institute for Creative Technologies' RIDE framework is, in principle, engine agnostic; currently, however, it only supports the Unity engine. During my time at ICT, I worked to bring RIDE into Unreal Engine 5, primarily by using UE5's MetaHuman framework to create virtual humans. This also benefits the VHToolKit (VHTK) by adding MetaHuman to the list of tools it can use to create virtual humans. I laid the foundation for UE5 and MetaHuman support by engineering algorithms that make our MetaHumans lip-sync to speech, follow the user with their gaze, and gesture while speaking: essentially the most basic conversational non-verbal behaviors.
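
To give a flavor of the gaze-following piece, here is a minimal sketch of how such a component could be written in UE5 C++. It is not the project's actual code: the class name UGazeFollowComponent, the interpolation speed, and the idea of exposing TargetHeadRotation for an Anim Blueprint to consume (e.g. via a Look At node or Control Rig) are all assumptions.

```cpp
#pragma once

#include "CoreMinimal.h"
#include "Components/ActorComponent.h"
#include "Kismet/GameplayStatics.h"
#include "Kismet/KismetMathLibrary.h"
#include "GazeFollowComponent.generated.h"

UCLASS(ClassGroup = (Custom), meta = (BlueprintSpawnableComponent))
class UGazeFollowComponent : public UActorComponent
{
    GENERATED_BODY()

public:
    UGazeFollowComponent()
    {
        PrimaryComponentTick.bCanEverTick = true;
    }

    // Read by the character's Anim Blueprint each frame, e.g. to feed
    // a Look At node or Control Rig that drives the head and eyes.
    UPROPERTY(BlueprintReadOnly, Category = "Gaze")
    FRotator TargetHeadRotation;

    virtual void TickComponent(float DeltaTime, ELevelTick TickType,
                               FActorComponentTickFunction* ThisTickFunction) override
    {
        Super::TickComponent(DeltaTime, TickType, ThisTickFunction);

        // Treat the player's camera as the gaze target.
        const APlayerCameraManager* Camera =
            UGameplayStatics::GetPlayerCameraManager(GetWorld(), 0);
        if (!Camera)
        {
            return;
        }

        // Rotation that would point the head at the camera.
        const FRotator LookAt = UKismetMathLibrary::FindLookAtRotation(
            GetOwner()->GetActorLocation(), Camera->GetCameraLocation());

        // Interpolate so the head turns smoothly instead of snapping.
        TargetHeadRotation =
            FMath::RInterpTo(TargetHeadRotation, LookAt, DeltaTime, 5.f);
    }
};
```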
The animations were obtained from Mixamo, and their skeleton was then retargeted to work with MetaHumans.
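
To make the speech-gated gesturing concrete, here is one hedged way to play the retargeted Mixamo gesture only while the character is talking. AVirtualHuman, GestureMontage, and the OnSpeechStarted/OnSpeechFinished hooks are hypothetical names, assuming the character is an ACharacter subclass whose speech playback raises start/finish events.

```cpp
// Hypothetical ACharacter subclass; GestureMontage would be a
// UAnimMontage* UPROPERTY assigned in the editor.
void AVirtualHuman::OnSpeechStarted()
{
    if (UAnimInstance* Anim = GetMesh()->GetAnimInstance())
    {
        // Play the retargeted Mixamo gesture while the line plays.
        Anim->Montage_Play(GestureMontage, 1.f);
    }
}

void AVirtualHuman::OnSpeechFinished()
{
    if (UAnimInstance* Anim = GetMesh()->GetAnimInstance())
    {
        // Blend back to idle as soon as the character stops talking.
        Anim->Montage_Stop(0.25f, GestureMontage);
    }
}
```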
