Heat Death Of The Metaverse

A real-time 3D motion capture project developed in Unreal Engine 5

Developed by Mark Auman, Tex Barnes, Tyler Fellman, Will Hughes & Conor Steele

Roles on Project: CG Artist, Project Manager, Editor

About Heat Death Of The Metaverse

Heat Death Of The Metaverse is a music video that I developed as part of a group for the KNB227 - CGI Technologies unit at QUT. In the first part of the project, we each developed an abstract MetaHuman and a proposal for an environment for it to perform in. In the second part, we combined our MetaHuman with those of three or four other students to produce a music video in which our abstract MetaHumans performed within a refined version of one of our environments.

Summary Of Work & Contributions

Outcomes From This Project

Through this project, I learned how to use Unreal Engine in a group project setting. I had used Unreal Engine for some previous CGI projects; however, all of those projects had been individual. In this context, I learned how to work collaboratively on an Unreal project and gained experience using Perforce as a version control system. This also allowed me to further develop my project management and production skills.

My Process

In the first phase of the project, which was individual, I developed my abstract MetaHuman, 'The Woodsman', along with an environment to suit it. In the second phase, I was responsible for my specific segment of the music video, the setup and management of our team's version control system, general project management, and editing together the final music video. The full development process was documented in a blog as part of the assessment criteria, which can be viewed via the button below; a curated version of my process follows.

Research, Mood Boards & Concept Art

My initial work on this project centred on research and concept art for the MetaHuman and accompanying environment I wanted to make. I found myself drawn towards a woods-like environment, since Unreal has an extensive library of assets in that aesthetic, and naturally I wanted to create a woods-themed MetaHuman to go with it.

Initial Metahuman & Environment Development

After putting together a concept for my MetaHuman and environment, I went into Unreal to produce my MetaHuman, which I named 'The Woodsman'. I found the relative ease of creating a MetaHuman fascinating, and it is a tool I would like to explore further in future projects.

I also built the environment that would contain my MetaHuman in Unreal, constructing it with Quixel Megascans assets and drawing on my experience from past units.

Learning How To Clean Up & Apply Motion Capture Data

Of course, the focus of the unit was the application of motion capture performances. Through the unit's studio sessions, I learned how to apply the mocap data provided to us onto a mesh, clean it up, and then apply it to my MetaHuman in Unreal. For our specific setup, the pipeline was to record the motion capture performance in a capture volume, process that data in Shogun, clean it up in MotionBuilder, and then export it for use in Unreal.

After getting to grips with the motion capture pipeline, I did further cleanup on some data we captured during a studio session, to be used in my concept render.

Applying that cleaned-up mocap data, I produced the final concept video for my MetaHuman and environment.

Team & Project Management

In the second half of the project, in which we worked in groups to combine our MetaHumans into a cohesive music video performance, I was responsible for managing the team, as well as setting up and ensuring the proper use of our project's version control. I had only used GitHub with Unity on previous projects, so learning to use Perforce with Unreal Engine was a valuable experience.
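For readers unfamiliar with a Perforce setup for Unreal, the broad strokes typically involve ignoring the engine's generated folders and marking Unreal asset files as exclusive-checkout binaries so two people cannot edit the same asset at once. This is a minimal, hypothetical sketch of that kind of configuration (depot paths and file lists are illustrative, not our team's exact setup):

```
# P4IGNORE file (hypothetical): Unreal's generated folders
# should never be submitted to the depot.
Binaries/
DerivedDataCache/
Intermediate/
Saved/
.vs/
*.sln

# Typemap entries (added via `p4 typemap`): .uasset and .umap
# files are binary and cannot be merged, so the +l modifier
# gives them exclusive checkout to prevent conflicting edits.
binary+l //depot/....uasset
binary+l //depot/....umap
```

The exclusive-checkout behaviour is the main reason Perforce pairs well with Unreal: unlike text-based source code, binary assets cannot be diffed and merged, so locking on checkout avoids lost work.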

Further Metahuman, Motion Capture & Environment Work

Another responsibility I had was cleaning up my section of the mocap performance data, which would later be applied to my MetaHuman for my segment of the music video. As this was a completely bespoke piece of motion capture data, I spent considerably more time analysing odd behaviour in the captured performance and cleaning up the data in MotionBuilder. Most of my cleanup focused on foot placement and knee bends, as much of the performer's movement involved those parts of the body.

After we had selected one team member's environment to stage our music video in, I was also responsible for customising it further to fit my particular segment: I added trees and other elements from the concept environment I developed in the first stage of the project to match it more closely to my MetaHuman. I also refined my Woodsman MetaHuman with further particle effects and glowing eyes to give him a more prominent, distinct look.

Screenshots From Final Music Video

Applying the mocap data I cleaned to my MetaHuman within the environment I customised, I created my sequence of the music video; some stills from my segment are shown above. I was also responsible for editing all of the team's sequences together into the final music video shown at the top of this page.