How Mocap Can Bring the Art Form of Movement Into the Metaverse

Just as we’ve seen over the years with ever more advanced video games and virtual reality experiences, the success of future metaverse experiences may rely significantly on the use of motion capture technology to accurately bring human movement into digital worlds.

There are obvious examples from the video game industry, such as Lara Croft-like explorers making death-defying leaps and acrobatic dodges that look realistic down to the smallest gesture; soldiers taking cover and firing weapons with convincing accuracy; and digital athletes who sink three-pointers or fluidly glide on ice like their real-life counterparts. But that’s just scratching the surface when it comes to the metaverse.

The metaverse is seen as a future incarnation of the internet in which users primarily interact in 3D spaces using avatars. We’ll no doubt use it to play games together but also to collaborate in work settings, shop, socialize, and possibly plenty more. 

We’ll also share communal experiences that bring concepts from the real world into the digital space, delivered by real actors and performers puppeteering avatars. These could include a live concert by a mocapped singer, a digital circus, a live painting session by an accomplished artist, or something else altogether.

Some of those types of live events are already taking place on metaverse platforms, even though it’s very early days for the space. And it’s all made possible by the continued evolution and enhancement of motion capture technologies, which can capture every nuance of human movement and ensure that artistic expression is accurately reproduced. 

Motion capture was reportedly first tapped for video games in the early days of 3D gaming, notably for Sega’s arcade smash, Virtua Fighter 2. There was a subtle yet noticeable difference between the maneuvers of the original fighting game and the smoother, more accurate-looking attacks and counters seen in the celebrated sequel. 

Over time, however, the practice has evolved beyond simply capturing and reproducing movement. For a while now, many top-tier video games have embraced full-blooded performance capture, with actors performing entire cinematic sequences while also voicing their digital counterparts. These cutscenes bring a complete, holistic performance into a lavish video game.

Over that timeframe, the technology has improved dramatically. Thanks to motion capture suits and head-mounted cameras, it’s possible to capture an entire performance at once, bringing nuanced facial animation and voice acting into the equation along with body movements.

Continued advances in stability and reliability for motion capture tech and retargeting tools have streamlined the process of bringing such performances to life. Far less time is wasted on set now that dependable hardware and software are in the hands of skilled professionals.

Motion capture technology has become more flexible in tandem. While some studios still prefer optical mocap setups, which require a larger dedicated capture volume but deliver more detailed results, we’ve also seen a rise in inertial motion capture suits that grow more accurate with every generation. They’re ideal for quick iteration and for capturing in outdoor settings.

All told, the improving technology allows us to focus less on harnessing the tools and more on the creative possibilities of what’s on the stage or set. We can now control how the motion looks on the final character to such a degree that we can remap almost anything, even quadrupeds: think of capturing a horseback rider and their horse in real time within Unreal Engine.
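To make that remapping idea a little more concrete, here is a minimal sketch of skeleton retargeting in Python. Everything in it is illustrative: the `Pose` structure, the `BONE_MAP` names, and the `retarget` helper are assumptions for this example rather than any engine’s actual API, and real retargeting tools also compensate for rest-pose and proportion differences between rigs.

```python
# A minimal, hypothetical sketch of skeleton retargeting: copy the mapped
# local joint rotations from a source rig to a target rig and scale the
# root motion to the target's size. Real tools (e.g. Unreal's IK Retargeter)
# additionally handle rest-pose offsets, IK, and differing limb proportions.
from dataclasses import dataclass

Quat = tuple[float, float, float, float]  # (w, x, y, z)

@dataclass
class Pose:
    rotations: dict[str, Quat]                    # local rotation per bone name
    root_translation: tuple[float, float, float]  # world-space root position

# Hypothetical mapping from a human performer's bones to a quadruped rig.
BONE_MAP = {
    "Hips": "Pelvis",
    "Spine": "Spine1",
    "LeftUpLeg": "HindLeg_L",
    "RightUpLeg": "HindLeg_R",
    # ...remaining bones omitted for brevity
}

def retarget(source: Pose, bone_map: dict[str, str], size_ratio: float) -> Pose:
    """Remap mapped bones onto the target rig and rescale root motion."""
    rotations = {
        bone_map[bone]: quat
        for bone, quat in source.rotations.items()
        if bone in bone_map
    }
    x, y, z = source.root_translation
    return Pose(rotations, (x * size_ratio, y * size_ratio, z * size_ratio))
```

The core design choice is that motion lives in a rig-agnostic form (named bones with local rotations), so the same performance can drive a human, a creature, or anything in between just by swapping the map.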

There’s a seamless integration of technology and artistic performance that just wasn’t possible before, and that’s perfect for a growing metaverse that may one day support billions of users all looking for different types of shared experiences. Some may reflect real-world activities, while others will be uniquely built for digital settings. In many cases, it’ll probably be a hybrid that offers the best of both worlds.

We’re already seeing immersive online events that bring the best of real-world performances into the digital space for truly shared, communal experiences. Online concerts, for example, have the potential to connect fans from all over the world as a top singer or band performs live from the studio.

Unlike streaming, however, the finished result won’t just be a flat video for millions to watch. Even with today’s technology, you can transport artists into a virtual world and retarget their movements and performances onto an avatar while they perform live on stage. Fans can control their own avatars in a massive crowd and interact with each other, all amid an array of stylized effects and unique backdrops.
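As a rough illustration of how that live pipeline might hang together, here is a hedged Python sketch of a capture machine broadcasting pose frames to a venue server that drives the performer’s avatar. The endpoint, packet format, and `capture_pose` stand-in are all assumptions for this example; production systems use engine-native streaming protocols such as Unreal’s Live Link rather than a hand-rolled UDP loop.

```python
# A hypothetical sketch of streaming live mocap frames to a virtual venue:
# sample the performer's pose each frame and broadcast it as a small JSON
# packet; each client applies the latest frame to the performer's avatar.
import json
import socket
import time

VENUE_ADDR = ("127.0.0.1", 9870)  # assumed venue-server endpoint

def capture_pose() -> dict:
    """Stand-in for a real mocap SDK call returning joint rotations."""
    return {"Hips": [1.0, 0.0, 0.0, 0.0], "Spine": [1.0, 0.0, 0.0, 0.0]}

def stream_performance(fps: int = 60) -> None:
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    frame = 0
    while True:
        packet = {"frame": frame, "time": time.time(), "pose": capture_pose()}
        sock.sendto(json.dumps(packet).encode("utf-8"), VENUE_ADDR)
        frame += 1
        time.sleep(1.0 / fps)  # pace the stream at the capture rate

if __name__ == "__main__":
    stream_performance()
```

The point of the sketch is the latency budget: because each packet is tiny and self-contained, the venue can always render the most recent frame it has received, which is what makes the performance feel live rather than streamed.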

There are so many other possibilities for bringing skilled performers into the metaverse, such as translating acrobats and gymnasts into a virtual world for a dazzling circus show or bringing a dance troupe into the digital terrain as they perform their routines. All of that can be captured accurately with motion capture technology and real-time game engines.

I imagine we’ll see a lot of experimentation as metaverse platforms take hold and users’ appetite for shared online experiences expands. Users will want experiences that look and feel human and, more importantly, feel like they’re happening in real time.

It might be tempting to think of the metaverse as being a potential escape from the real world, but I don’t think that’s fully the case. For sure, there will be fantastical, surreal, and truly new and unique experiences in the metaverse, but its beauty lies in the wide variety of possibilities that can be explored in such an interactive digital environment.

One of the greatest draws of this shared online space is the ability to have experiences that replicate or spin off from real-world, human-driven activities.

That’s where motion capture will be essential to help turn a digital avatar into something closer to human, imbued with the precise movements, expressions, and voice of a real person.

By making the metaverse feel like an extension of reality, we’ll create a place users can feel comfortable in as they enjoy new experiences with people from all over the globe, and one they can potentially help shape.

About the author

Brett Ineson has close to 20 years’ experience in visual effects and sits on the board of the Motion Capture Society. He has worked in production with industry leaders such as Weta Digital and in technology development with Vicon, Lightstorm Entertainment, and Autodesk. Brett founded Animatrik Film Design in 2004 to specialize in performance capture for film, games, and television. He consistently pushes the boundaries of virtual production through the development and deployment of new solutions and innovations.