Why Real-Time Rendering is the Key to Live Metaverse Experiences

What a difference 25 years makes. Back in the mid-1990s, the flat pixel worlds of 16-bit games gave way to immersive 3D universes that you could freely explore. The polygonal graphics might look awfully rudimentary today, but they marked an enormous shift at the time.

A stark contrast quickly emerged between what a video game engine could produce with real-time, interactive graphics and what pre-rendered cut-scenes and cinematics looked like.

One of the most enduring examples is Square’s smash-hit PlayStation role-playing game, Final Fantasy VII. While the pre-rendered cinematics showcased a sparkling fantasy/sci-fi world with realistic character animation and thrilling story beats, the boxy, angular in-game graphics couldn’t come close to matching them. Even back then, it was a jarring transition.

Over time, however, real-time graphics technology has grown by leaps and bounds. Graphics cards (GPUs) and video game console hardware have steadily pushed the industry forward, while ever more powerful and flexible game engines have allowed creators to grow increasingly ambitious with what they can coax out of that hardware.

Nowadays, it’s common to see video game cinematics that are rendered in real time, in the same game engine as the interactive gameplay moments – and they look even more dazzlingly detailed than the pre-rendered CG cinematics of old. That graphical evolution is what will ultimately lay the foundation for the kind of quality that can make the metaverse take shape around us.

Real-time rendering engines are no longer the exclusive domain of video games. Now, Epic Games’ powerful Unreal Engine – which is used by a wide array of game studios – is also tapped by television shows like The Mandalorian and Obi-Wan Kenobi for virtual production, as well as for feature films, interactive Twitch streams, architectural visualization, mixed reality performances, and more.

The allure of real-time rendering can (at least partly) be attributed to the way it provides interactive playgrounds that can easily be tweaked and customized on the fly. For film and TV, that means changing a backdrop, popping in additional set pieces and characters, and moving the virtual camera around the space without significant delay. In every use case, it eliminates the lengthy rendering waits of traditional post-production – it’s all happening live. Through the currently very popular virtual production workflow, content creators are skipping, or significantly cutting down on, post-production and capturing real-time visual effects in-camera, which lets them visualize near-completed scenes while still on set.

Why is this important for the metaverse? 

The metaverse is envisioned as, effectively, a future version of the internet: online environments that we’ll explore using our own customizable avatars. We’ll play games together in the metaverse, chat, explore new kinds of interactive experiences, and potentially even do our jobs in these 3D spaces.

Live from the metaverse

Where I see the greatest potential for the metaverse is in live, shared experiences, which tap into the power of real-time rendering with the help of motion capture technology. By pairing those two technologies, it will be possible to bring real people’s movements and performances into these 3D spaces to entertain and delight the masses from all over the world.

We’re already seeing the possibilities come to life with virtual concerts, which started in proto-metaverse video game worlds like Fortnite and Roblox. Rapper Travis Scott, for example, hosted an in-game show in Fortnite in which his movements and rapping were translated into the game world, while fellow rapper Lil Nas X did much the same in a Roblox concert.

Both were massive fan-favorite events, not to mention big business: Travis Scott reportedly banked $20 million from his virtual concert, including merchandise sales. That’s more than ten times what he made from a single live gig during his previous real-world tour.

In both of those examples, though, the performer was recorded ahead of time, with the motion capture data retargeted to an avatar and played back in real time. Those events showed the appeal of concerts in video game worlds, but there’s still so much untapped potential.

The next evolution of that – already coming to life with Justin Bieber and other artists – is to perform the concert live and have the mocap data plugged directly into the metaverse performance as it happens. That allows artists to interact with fans in the moment and adds more of a human element to these digital environments.
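To make that distinction concrete, here is a rough sketch in Python – not tied to any particular engine or capture system, with the frame format, queue-based “stream” and function names invented purely for illustration – contrasting playback of a baked take with feeding live capture into the render loop:

```python
import time
from collections import deque

# Illustrative only: a mocap "frame" here is just a dict of joint name -> rotation.
# Real pipelines stream far richer skeleton data from the capture system into the engine.

def prerecorded_stream(frames, fps=60):
    """Play back a baked take: every frame already exists before the show starts."""
    for frame in frames:
        yield frame
        time.sleep(1.0 / fps)

def live_stream(capture_queue, fps=60):
    """Consume frames as the performer generates them; the next frame
    doesn't exist until the performer actually moves."""
    while capture_queue:
        yield capture_queue.popleft()
        time.sleep(1.0 / fps)

def apply_to_avatar(frame):
    """Stand-in for the engine-side retarget step: map the performer's
    skeleton onto the avatar's rig for this tick."""
    for joint, rotation in frame.items():
        pass  # an engine call would set the avatar's bone transform here

# Pre-recorded: deterministic, can be polished and re-shot before release.
take = [{"head": (0.0, float(i), 0.0)} for i in range(3)]
for frame in prerecorded_stream(take):
    apply_to_avatar(frame)

# Live: frames arrive from the capture volume while the show is running,
# so the avatar can react to the audience in the moment.
incoming = deque(take)  # in reality, fed over the network by the mocap system
for frame in live_stream(incoming):
    apply_to_avatar(frame)
```

In the live case there is no finished take to fall back on: the avatar only moves because the performer is moving right now, which is exactly where the sense of a genuinely live event comes from.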

While some artists might find it safer to pre-record a digital concert, real-time rendering preserves the unpredictability and excitement of a live performance, bringing the best of both worlds to the audience. It’s not just a canned recording that’s had all of its edges polished off. In a sense, I believe this will give credibility to the metaverse, providing new kinds of experiences rather than just another Netflix for pre-recorded content.

The metaverse also makes live concert experiences more accessible to everyone. You won’t have to be in a certain city on a certain date to attend a show; performances will be available to anyone with an internet-connected device, opening up potentially vast audiences for shared online experiences.

Concerts are just one type of mixed-reality performance that I believe will thrive in the metaverse. Other possibilities include dance or acrobatic performances that translate the movements of skilled professionals into digital avatars, using an array of visual effects to immerse participants during the interactive experiences.

Even narrative gaming experiences can be transformed in the metaverse, as the technology opens the door to characters controlled in real time by humans. Imagine talking to a character you assume is scripted and driven by A.I., only to discover it’s really a live performer inhabiting that avatar via motion capture. That would be a mind-blowing twist for players.

How advanced has real-time graphics technology become? Just look at The Matrix Awakens, a recent real-time Unreal Engine 5 demo that recreates characters and scenes from the original blockbuster film. Partway through, what looks like a pre-rendered cinematic seamlessly shifts into a playable shootout. It’s all happening in real time.

The Matrix Awakens is a taste of how advanced future gaming and metaverse experiences will quickly become. Its digital humans look and act remarkably like their famed real-life counterparts, thanks to the same motion capture and photogrammetry techniques used to create digital doubles for major Hollywood films. Those digital doubles will be in the metaverse, too.

A quarter-century of technological advancement has all coalesced in this moment to enable new kinds of shared, interactive experiences. And as the metaverse continues to take shape and creative minds start playing around with the tech, I believe that we’ll see even larger leaps forward in the years to come.

 



About the author

Brett Ineson has close to 20 years’ experience in visual effects and sits on the board of the Motion Capture Society. He has worked in production with industry leaders such as Weta Digital, and in technology development with Vicon, Lightstorm Entertainment, and Autodesk. Brett founded Animatrik Film Design in 2004 to specialize in performance capture for film, games, and television. He consistently pushes the boundaries of virtual production through the development and deployment of new solutions and innovations.