How Virtual Production and the Metaverse Are Evolving Together

Virtual production is revolutionizing the film and TV industries as technological advances coalesce, allowing creators to see their actors and physical set pieces within lavish, effects-laden worlds while still on set–without waiting months or years to view a finished result in post.

The Mandalorian, Disney’s smash Star Wars show, has been one of the highest-profile examples to date, with Jon Favreau and his team filming actors and physical set elements in front of an enormous LED soundstage called “The Volume,” which displays dazzling sci-fi worlds. The result is stunning–and the process saves ample time and eliminates the need to travel to distant locations to shoot.

But the beauty of virtual production, its increasing accessibility, and the commoditization of the tech is that it doesn’t have to be limited to massive studios and brands–and you don’t need your own dedicated “Volume” to bring digitally-infused content to life in a seamless fashion. Both the quality of the LED screens and the technical processes surrounding virtual production have taken major leaps forward in recent years.

What does any of this have to do with the metaverse? Quite a bit, actually. The metaverse is another buzzy term, seen as equally revolutionary for the creative spaces–if not more so–and it’s growing in a similar direction as virtual production. That’s because both rely on real-time graphics rendered by video game engines.

Game engines have come a long way over the last couple of decades. Early 3D games like Quake and Super Mario 64 might look rudimentary now, but those chunky games were revolutionary steps along the path that helped turn the gaming industry into a juggernaut.

As the industry evolved, the tools naturally became a lot more powerful. Thankfully, they also became a lot more accessible. Epic Games made it easier and easier to tap into its Unreal Engine creation suite, for example, and it flourished. Now it’s not only used for many of the world’s most popular games but also for film, television, architecture, and other creative work.

It’s the backbone of virtual production now, too–and it will almost certainly be a key element behind the growing metaverse. Technologists describe the metaverse as the next step of the internet–where websites and apps give way to immersive, 3D experiences that we share with others as avatars–and it will potentially underpin online social interaction and even workplace comms.

Backed by super-powerful GPUs, game engines like Unreal Engine can now deliver lifelike worlds that are nearly indistinguishable from the real thing–and they do so in real time, which means you can interact with and manipulate your surroundings. It also means that creators can make changes on the fly during the production of a movie or episode.

That’s a stark contrast from the pre-rendered, time-intensive CG that defines big-budget films from Disney and Pixar, as well as the canned cut-scenes that will be familiar to many longtime gamers. That same fidelity can increasingly be reached in-engine, and it’s transforming the way digitally-infused productions are handled.

With the metaverse seen as an extension of gaming, it makes perfect sense why the increasing power and accessibility of game engines is a boon for the burgeoning space. And excitingly, the virtual spaces created for film and TV could potentially be transported into interactive metaverse experiences–and vice versa.

There’s another key similarity between virtual production and the metaverse: both require fast and accurate simulation of movement via motion capture technology.

Mocap data is vital to virtual production at present. Precision engineering is necessary to track the physical elements in a scene–including the camera, actors, and props–and accurately transport them into the digital backdrops. That’s the only way this increasingly seamless fusion works as well as it does.
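To make that idea of "transporting" tracked elements concrete, here is a minimal, purely illustrative sketch of the kind of coordinate transform involved: taking a tracked point on the physical stage and expressing it relative to the tracked camera, so the digital backdrop can be rendered from the correct viewpoint. The names and the simplified translation-only math are assumptions for illustration–a real pipeline would also apply camera rotation and lens calibration, and would come from a dedicated tracking system rather than hand-written code.

```python
from dataclasses import dataclass


@dataclass
class Vec3:
    """A point in the stage's shared coordinate frame (metres)."""
    x: float
    y: float
    z: float


def to_camera_space(point_stage: Vec3, camera_stage: Vec3) -> Vec3:
    """Express a tracked stage-space point relative to the camera.

    Hypothetical and simplified: only the translation is applied here,
    whereas a production system would also account for the camera's
    orientation and lens parameters every frame, in real time.
    """
    return Vec3(
        point_stage.x - camera_stage.x,
        point_stage.y - camera_stage.y,
        point_stage.z - camera_stage.z,
    )


# A tracked actor marker at (2, 0, 5) on the stage, camera mounted at (0, 1.5, 0):
actor = to_camera_space(Vec3(2.0, 0.0, 5.0), Vec3(0.0, 1.5, 0.0))
print(actor)  # Vec3(x=2.0, y=-1.5, z=5.0)
```

The key point the sketch captures is that every physical element and the camera must live in one shared, precisely calibrated coordinate frame–that calibration is where the "precision engineering" comes in.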

And there are a lot of layers within that. For example, a large-scale feature film may require different capture techniques for gathering data from actors and props across indoor and outdoor environments, along with capturing minute facial movements to transfer onto digital creatures and characters.

The metaverse will be a digital playground for bringing new kinds of shared, communal experiences to life, such as concerts and performances that step out of the physical realm into something that feels like the best of both worlds.

For those needs, motion capture will again be essential for translating the movement and essence of actors and real-world elements into the real-time digital space. Canned animations aren’t going to cut it. Users will rightly expect metaverse experiences to feel as real as possible, and that requires accurate simulation of the real thing in real time.

While a robust, interconnected metaverse may still be years away, we’re already seeing this fusion of motion capture and digital experiences come to life today. Musicians such as pop icon Justin Bieber and the virtual heavy metal band Pentakill have hosted their own virtual performances, with every movement translated live into a digital space where excited fans were able to interact with the performer as well as each other.

There’s plenty more of that on the horizon, too. Rapper Snoop Dogg plans to host his own concerts in the blockchain-based game, The Sandbox, where he owns digital land. Warner Music Group similarly plans to put on performances in that same digital world.

Motion capture can also enable other exciting kinds of virtual performances. Imagine, for example, a mixed-reality circus experience that merges the movements of real-life acrobats with dreamy digital elements, providing an immersive experience for viewers. And then place that into a digital open world accessible to audiences all over the world. That’s just the tip of the iceberg of what the metaverse can enable.

No doubt, we’re in the early days of the metaverse, but these experiences already show how powerful the fusion of technology and interactivity will be for users. The appetite for metaverse content is sure to be immense as users explore the online space in search of new experiences–so streamlining the processes behind creating compelling new content will be essential across the board.

Virtual production is already transforming how creative content is made, and the metaverse is increasingly becoming a consideration for creators. Much of the same technology and techniques can be applied to both realms, empowering creators and delivering joy to people everywhere.

About the author

Brett Ineson has close to 20 years’ experience in visual effects and sits on the board of the Motion Capture Society. He has worked in production with industry leaders such as Weta Digital and in technology development with Vicon, Lightstorm Entertainment, and Autodesk. Brett founded Animatrik Film Design in 2004 to specialize in performance capture for film, games, and television. He consistently pushes the boundaries for Virtual Production as a whole through the development and deployment of new solutions and innovations.