READ MORE: Advances in Virtual Production: Where VFX and Video Games Collide (IBC365)
Combining physical and digital assets in a real-time filmmaking environment, virtual production encompasses everything from motion capture stages with real-time feedback of digital doubles and virtual worlds, IBC365’s Steve Jarratt notes, “to the latest advances in LED staging, with camera moves synced to computer generated sets.”
Jarratt spoke with Steve Griffith, DNEG’s EP of virtual production, and veteran VFX supervisor Kevin Baillie (The Witches, The Walk) to discuss the potential of virtual production.
“In-camera compositing is really the best thing to call it,” says Griffith, pointing out that virtual production is actually a visual effects process and really just another tool in the filmmakers’ arsenal.
Alongside obvious things like cost reduction and minimizing carbon footprints, virtual production can offer a range of benefits, Griffith explains to Jarratt. “The way that we’ve been trying to describe this to filmmakers is that we’re just now additional tools for your production designer, your director, your DP and your visual effects department,” he says.
“So for a DP, we become a lighting tool. For a production designer, we become an extension of the set pieces that they create. And we can, in some ways, compress the visual effects schedule, and provide marketing and editorial with all of that material much further up ahead.”
Read It on Amplify: “Bullet Train:” Virtual Production, New IP, All the Punching
Real-time game engines have been transformational in virtual production, Jarratt writes, “but the current challenge is to have those visuals fully ray-traced, which avoids the need for texture baking while offering the ability to change lighting on the fly.”
Griffith says the implications across virtual production and visual effects for these new technologies are huge, “and everyone’s clamoring to get that pipeline going. It’s hard, because in some ways, it’s still not quite there yet. Organic elements like trees and creatures and crowds and things like that are still a ways off, but we’re so close. We’ve been wanting this real-time solution for so long in our industry, and it’s exciting to be at that stage where we’re testing it out.”
Baillie defines virtual production as “anything that uses real-time technology in a filmmaking process,” he says. “So it’s not just LED walls, it’s not just SimulCam… It’s all of these things. And I think Unreal and Unity, the game engines really improving in terms of quality, have allowed for real-time technology to bridge the gap between [digital and] the live action departments that have over a century of culture and experience and knowledge baked into them.”
The VFX supervisor, who oversaw the visual effects on director Robert Zemeckis’s live-action Pinocchio feature for Disney, believes that quality and accessibility form the two separate arcs of advances in virtual production technology. Accessibility means that VP tools and techniques are no longer limited to big-budget studio productions with enormous headcounts. “Now, the commercially available tools are so good and so robust that you can have a team of one running a virtual production setup,” Baillie says. “You can start lean from a team size perspective, and then ramp up as needs be.”
But the opportunities to improve the quality of the visuals using virtual production techniques could be even more exciting to filmmakers. Baillie points to his recent work on Pinocchio as an example, recounting how the production team shot and assembled almost all of the film inside Unreal Engine before any sets were built or even a single frame of live action footage was shot. Using ray tracing, the filmmakers were able to view shadows and reflections interactively, allowing them to act instantly on that feedback.
“Instead of having that video game-looking [imagery], like we were forced into having before this technology, it’s now like what would happen in the real world,” Baillie says. “And that’s useful from a technical perspective, because if you design a set, and you want the light to interact with it in a certain way, and it looks that way in Unreal, you can say, yeah, it’s going to work more or less that way when I build it for real. So, it’s useful for cinematographers, production designers, and so on.”
As virtual production techniques and technology continue to improve, Baillie anticipates that filmmakers will be able to tweak their edits right up until the soundtrack is finalized, he tells Jarratt.
“I think quality, speed, flexibility — those are all things that, as they improve, you’ll see the world get closer and closer to the point where a lot of filmmaking started: the director and a very small team collaborating on a project. And the filmmaker is getting closer and closer to being able to touch the final pixels on the screen, directly interacting with them. Which, in the world of traditional visual effects for the last 30-40 years, has very much not been the case. It’s been an obscured process, hidden behind a big black curtain of visual effects companies. And I think that is changing and will continue to change quite rapidly.”
Next, Watch This
“From VAD to Volume: The End-to-End ICVFX Pipeline”
Understand the ABCs of Virtual Production with The Virtual Production Glossary
Virtual production techniques and technologies have taken Hollywood by storm, making VP the single hottest topic for everyone from broadcasters, studios, and streamers to independent production and post-production facilities. Get a handle on virtual production techniques and terminology with The Virtual Production Glossary, which was made possible with the support of the Visual Effects Society, the American Society of Cinematographers, Epic Games, and Netflix.
Designed to document the terminology, definitions, and roles actively used in virtual production, The Virtual Production Glossary was written and edited by Noah Kadner, Addison Bath, Michael Keegan, David Morin, Miles Perkins, Ben Schneider, and Sebastian Sylwan, VES, and was created with the input of numerous industry professionals who generously provided their time and expertise, including Ben Grossmann, Casey Schatz, Girish Balakrishnan, Matt Rank, Adam Davis, Brittany Montero, Curtis Clark, ASC, Stephen Rosenbaum, Dane Smith, John Refoua, ACE, Kim Richards, Matt Madden, Michael Goi, ASC, Phil Galler, Rob Legato, ASC, Susan Zwerman, VES, Wyatt Bartel, Fae Corrigan, Vlad Bina, Steve May, Haarm-Pieter Duiker, Fernando Rabelo, Rhiannon Murphy, and Heather McCann.
In addition to providing an alphabetical list of terminology, The Virtual Production Glossary is searchable by category, including Cinematography, Editorial, Volume Operations, and VFX.
The database of virtual production terms contained in the glossary is made available under the Creative Commons CC-BY-4.0 license. You can download a handy PDF version of The Virtual Production Glossary HERE, and guidelines for submission of entries and additional feedback can be viewed HERE.
A Brief Voyage Through the History of Virtual Production
While virtual production is definitely having a “moment” in Hollywood and beyond, VP technologies and techniques have by no means just appeared overnight, cinematographer Neil Oseman observes in a recent blog post. The use of LED walls and LED volumes — a major component of virtual production — can be traced directly back to the front- and rear-projection techniques common throughout much of the 20th century, he notes.
Oseman takes readers on a trip through the history of virtual production from its roots in mid-20th century films like North by Northwest to cutting-edge shows like Disney’s streaming hit, The Mandalorian. Along the way, he revisits the “LED Box” director of photography Emmanuel Lubezki conceived for 2013’s VFX Academy Award-winner Gravity, the hybrid green screen/LED screen setups used to capture driving sequences for Netflix’s House of Cards, and the high-resolution projectors employed by DP Claudio Miranda on the 2013 sci-fi feature Oblivion.
Oseman also includes films like Deepwater Horizon (2016), which employed a 42×24-foot video wall comprising more than 250 LED panels, Korean zombie feature Train to Busan (2016), Murder on the Orient Express (2017), and Rogue One: A Star Wars Story (2016), as well as The Jungle Book (2016) and The Lion King (2019), before touching on more recent productions like 2020’s The Midnight Sky, 2022’s The Batman and Paramount+ series Star Trek: Strange New Worlds.