It has taken three decades for the practice of using real-time computer graphics engines to create cinema to come of age. Now some are hailing it as the future of film production.
The technique originated as machinima, with CG artists animating short films inside video games like Quake and Doom; game engines have since rocketed into professional production. First used to render a few scenes in films like Rogue One, real-time game engine renderers are now integral to the virtual production of whole films and episodic series.
Virtual production stages replace green-screen chroma-keying with LED backdrops, which give actors a fighting chance to deliver better performances because they can actually see their environment.
Other benefits include the ability for cinematographers to light actors and the set using the LED wall itself as a light source. This delivers a fidelity and dynamism to the image that previous artificial sets lacked.
Disney’s The Mandalorian was among the first projects to harness the power of this new game engine technology and successfully apply it to the world of filmmaking in what VFX partner ILM called The Volume.
Speaking to Deadline, series 1 DP Greig Fraser ACS ASC said, “We have full control of the light, we’re not spending all that time trying to cut the sun, or trying to diffuse it, or trying to add negative fill. On the practical side, we’re able to move faster. But even more importantly, on an emotional side, we’re able to build the world that we’re wanting to in advance, knowing that we’re going to have an extended period of that particular controlled light.”
READ MORE: ‘The Mandalorian’ DP On Developing Disney+ Drama’s LED Volume (Deadline)
Now Epic Games has released the latest version of its game engine, Unreal Engine. In a series of video testimonials reported by ProVideo Coalition, filmmakers including the Russo brothers (Avengers: Endgame) talk up the huge benefits as they see them.
Anthony Russo said: “One of the things that excites us most is the fact that we can do in-camera choreography where all the elements of the frame are actually in concert with one another and organically working on one another to create a more visceral experience.”
READ MORE: Epic Games discusses their “magic” tools for Virtual Production (ProVideo Coalition)
Advances in Unreal 4.27 seem chiefly designed to speed up the whole process and provide tighter synchronization between all the moving parts of the virtual soundstage, from camera tracking to renderer to LED playback.
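The principle behind that synchronization can be shown in a short sketch (a conceptual illustration, not Epic's actual implementation): if the camera tracker, the renderer, and the LED processor all derive the frame they act on from the same shared house timecode at a fixed project frame rate, they stay phase-locked.

```python
from fractions import Fraction

# Assumed project frame rate; real stages also use 23.976, 25, etc.
FRAME_RATE = Fraction(24, 1)

def frame_for_timecode(seconds):
    """Map a shared timecode (seconds since the sync epoch) to a frame index.

    Any two devices sampling the same timecode compute the same frame
    number, which is the essence of what genlock/timecode sync provides.
    """
    return int(seconds * FRAME_RATE)

# Half a second into the take, every device agrees it is on frame 12.
assert frame_for_timecode(Fraction(1, 2)) == 12
```

Using exact rational arithmetic here avoids the floating-point drift that would otherwise let two devices round the same timecode to different frames.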
In another video, Kim Libreri, Epic Games’ CTO, says: “What’s driving us right now is the ability for filmmakers to get what they want in the moment and not be beholden to some external process that takes away the control and their creativity whilst they’re actually making their movie.”
For example, one of the new tools is GPU Lightmass (GPULM), a light-baking solution that precomputes complex light interactions and stores that data in lightmap textures applied to scene geometry.
The tool runs entirely on the GPU, “which has been a game changer,” explains Jonathan Litt, senior product manager at Epic Games. “Where previously turnaround times might have been a matter of days or even longer, we’re aiming for turnaround times of minutes. We want you to be able to run these operations on set with interactive feedback as you’re changing the lighting.”
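A toy CPU version of that precomputation makes the idea concrete. This is a minimal sketch, not Epic's GPULM code: for each lightmap texel on a flat surface patch, it sums the diffuse (Lambertian) contribution of every point light and stores the result, which is exactly the kind of per-texel work the GPU parallelizes.

```python
import math

def bake_lightmap(width, height, lights, normal=(0.0, 0.0, 1.0)):
    """Bake diffuse irradiance for a flat 1x1 surface patch into a lightmap.

    Each texel stores the summed Lambertian (N.L) contribution of every
    point light, attenuated by inverse-square falloff -- a simplified
    version of the precomputation a light baker performs per texel.
    lights is a list of ((x, y, z), intensity) tuples.
    """
    lightmap = [[0.0] * width for _ in range(height)]
    for y in range(height):
        for x in range(width):
            # World-space position of this texel's centre on the unit patch.
            px, py = (x + 0.5) / width, (y + 0.5) / height
            total = 0.0
            for (lx, ly, lz), intensity in lights:
                dx, dy, dz = lx - px, ly - py, lz
                dist = math.sqrt(dx * dx + dy * dy + dz * dz)
                # Lambert's cosine term against the surface normal.
                n_dot_l = max(0.0, (dx * normal[0] + dy * normal[1]
                                    + dz * normal[2]) / dist)
                # Inverse-square distance falloff.
                total += intensity * n_dot_l / (dist * dist)
            lightmap[y][x] = total
    return lightmap

# One light hovering above the centre of the patch: texels near the
# centre receive more light than those at the corners.
lm = bake_lightmap(8, 8, [((0.5, 0.5, 1.0), 1.0)])
```

Because every texel is independent, the loop maps trivially onto GPU threads, which is why moving this work to the GPU collapses turnaround times the way Litt describes.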
Color consistency through the virtual pipeline has also been addressed with an OCIO color management tool, and there’s a new tool called Switchboard that ensures every device driving the LED wall remains in sync.
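The point of a shared color-management configuration like OCIO is that every device in the chain applies identical, invertible transforms, so an image looks the same on the wall, in camera, and in the renderer. A minimal sketch using the standard sRGB transfer curve illustrates the idea (the actual OCIO API is not shown here):

```python
def linear_to_srgb(c):
    """Encode a linear-light value with the piecewise sRGB transfer curve.

    A color-management layer guarantees every device applies the *same*
    such transform; this is the curve from the sRGB specification.
    """
    if c <= 0.0031308:
        return 12.92 * c
    return 1.055 * (c ** (1.0 / 2.4)) - 0.055

def srgb_to_linear(c):
    """Inverse transform: decode an sRGB-encoded value back to linear light."""
    if c <= 0.04045:
        return c / 12.92
    return ((c + 0.055) / 1.055) ** 2.4

# Round-tripping must not drift, or the wall, camera, and renderer
# would gradually disagree about what a given pixel value means.
x = 0.5
assert abs(srgb_to_linear(linear_to_srgb(x)) - x) < 1e-9
```

In a real pipeline these curves live in a shared OCIO config file rather than in code, so a single edit propagates to every device at once.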
You’d be forgiven for thinking that Epic Games and Unreal dominate this entire field. Hurlbut Academy highlights that not only is HBO’s Westworld using “Unreal Engine and ILM’s StageCraft technology for its third season,” but that Epic Games is investing in 45 movie and short film projects to be developed with the Unreal Engine.
It will begin with an animated feature film about Gilgamesh, made in conjunction with animation studios Hook Up, DuermeVela, and FilmSharks.
READ MORE: Are Game Engines the Future of Filmmaking? (Hurlbut Academy)
What neither Hurlbut nor ProVideo Coalition points out is that ILM has developed a custom “cinematic render engine” called Helios, which it used to create season 2 of The Mandalorian and many of its other projects.
“Engineered from the ground up with film and television production in mind, Helios offers incredible performance, high fidelity real-time ray tracing, the ability to rip through scenes of unparalleled complexity, all while leveraging ILM’s unrivaled color science, and was designed from the start to work seamlessly with ILM StageCraft,” ILM states. “The purpose-built, production-hardened platform allows filmmakers to explore new ideas, communicate concepts, and execute shots in a collaborative and flexible production environment.”
READ MORE: Stagecraft (ILM)
As analyst Matthew Ball points out, Disney (which owns ILM, of course) is doing this to regain control over the intellectual property it can build and own in the Metaverse — a space in which Epic Games is both partner and keen rival (this is also a reason why Epic is investing in original content).
Arguably far more interesting than the technology itself, and something only glanced at in Hurlbut’s article, is what becomes of content when game and film storytelling forms merge. The trajectory seems inevitable and is certainly inspiring, but we are still only skirting its outer edges.
READ MORE: Virtual Platforms and the Metaverse (Matthew Ball)
Here’s Hurlbut: “With the emergence of the Unreal Engine in filmmaking, the line between gaming and movies continues to become increasingly obscured. It has unveiled a new interactive way to interface with entertainment.”
The Russo brothers might be the ones to watch if interactive content is to take a giant leap forward. Through their branded entertainment company Bullitt, they have explored several VR/AR concepts and are also consultants to automotive XR experience developer Holoride.
READ MORE: Holoride taps Russo Brothers and AR/VR pioneers for strategy board (Venture Beat)
Speaking about game engines and virtual production, director Joe Russo says it feels “like we’re moving from filmmaking 1.0, jumping to filmmaking 5.0 — and I don’t see any other path forward that is nearly as exciting, compelling, or freeing for artists as this path.”