READ MORE: Virtual Production 101 (MediaSilo)
Seamlessly combining physical and virtual elements, virtual production comprises a range of tools and techniques. LED walls synced with cutting-edge real-time graphics software take up one part of the spectrum, while other methods go back nearly as far as cinema itself.
With VP becoming more prevalent than ever, there’s no excuse for not learning the basics. Kelsey Opel, on MediaSilo’s blog, has compiled an excellent primer on virtual production techniques, including how we can expect these technologies to shape the future of filmmaking.
“The innovative advancements in virtual production have led to more seamless collaboration among different departments,” she notes. “Crews can effectively carry out the creative vision from pre-production all the way to completion.”
As an example, she cites the FX series Snowfall, which began using LED walls during production of Season 5. Series star Damson Idris, who also served as producer on Season 5, employed an LED wall to create real-time backdrops of Los Angeles.
“The show has saved up to $49,000 an episode by reducing shooting time, transportation between locations, and crew quantity. Shooting on a virtual stage can also reduce the production’s carbon footprint.”
One of the most central components of virtual production is pre-visualization, or previs. “As part of the pre-production process, creative teams implement storyboards and digital software to plan the design of animated characters and virtual locations. Any complex scenes or intricate camera movements can also be blocked out before shooting even begins.”
During the previs stage, productions can also perform virtual set scouting, which employs virtual reality technology for exploring digital environments. Wearing VR headsets or using customized “meta-human” avatars, crew members can virtually “walk” through locations to plan shots, lighting, and more.
Virtual scouting is especially useful for crews tasked with building interior sets, says Opel, as well as backlot sets or even full-scale virtual environments developed to serve as digital assets while filming. These digital assets can also be seamlessly altered throughout the production process, making previsualization a vital tool for virtual production.
The production team for HBO’s Game of Thrones partnered with visualization studio The Third Floor on virtual set scouting for the hit show’s eighth season. TTF generated virtual copies of the sets for the art department to survey before beginning physical construction:
Pre-visualization allows filmmakers to design their vision for their projects, but post-visualization, or postvis, allows creative teams to actually carry out their plans to completion, even while the camera is still rolling.
“Like previs, postvis is meant as a guide to represent the creative team’s vision on set. Once the shot is approved, the digital assets are handed over to the editorial team for final animation and compositing,” Opel says.
“Cameras can be calibrated to sync with digital assets to give accurate perspectives in the shot. The virtual set, or animated characters, can be observed on monitors. This advanced technology aids the cinematographer in accurately setting the frame. The director can also give feedback in association with their vision.
“VR trackers help align the camera with the virtual world so that when the camera moves, the digital elements move as well.”
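The core idea behind that camera/tracker alignment can be sketched in a few lines of matrix math. Below is a minimal, illustrative example (not any vendor’s actual API): a tracker mounted on the physical camera reports its pose each frame, a fixed calibration offset maps the tracker mount to the camera’s optical center, and composing the two gives the pose the virtual camera should adopt so digital elements stay locked in place as the camera moves. The specific numbers and the yaw-only rotation are simplifying assumptions.

```python
import numpy as np

def pose_matrix(position, yaw_degrees):
    """Build a 4x4 world-space pose from a position and a yaw angle.
    (Real trackers report full 6-DoF poses; yaw-only keeps the sketch short.)"""
    t = np.radians(yaw_degrees)
    m = np.eye(4)
    m[:3, :3] = [[np.cos(t), 0, np.sin(t)],
                 [0,         1, 0        ],
                 [-np.sin(t), 0, np.cos(t)]]
    m[:3, 3] = position
    return m

# Pose of the tracker puck, as reported each frame by the tracking system
# (values here are made up for illustration).
tracker_world = pose_matrix([2.0, 1.5, 0.0], yaw_degrees=90)

# Fixed offset from the tracker mount to the camera's optical center,
# measured once during calibration (also made up).
tracker_to_camera = pose_matrix([0.0, -0.1, 0.05], yaw_degrees=0)

# The virtual camera inherits the physical camera's pose every frame,
# so digital set elements hold their position as the camera moves.
camera_world = tracker_world @ tracker_to_camera
```

In a real pipeline (e.g., Vive Mars feeding Unreal Engine), the engine performs this composition internally at frame rate; the sketch just shows why a rigid calibration offset is all that’s needed once the tracker is solidly mounted.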
Watch Film Riot’s Ryan Connolly demonstrate how he used Unreal Engine coupled with the Vive Mars CamTrack system to sync his physical camera with his virtual camera in order to view digital assets on-screen while filming live-action sequences, which helped him successfully execute camera movements while remaining immersed in the virtual environment:
“From imaginary characters such as Gollum in The Lord of the Rings: The Two Towers (2002) and the apes in the revitalized Rise of the Planet of the Apes (2011), motion capture has inspired filmmakers for decades,” Opel writes.
Giving filmmakers the ability to bring fantastical animated creatures and characters into live-action environments, motion capture technology outfits performers with a motion capture suit (or mo-cap suit), a wearable device that records the body movements of the user. This data is combined with digital character assets so the production team “can view the animated character on the monitors during filming. After wrap, the assets are sent to post-production to finalize the character’s animation.”
Live projection just might be the most cost-effective VP solution for independent filmmakers, and can be utilized to add color and texture to shots, or to give actors an eye-line reference.
There are two types of live projection within virtual production, as Opel explains: front projection, which reflects light off the screen, and rear projection, which diffuses light from behind the screen.
While live projection offers filmmakers more creative options and can provide more control on set, “it’s important to ensure that the lighting matches both the projected background and the live-action filming. The closer those elements match, the more realistic your shot will look. When set up correctly, live projection can look practical and add simplicity to your project.”
In the video below, Indy Mogul visits Charles Haine, a professor of cinematography at the Feirstein Graduate School of Cinema and a writer at NoFilmSchool, to learn how filmmakers can use the same projection techniques employed for Damien Chazelle’s First Man (2018), Alfonso Cuarón’s multi-Oscar-winning Gravity (2013) and Joseph Kosinski’s Oblivion (2013) using an $80 projector:
Green screen technology, or chroma key compositing, is a visual effects and post-production technique for layering two images or video streams together based on color hues. As the most traditional form of virtual production, chroma keying has its roots in double exposure techniques, which were used to introduce elements into a scene not present in the initial exposure. In 1903, Edwin S. Porter’s The Great Train Robbery famously used double exposure to add background scenes to windows that were black when filmed on set.
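The mechanics of “layering two images based on color hues” can be shown in a tiny numpy sketch. This is a deliberately crude hard key — production keyers work in other color spaces, generate soft mattes, and suppress green spill — but it captures the core operation: any foreground pixel close enough to the key color is replaced by the corresponding background pixel. The tolerance value and test frames are arbitrary.

```python
import numpy as np

def chroma_key(foreground, background, key_color=(0, 255, 0), tolerance=60):
    """Replace pixels near the key color with the background plate.
    Real keyers produce soft mattes and handle spill; this is a hard key."""
    distance = np.linalg.norm(
        foreground.astype(float) - np.array(key_color, dtype=float), axis=-1)
    mask = distance < tolerance          # True where the green screen shows
    composite = foreground.copy()
    composite[mask] = background[mask]   # pull the background through
    return composite

# A 2x2 test frame: top row is green screen, bottom row is the "subject."
fg = np.array([[[0, 255, 0], [0, 250, 10]],
               [[200, 50, 50], [180, 60, 40]]], dtype=np.uint8)
bg = np.full((2, 2, 3), 90, dtype=np.uint8)  # flat grey background plate

result = chroma_key(fg, bg)
```

Run on the test frame, the green top row is swapped for the background while the non-green bottom row survives untouched — the same per-pixel decision a broadcast weather-report keyer makes in real time.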
In modern filmmaking, green screen technology made a major leap forward with Star Wars: Episode V – The Empire Strikes Back, when VFX supervisor Richard Edlund created a quad optical printer able to quickly and cheaply interweave images from multiple reels.
“Now, the crew can use virtual production software on set to observe the digital assets on the monitor. News stations replace green screens with live weather reports while on air. Creative teams can view the imaginary world on the monitor during filming,” says Opel.
“Virtual production has taken the guesswork out of green screen technology with more accuracy in camera movements and realistic elements, saving your production time and money.”
Opel calls LED walls “the most advanced form of virtual production,” pointing to The Mandalorian’s use of this groundbreaking technology to bring the Star Wars spinoff’s visual effects to the next level.
The use of LED walls and LED volumes can be traced back to the front- and rear-projection techniques common in filmmaking throughout much of the 20th century in classic films such as Alfred Hitchcock’s 1959 spy thriller North by Northwest.
For production of The Mandalorian, ILM unleashed StageCraft, its end-to-end solution supporting all aspects of virtual production. Central to ILM’s StageCraft is The Volume, a 360-degree customized dome environment composed of LED panels synced with a real-time graphics engine.
While certainly the most expensive of virtual production solutions, LED walls solve many of the problems inherent in green screen techniques, Opel observes.
“An LED wall also creates realistic lighting and reflections that match your digital assets, while green screens can cast shadows and spill over additional green light. A green screen requires more time in post-production, while there is a quicker turnaround time with an LED wall.”
However, she notes, “if your production involves explosives or dangerous special effects, a green screen would be the better option to avoid damaging an expensive LED wall.”
A Brief Voyage Through the History of Virtual Production
While virtual production is definitely having a “moment” in Hollywood and beyond, VP technologies and techniques have by no means just appeared overnight, cinematographer Neil Oseman observes in a recent blog post. The use of LED walls and LED volumes — a major component of virtual production — can be traced directly back to the front- and rear-projection techniques common throughout much of the 20th century, he notes.
Oseman takes readers on a trip through the history of virtual production from its roots in mid-20th century films like North by Northwest to cutting-edge shows like Disney’s streaming hit, The Mandalorian. Along the way, he revisits the “LED Box” director of photography Emmanuel Lubezki conceived for 2013’s VFX Academy Award-winner Gravity, the hybrid green screen/LED screen setups used to capture driving sequences for Netflix’s House of Cards, and the high-resolution projectors employed by DP Claudio Miranda on the 2013 sci-fi feature Oblivion. Oseman also includes films like Deepwater Horizon (2016), which employed a 42×24-foot video wall comprising more than 250 LED panels, Korean zombie feature Train to Busan (2016), Murder on the Orient Express (2017), and Rogue One: A Star Wars Story (2016), as well as The Jungle Book (2016) and The Lion King (2018), before touching on more recent productions like 2020’s The Midnight Sky, 2022’s The Batman and Paramount+ series Star Trek: Strange New Worlds.