Forty years ago, Tron transferred the visuals of a computer game onto film. Today, it's games that are driving the technical and creative innovation in how we make movies.
The video game business overtook Hollywood's some years ago. Cinema may be a $100 billion global industry, but video games raked in $180 billion last year.
READ MORE: Videogames are a bigger industry than movies and North American sports combined, thanks to the pandemic (MarketWatch)
“Perhaps the most important point is that the game industry can massively outspend the film industry when it comes to developing new storytelling tools,” writes Bryant Frazer at the Frame.io Insider blog. “And that opens up a huge opportunity for them to grow and adapt.”
Even directors of the caliber of the Russo brothers (Avengers: Infinity War) are wowed by video game technology and its potential for next-level content creation.
“It feels like we’re moving from filmmaking 1.0, jumping to filmmaking 5.0 — and I don’t see any other path forward that is nearly as exciting, compelling, or freeing for artists as this path,” Joe Russo recently said.
Jumping off from the key points in Frazer’s article, here’s how video games are changing the movie biz from production to content.
Game Engines Are the New Soundstage
Game engines are being routinely used to render background environments in real time for display on the LED walls of studio volumes. It’s a process that allows scenes to have incredible CG environments with almost no post-production required.
Game engine developers including Notch, Unity and Epic are actively encouraging Hollywood’s use of their tech by adding new features specifically designed for filmmakers.
For example, the new version of Unreal Engine includes a Virtual Camera plug-in that lets you use an iPad Pro as the viewfinder for an imaginary camera.
READ MORE: “Filmmaking 1.0 to Filmmaking 5.0” — Virtual Production Technology Is Ready (NAB Amplify)
The technology is being applied to deliver final shots in-camera, without the resource and time cost of traditional VFX post-production tools. For instance, when Amazon approached production company Aggressive to design the set and show package for the Prime Rewind: Inside The Boys companion TV series, the company spearheaded a virtual-set technique that unified the look of the entire project. Aggressive calls it “XR SetMapping.” The team was able to track multiple live cameras against virtual 3D backdrops, then add AR elements in real time.
READ MORE: Amazon ‘Inside The Boys’ Realtime SetMapping (Aggressive)
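Strip away the branding and the core mechanism behind all of these systems is the same: the tracked position and orientation of the physical camera drive a matching virtual camera, so the CG backdrop is re-rendered from the correct perspective on every frame. Here's a rough, illustrative NumPy sketch (not Aggressive's actual pipeline) of building a virtual camera's view matrix from a tracked position and aim point:

```python
import numpy as np

def look_at_view(position, target, up=(0.0, 0.0, 1.0)):
    """Build a 4x4 view matrix for a virtual camera from a tracked position and aim point."""
    pos, tgt, up = (np.asarray(v, dtype=float) for v in (position, target, up))
    f = tgt - pos
    f /= np.linalg.norm(f)                # forward axis, toward the subject
    r = np.cross(f, up)
    r /= np.linalg.norm(r)                # right axis
    u = np.cross(r, f)                    # corrected up axis
    view = np.eye(4)
    view[0, :3], view[1, :3], view[2, :3] = r, u, -f
    view[:3, 3] = -view[:3, :3] @ pos     # translate the world into camera space
    return view

# Hypothetical tracking sample: camera two metres up, looking at a point on the set.
print(look_at_view(position=(0.0, -4.0, 2.0), target=(0.0, 0.0, 1.5)))
```

Real systems do this inside the game engine with live tracking data, lens metadata and genlock, but the underlying transform is the same idea.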
According to Frazer, such technology is spreading like wildfire through scripted programs. Among them is ABC’s Station 19, an action-rescue show lensed by Daryn Okada, ASC, and the upcoming Taika Waititi-helmed comedy Our Flag Means Death.
READ MORE: On The Walls: Virtual Production for Series Shooting (American Cinematographer)
Adding Depth
For many productions, simply shooting beautiful footage is no longer enough. Modern VFX teams need more (and better) information to mesh their digital creations into the real-world scene that was captured by the camera.
This can be achieved through volumetric capture and photogrammetry, techniques that use depth-sensing cameras or camera arrays to reconstruct 3D models of an object or scene. Those models can then be used in a virtual environment or in post-production.
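To make that concrete, here's a minimal sketch, assuming a simple pinhole camera model and NumPy, of how a single depth image is back-projected into a 3D point cloud, the raw material from which those models are built. The intrinsics below are placeholder numbers, not any particular sensor's calibration.

```python
import numpy as np

def depth_to_point_cloud(depth_m, fx, fy, cx, cy):
    """Back-project a depth image (in metres) into camera-space 3D points."""
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth_m
    x = (u - cx) * z / fx                 # pinhole model: X = (u - cx) * Z / fx
    y = (v - cy) * z / fy                 #                Y = (v - cy) * Z / fy
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]       # drop pixels with no valid depth reading

# Illustrative values only; real intrinsics come from the sensor's factory calibration.
fake_depth = np.full((576, 640), 1.5)     # a flat surface 1.5 m from the camera
cloud = depth_to_point_cloud(fake_depth, fx=505.0, fy=505.0, cx=320.0, cy=288.0)
print(cloud.shape)                        # (368640, 3)
```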
Writer-director Neill Blomkamp (District 9, Elysium) has embraced volumetric capture on a large scale. His new film Demonic includes more than 15 minutes of volumetric footage.
As described by Frazer, for scenes where the film’s main character explores a simulation of the brain of her comatose mother, actors performed inside a 260-camera volumetric rig. Those volumetric captures were composited into 3D environments using the Unity rendering engine and a new, patent-pending technology code-named Project Inplay, which is designed for real-time playback, rendering, and even dynamic relighting of large volumetric point clouds.
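Project Inplay itself is proprietary and unpublished, but the geometric principle behind any multi-camera volumetric rig is straightforward: each camera's capture lives in its own coordinate frame, and a per-camera calibration matrix moves it into a shared world space where the captures merge into one cloud. A hedged sketch of that step (the matrices below are toy values, not real rig calibration):

```python
import numpy as np

def fuse_point_clouds(clouds, cam_to_world_mats):
    """Merge per-camera point clouds into a single world-space cloud.

    clouds            -- list of (N_i, 3) arrays, one per camera, in camera space
    cam_to_world_mats -- list of 4x4 calibration matrices from rig calibration
    """
    fused = []
    for pts, m in zip(clouds, cam_to_world_mats):
        homogeneous = np.hstack([pts, np.ones((len(pts), 1))])   # (N, 4) homogeneous points
        fused.append((homogeneous @ m.T)[:, :3])                 # transform into world space
    return np.vstack(fused)

# Two toy cameras: one at the world origin, one shifted a metre along X.
cloud_a = np.random.rand(1000, 3)
cloud_b = np.random.rand(1000, 3)
shift_x = np.eye(4)
shift_x[0, 3] = 1.0
world = fuse_point_clouds([cloud_a, cloud_b], [np.eye(4), shift_x])
print(world.shape)   # (2000, 3)
```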
Volumetric capture and photogrammetry are not limited to exotic, enterprise-level workflows. Productions of every scale can already start taking advantage of these new tools. Scatter sells Depthkit software that works with depth cameras like Microsoft’s Azure Kinect and Intel’s RealSense. Apple even has a new API called Object Capture that will allow developers to turn any Mac or iPhone into a tool for photogrammetric capture.
This leads Frazer to suggest that, as the power and affordability of GPUs continue to grow, these developments “will have a profound impact on how we tackle production and post-production.”
Eventually, volumetric capture, light fields or computational cinematography will lead to true autostereoscopic 3D or holographic content, with implications for theater-like live performances by avatars. We’ll see.
Previz, Techviz, Postviz and More
Virtual production is not limited to LED stages and complex volumetric capture rigs. The term, and the technology, encompasses everything from previz to postviz. There are tools for this derived from the video game world that can work for filmmakers on any budget.
“Filmmakers are looking at these toolsets and thinking, ‘I used to need a team to do that, but now I can do my shot blocking in the game engine,’ ” says Jim Geduldick, SVP of Virtual Production at Dimension North America, a branch of London-based Dimension Studio.
Cine Tracer is a $90 real-time cinematography simulator. The app uses Unreal Engine to help you work out scene lighting, shot blocking, and storyboards all inside what is essentially a video game.
“These sorts of tools will be the bridge for many into virtual production,” Frazer says. “Workflows will evolve to include these kinds of software tools, and then as more affordable LED stages open up in major shooting markets, we’ll start to see turnkey services being offered to filmmakers.”
Another example: Ghostwheel’s Previs Pro storyboarding app for iPhone & iPad creates storyboards from virtual cameras, lights, characters and props in 3D environments. It even has an Augmented Reality mode to help you visualize your scene inside a real space.
Matt Workman of Cine Tracer adds, “If you’re on a small team and you want to make a film using these technologies, you can do that without ever setting foot on an LED volume. Use it to do all of your VFX, blocking, storyboarding and previs, then go off and shoot traditionally.”
Kickstarting the boom in game engine filmmaking was Epic Games’ decision to open up its technology in 2015 by making Unreal source code available on GitHub. As with Unity, and unlike traditional filmmaking gear, a core technology of current and future content production is essentially free for anyone to use.
Joystick Cameras and Digital Puppeteers
When VFX legend Rob Legato assisted DP Caleb Deschanel in making The Lion King as a photoreal animated production, he helped the crew use traditional camera equipment to set up and execute shots in virtual reality, just as those shots would be achieved on location.
“Caleb is a fabulous artist but he has no experience of digital translation so my job was to navigate the mechanics of this for him,” Legato explained.
READ MORE: Behind the Scenes: The Lion King (IBC 365)
Essentially that meant providing an interface for Deschanel between the virtual world and the tools of conventional filmmaking in such a way that the DP could call and operate a shot just like any other movie.
Camera moves were choreographed using modified camera gear — cranes, dollies, Steadicam (even a virtual helicopter, operated by director Jon Favreau) — to allow the filmmakers to “touch” their equipment with the motion tracked by sensors on the stage ceiling and simulated directly within the virtual world.
“Instead of designing a camera move as you would in previs on a computer, we lay dolly track down in the virtual environment,” says Legato. “If I want to dolly track from this rock to that tree the dolly has real grip and inertia and a pan and tilt wheel which is sending data back to the virtual environment. It’s not a facsimile. In that way you retain the imperfections, the accidents, the little idiosyncrasies that make human creative choices but which would never occur to you if you made it perfectly in digital.”
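In practice, “sending data back to the virtual environment” means converting the encoder counts coming off the physical wheels into the orientation of the virtual camera, frame after frame. A simplified sketch of that conversion (the encoder resolution is a made-up figure, and real rigs add damping, lens data and more):

```python
import numpy as np

TICKS_PER_REVOLUTION = 4096   # hypothetical encoder resolution for the pan/tilt wheels

def wheels_to_rotation(pan_ticks, tilt_ticks):
    """Turn raw pan/tilt wheel encoder counts into a 3x3 camera rotation matrix (Z-up)."""
    pan = 2.0 * np.pi * pan_ticks / TICKS_PER_REVOLUTION    # rotation about the vertical (Z) axis
    tilt = 2.0 * np.pi * tilt_ticks / TICKS_PER_REVOLUTION  # rotation about the camera's right (X) axis
    r_pan = np.array([[np.cos(pan), -np.sin(pan), 0.0],
                      [np.sin(pan),  np.cos(pan), 0.0],
                      [0.0,          0.0,         1.0]])
    r_tilt = np.array([[1.0, 0.0,           0.0],
                       [0.0, np.cos(tilt), -np.sin(tilt)],
                       [0.0, np.sin(tilt),  np.cos(tilt)]])
    return r_pan @ r_tilt   # pan first, then tilt, as on a physical fluid head

# A quarter turn of pan and a slight tilt.
print(wheels_to_rotation(pan_ticks=1024, tilt_ticks=100))
```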
Since video game developers have spent decades perfecting the design of game controllers (the human-machine interface), it makes sense that similar designs are being applied to filmmaking with game software.
Some enterprising video editors have even reprogrammed control pads aimed at professional gamers to work with post-production software, by mapping certain keyboard shortcuts to buttons, triggers, and dials, reports Frazer.
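The trick is easy to reproduce: read the pad's button events with one library and emit the matching keyboard shortcut with another. Here's a rough Python sketch using pygame and pynput; the button-to-shortcut mapping is invented, and a real setup would map whatever shortcuts its editing software expects:

```python
import pygame
from pynput.keyboard import Controller

# Hypothetical mapping from pad button index to an editing shortcut
# (J/K/L shuttle keys plus "i" for mark-in, as one possible layout).
BUTTON_TO_KEY = {0: "k", 1: "l", 2: "j", 3: "i"}

keyboard = Controller()
pygame.init()
pygame.joystick.init()
pad = pygame.joystick.Joystick(0)     # first connected controller
pad.init()

while True:
    for event in pygame.event.get():
        if event.type == pygame.JOYBUTTONDOWN and event.button in BUTTON_TO_KEY:
            key = BUTTON_TO_KEY[event.button]
            keyboard.press(key)       # send the shortcut to whichever app has focus
            keyboard.release(key)
    pygame.time.wait(10)              # don't spin the CPU while idle
```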
Taking this a stage further, consider the video game controller as a puppeteering device, one that allows you to make on-screen characters interact with the live action on set in real time.
The Jim Henson Company has done just this for the Disney+ show Earth to Ned, which combines live action, animatronics, and real-time, on-set, performance-driven CG animation. The character BETI is a CG artificial intelligence entity rendered in real time on set in Unreal Engine.
READ MORE: The Jim Henson Company’s Earth to Ned uses real-time digital puppetry for final-pixel TV (Unreal Engine)
To make BETI appear to be physically on set, the plan was to create “rooms” she could float in, with screens inserted into the set. By tracking all the cameras in real time, it was possible to generate the correct parallax and create the illusion that there were volumes behind the screens.
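The math behind that illusion is the classic “generalized perspective projection”: given the tracked camera position and the physical corners of a screen, you build an asymmetric viewing frustum so that whatever is rendered beyond the screen plane lines up correctly from that camera's point of view. A hedged NumPy sketch of the idea (not the production code):

```python
import numpy as np

def off_axis_projection(eye, screen_corners, near=0.1, far=1000.0):
    """Asymmetric projection for a tracked camera looking at a flat screen.

    eye            -- 3D position of the tracked camera, world space
    screen_corners -- (lower-left, lower-right, upper-left) corners of the screen
    """
    pa, pb, pc = (np.asarray(p, dtype=float) for p in screen_corners)
    vr = pb - pa; vr /= np.linalg.norm(vr)            # screen right axis
    vu = pc - pa; vu /= np.linalg.norm(vu)            # screen up axis
    vn = np.cross(vr, vu); vn /= np.linalg.norm(vn)   # screen normal, toward the camera

    va, vb, vc = pa - eye, pb - eye, pc - eye          # vectors from the camera to the corners
    d = -np.dot(va, vn)                                # distance from camera to screen plane
    l = np.dot(vr, va) * near / d                      # frustum extents at the near plane
    r = np.dot(vr, vb) * near / d
    b = np.dot(vu, va) * near / d
    t = np.dot(vu, vc) * near / d

    # Standard OpenGL-style asymmetric frustum matrix; the camera's view matrix
    # must also orient the world to the screen basis and translate by the eye position.
    return np.array([
        [2 * near / (r - l), 0, (r + l) / (r - l), 0],
        [0, 2 * near / (t - b), (t + b) / (t - b), 0],
        [0, 0, -(far + near) / (far - near), -2 * far * near / (far - near)],
        [0, 0, -1, 0],
    ])

# Toy example: a 2 m x 1.5 m screen, camera tracked slightly off-center.
corners = ((-1.0, 2.0, 0.0), (1.0, 2.0, 0.0), (-1.0, 2.0, 1.5))
print(off_axis_projection(eye=(0.2, 0.0, 0.8), screen_corners=corners))
```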
About a dozen parameters are connected to character attributes that are programmed in the Henson Performance Control System. This enables the performers to use the puppeteer interface to adjust things like brightness, frequency, and effects scaling using inputs such as hand controls, pedals, and sliders.
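Conceptually, that layer is just a mapping from raw controller inputs to named character attributes, each scaled into its own range. The Henson Performance Control System isn't public, so the sketch below only illustrates the general idea, reading a gamepad's axes with pygame; the attribute names and ranges are invented:

```python
import pygame

# Illustrative parameter map: attribute name -> (controller axis index, output min, output max).
# These names and ranges are hypothetical, not the Henson system's actual parameters.
PARAMS = {
    "eye_brightness":  (0, 0.0, 1.0),
    "blink_frequency": (1, 0.0, 5.0),
    "effects_scale":   (2, 0.5, 2.0),
}

def read_parameters(pad):
    """Sample the pad's axes and scale each one into its attribute's range."""
    values = {}
    for name, (axis, lo, hi) in PARAMS.items():
        t = (pad.get_axis(axis) + 1.0) / 2.0   # map the axis from [-1, 1] to [0, 1]
        values[name] = lo + t * (hi - lo)       # then scale into the attribute's range
    return values

pygame.init()
pygame.joystick.init()
pad = pygame.joystick.Joystick(0)
pad.init()
pygame.event.pump()          # refresh controller state before sampling
print(read_parameters(pad))
```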
“The guests on the show were really excited when they came on… because I think people assumed it was going to be a post effect,” says Brian Henson. “That illusion was complete for the guests who were there. They couldn’t see any puppeteers. They just saw Ned and they saw BETI. And it’s fabulous when that happens.”
Altering Storytelling
To my mind, the most exciting impact of game tech on filmmaking is how it will alter storytelling. With virtual cameras integrated into the most popular franchises — Fortnite, Minecraft, Roblox, Madden NFL, NBA 2K, and countless more — video games are teaching millions of young people not just visual storytelling but interactive digital production.
Matt Workman points out that young people are already pioneering a new, playful style of real time entertainment that hasn’t really crossed over to linear media yet.
“If anything has the potential to shake up the notoriously conservative film business, it’s a whole new generation of media-savvy creatives with an intuitive understanding of shot framing, action choreography, and editorial techniques and a decided lack of reverence for established styles and genres.”
Beyond Tron to New Formats
Tron may be the granddaddy of representing computer games on screen. Other examples, some of them adaptations of video game franchises, include Street Fighter, The Lawnmower Man, Lara Croft, Warcraft, The Matrix, and Detective Pikachu. All play out in the linear, directed medium of cinema. None is interactive, and interactivity is the essence of the computer game.
As Frazer says, “Video games have now come into their own as a storytelling medium. Modern games boast astoundingly realistic graphics, complex open worlds, and emotionally nuanced narratives that rival the best of Hollywood. But how will the ideas and skills learned from video games influence the next generation of filmmakers?”
The medium of cinema is perhaps not the best fit for the interactive potential of a video game storytelling culture. Live streams on YouTube, Twitch and Instagram encourage interactivity, solving a problem that traditional TV has never managed to tackle. Already, VTuber CodeMiko (a virtual character performed in real time by an L.A.-based animator and coder) has grown into a full-fledged business with 750,000 Twitch subscribers, employing five developers, a management firm and a publicist.
There’s demand for pre-recorded content, too: YouTube viewers watched 100 billion hours of gaming content in 2020, with videos related to Microsoft’s megahit Minecraft alone earning 201 billion views.
“If Twitch streaming and Minecraft movies seem like niche interests compared to the shows on HBO Max and Disney+, consider that independent creators on social media are definitely getting paid,” says Frazer. “Even many small YouTubers make the same or more money than many independent filmmakers.”
So, if video game creators are eschewing traditional content distribution models, and still banking cash, what does that mean for film and TV as a business?