Mulan and Hidden Figures cinematographer Mandy Walker, ASC, ACS, has expressed confidence in using cloud tools and services for dailies, having experienced the technology as part of the HPA Tech Retreat.
Walker was on hiatus while filming was paused (due to COVID) on Baz Luhrmann’s Elvis biopic, and was able to supervise the production of a short film for the HPA Tech Retreat’s real-world stress test of remote, distributed creative workflows.
“Going forward, I feel much more relaxed about accessing remote systems on a movie,” she said. “For instance, if I have a second unit in another location or part of the world, I’d feel confident getting really good quality dailies to be able to pass comment.
“The other thing is that, in the post process, I want to be able to say we can work with someone in LA or London while I’m in Australia, or vice versa, and not feel the experience is going to suffer in terms of quality or time. I feel we’re there.”
Showing Technology and Workflows Connected
The camera-to-post demonstration at last year’s Tech Retreat proved remarkably prescient as the world entered lockdown a month later.
The 2021 Supersession took this to another level by following the progress of six short films made under COVID conditions in different parts of the world, with every element of editorial and post-production managed remotely in software and in the cloud.
“It’s very important we show the industry different variations on a similar theme: how do we solve VFX and post with a remote crew that could be anywhere in the world?” explained organizer Joachim Zell, a former EFilm and Technicolor technologist. “We want to show technology connected and creating real deliverables.”
The HPA asked groups of young filmmakers variously in London, Dubai, Mongolia, Mexico City, Brisbane and Hollywood to each write and produce a short film related to the pandemic and shot under COVID conditions. All the principal filmmakers are women and include Mexican DP and producer Sandra De La Silva, Saudi director and producer Abeer Abdullah, Australian director and producer Ruby Bell, British producer, writer and actor Bayra Bela and Mongolia’s Azzaya Lkhagvasuren.
To test the remote cloud workflow to its limits, all the films were acquired at UHD resolution with cameras including Alexa LF (4.5K), Sony Venice (6K), RED Komodo (6K) and a Blackmagic Design Ursa Mini Pro 12K. Each movie is being made available with deliverables from HD to 8K, a range of color spaces including Rec.709 and Rec.2020 and sound mixes such as stereo, 5.1, Dolby Atmos and DTS. Language versions and archiving were also performed using distributed teams.
The HPA helped organize access to a worldwide pool of craft artists for supervision, including Walker and Christopher Probst, ASC.
For example, the Brisbane film was shot over a weekend on the Gold Coast with the score and editorial performed at different locations in Sydney. One producer was in Australia, another in LA with sound post at Skywalker, picture post at Dolby in LA and VFX in London.
Supervising cinematographer Walker was able to view dailies on Moxion from a location away from the set. She also supervised the DI using an online application from Colorfront. “The quality was amazing,” she reported. “It meant I could watch what they were doing, make comments, even rewind takes. There are existing digital systems for dailies and color timing but this has now stepped up.”
In keeping with the progressive aspects of the Supersession productions, this short included a number of women HoDs. Walker and Bell were also keen to use the opportunity to give other young filmmakers a leg up. The B-camera DP on Elvis is the short’s cinematographer; one of the film’s electricians stepped up to gaffer, and a grip became key grip.
Content for all the projects was uploaded to a central data lake on the AWS cloud, where AWS applied encryption and authentication and orchestrated compute instances to fire up virtual workstations.
“We were able to connect various editors in different locations to high-speed storage so they could edit BMD raw or ARRI raw without needing to take their workflow outside of the cloud,” says AWS Solutions Architect Matthew Herson. “Since the virtual workstations in the cloud are in proximity to the data, access is high speed, and it fundamentally shifts away from having to download to a laptop or edit bay. Once that process is done, it can be handed off to the next part of the chain, such as color.”
AWS’ Media Asset Preparation Service (MAPS) was used to exchange assets across AWS storage services such as Amazon S3.
“MAPS gives anyone the ability to individually upload files, do previews or dynamic data movement for editorial workflows,” says Herson. “When they get to the end of a project they can use media exchange to seamlessly hand off the files between facilities or vendor companies in a fast and secure fashion.”
A sticking point was uploading rushes to AWS in the first place. The first plan, which was to rely on the filmmakers’ home broadband connections, was a non-starter.
“We had ARRI raw or 12K Blackmagic raw files set to upload overnight, but when the operator checked the next morning only a couple of shots had uploaded,” said Zell. “It’s a calculation: you know what your upload speeds are and what your data volume is. We had to call on friends to help us.”
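Zell’s “calculation” is simple arithmetic: data volume divided by sustained upload bandwidth gives the transfer time, and raw camera formats make the volumes large. A minimal sketch (the data volume and link speeds below are illustrative assumptions, not figures from the productions):

```python
# Rough upload-time estimate: transfer time = data volume / sustained upload rate.
# Figures below are illustrative assumptions, not measurements from the shoots.

def upload_hours(volume_gb: float, uplink_mbps: float) -> float:
    """Hours to push volume_gb gigabytes over a sustained uplink in megabits/s."""
    gigabits = volume_gb * 8                   # 1 GB = 8 gigabits
    seconds = gigabits * 1000 / uplink_mbps    # Gb -> Mb, then divide by Mb/s
    return seconds / 3600

# Say a day's rushes come to 2 TB (plausible for 6K/12K raw):
rushes_gb = 2000

print(f"home broadband (20 Mb/s up): {upload_hours(rushes_gb, 20):.0f} h")
print(f"pro fibre link (1 Gb/s up):  {upload_hours(rushes_gb, 1000):.1f} h")
```

At roughly 220 hours over a typical home uplink versus under five over a gigabit link, the numbers explain why the overnight uploads stalled and why the productions turned to high-bandwidth providers.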
Those friends included providers of professional high-bandwidth links in each city, among them Sohonet and Brisbane post house Cutting Edge.
“As 5G improves it will enable a much higher quality production in the cloud,” noted AWS Global M&E Partner manager Jack Wenzinger.
Camera to Cloud
A live stream of the productions on set was also recorded. This was achieved by taking a feed from Teradek wireless units on a camera into the Wi-Fi network. Qtake and Moxion video-assist software were used to manage the signal to the destination. In Mexico, the live TX was managed by 5th Kind, and in Dubai, the camera feed was routed directly over LTE and 5G networks managed by Samara.
In a separate demo, raw camera images were taken straight to the cloud using Frame.io’s Camera to Cloud (C2C) application. The potential here is to link location capture of background plates to real-time playback on an LED volume.
“A director could request another angle, or a different camera move and these could be fed live as background plates to the virtual production stage,” says Zell.
Hosting everything in the cloud can also eliminate the traditional conform stage of relinking high-res media back to editorial assemblies. This was made easier by doing editorial and DI on one platform.
“We had original 6K camera sources and a few scenes with three different tracks which we had to cut together,” explains Aleksandras Brokers, a Lithuania-based editor who worked on the London project Kintsugi. “When building the workflow we decided not to make proxies. We basically eliminated conform by going straight from editorial to DI within Resolve. It was game-changing and saved us a lot of time.” Resolve’s audio tool, Fairlight, was also used by the audio mixing team in Taiwan for the same project.
On the audio front, Steve Morris, Director of Engineering at Skywalker Sound, said, “Ideally every sound mixer is at least in a controlled room to get the basics right to start. Trying to interact and review on Zoom meetings and streamed content across time zone differences was quite challenging.”
He added, “You can take a multi-channel mix and encode it binaurally and get some spatial aspect of the mix for listening back on headphones. If the headphones are high quality you can add the signature of the [soundstage] itself. With Covid, everyone has to work with whatever kit is in their house.”
Skywalker Sound mixer Roy Waldsburger found he was able to suggest ways of improving the audio while it was being captured on set, rather than having to fix it in post months later.
“I could watch the live stream of the location shoot in Brisbane and suggested they capture some background chatter by miking up a pair of the other actors. I learned that while it’s very useful to communicate with the location recordists, there are limits to what I should be saying.”
In other words, there are protocols to be worked through about how much ‘interference’ is helpful from crew not physically present on set.
The Supersession was a demonstration of what is possible, but one area deliberately not touched on was cost. Zell and the HPA were able to draw on the resources and expertise of vendors and cloud providers like AWS at no cost. Being involved in an event with such high visibility among Hollywood CTOs is a big draw. Zell says another event is needed in order to discuss the various cost implications of ingress and egress in and out of the cloud, and of cloud archive.
“These are absolutely key questions but I wanted to take everything to its limit and not to worry about cost,” he says. “What it has done is thrown up other questions, such as: if the data gets lost, whose fault is it? If the company which hosts your data goes bankrupt, what happens to your content? We will need to do another major investigation of this in time.”
He added, “I also learned a lot about certain people I will never want to work with again and certain technologies I never want to work with again. I won’t be making this list public.”