From live HDR and 12K capture to realtime LED lighting and 3D gesture-based visualization, here are the production breakthroughs of the year.
12K Production Arrives
2020 was destined to be the year of mainstream 8K and it didn't disappoint. The growing list of series and feature films captured in the format includes Season 2 of Amazon's Homecoming and Jurassic World: Dominion. Producers can't sell Netflix a production delivered in less than 4K, and while a fixation on resolution alone can obscure viable reasons to shoot on lower-resolution digital cameras, we can expect 8K shooting options to multiply.
While RED (and its Panavised version) offers the largest choice of 8K sensors, Canon's EOS R5 full-frame camera costs a lot less and can record 8K raw at up to 30fps. Chinese manufacturer Kinefinity also brought out the MAVO Edge, which offers 8K at 75fps. Blackmagic Design skipped 8K entirely in its product line, spending three years developing a new 12K chip for the URSA Mini body. The camera houses a 12,288 x 6480 S35 sensor, claims 14 stops of dynamic range and captures 80 megapixels per frame in Blackmagic RAW.
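As a quick sanity check, the headline figures are consistent with each other. A back-of-the-envelope sketch (the 12-bit readout depth is an assumption for illustration, not a published spec):

```python
# Back-of-the-envelope check on the URSA Mini Pro 12K numbers.
# Sensor geometry is from the article; the 12-bit readout depth is an
# assumption for illustration, not a quoted spec.
WIDTH, HEIGHT = 12288, 6480
pixels = WIDTH * HEIGHT
print(f"{pixels / 1e6:.1f} megapixels per frame")  # ~79.6 MP, i.e. the quoted "80 megapixels"

BITS_PER_PHOTOSITE = 12  # assumed readout depth
raw_bytes_per_frame = pixels * BITS_PER_PHOTOSITE / 8
print(f"~{raw_bytes_per_frame / 1e6:.0f} MB per frame before Blackmagic RAW compression")
```

Blackmagic RAW compresses that per-frame payload substantially, which is why the format remains workable on ordinary SSDs.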
One of the first to use the camera in anger is London-based indie Hangman. “I’ve always looked for a digital format which has the same robustness you get from negative film,” says its director James Tonkin. “Most sensors in raw deliver high resolution and high dynamic range which gives you all the benefits of flexibility when it comes to post. Plus, the noise gets smaller. The final image just looks better.”
During lockdown, Hangman shot promos for Bastille and Damon Albarn, and a concert performed by singer-songwriter Passenger at London's Royal Albert Hall, all using a mix of RED 8K and BMD 12K cameras.
“Shooting 8K on the BMD 12K is the sweet spot for me,” Tonkin says. “At that resolution it pairs nicely with my A cameras shooting 8K.”
Shooting 12K eats more disk space but for Tonkin that's the only downside. "It costs about $100 for a portable 5TB drive," he says. "That's less than a catering bill for lunchtime sandwiches so I'm never really that concerned about data."
Each project is recorded on 2TB SSDs and transferred to a Lumaforge Jellyfish shared storage system. “If we’re remote editing we’ll send H.264 proxies via Dropbox but working on prem we don’t need to create proxies,” he says. “We will work with the original camera raw on our Mac suite (Mac Pro 2019 and iMac Pro) with finishing in Resolve.”
Rushes for the Passenger project totalled 4.3TB, which grew to nearly 6TB during post. Hangman backs up onto dual SANs during production and archives for a year on LTO tape. Clients are sent copies on a 5TB drive.
“If you’re shooting 120 days and need to account for three backups then you need to cost and prepare for that kind of scale but the hard drives are one of the smaller costs. To me it’s worth it to have more pixels to play with in post,” says Tonkin.
“Unless you’re pursuing a specific Super 8 or 16mm look from acquisition to delivery and it’s all baked-in, then why would you not give yourself greater creative options?”
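Tonkin's budgeting logic reduces to simple arithmetic. A hedged sketch; the per-day rushes figure below is an assumption for illustration (only the Passenger total of 4.3TB is quoted above):

```python
def storage_budget_tb(shoot_days: int, tb_per_day: float, copies: int) -> float:
    """Total capacity to provision: daily rushes x shoot days x backup copies."""
    return shoot_days * tb_per_day * copies

# Tonkin's example scale: 120 shoot days with three backups of everything.
# 1.5 TB/day of rushes is an assumed figure, not quoted in the article.
total = storage_budget_tb(shoot_days=120, tb_per_day=1.5, copies=3)
print(f"{total:.0f} TB to provision")
```

Even at hundreds of terabytes, the drive cost stays a minor line item next to crew and catering, which is the point Tonkin is making.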
Live UHD HDR Takes Off
Fox Sports claimed an historic production in February with the first UHD HDR Super Bowl, but in truth it is BT Sport which is streets ahead.
The UK broadcaster has already produced hundreds of hours of live sport in UHD HDR, and natively too, from 4K cameras rather than upscaled from 1080p as Super Bowl LIV's production was.
“HDR is the norm for all our Tier 1 sport,” says Andy Beale, Chief Engineer, BT Sport. This includes all Champions League matches shown on its Ultimate premium channel, Premiership Rugby and all 50 English Premier League soccer matches in the 2020-21 season.
BT Sport works from a Sony S-Log3 HDR master, using a large-sensor Sony F55 in the main camera position with other feeds from Sony HDC-4300s and P43s.
“When we began we knew our HDR delivery would be to mobile (phone, tablet) and large screen apps (Samsung TVs, PS4, Xbox, Apple TV, Roku and Fire sticks) and not via TV because of complications around HDMI signaling. Most of those devices support PQ HDR and don’t currently have HLG HDR support.
“We therefore had to choose the right workflow to match that deliverable,” Beale explains. “The great thing about S-Log3 is we can cut both variables. It’s a bit like working in the highest resolution and cutting the frame you need from it. We’re doing the same in the color space by starting in the best quality master which is S-Log3 and cutting a PQ distribution first. We can easily do an HLG deliverable as and when.”
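Per channel, the workflow Beale describes amounts to decoding the S-Log3 camera signal to linear light, then re-encoding for the chosen deliverable. The sketch below uses Sony's published S-Log3 formula and the SMPTE ST 2084 (PQ) curve; a real broadcast chain also performs gamut conversion (S-Gamut3 to BT.2020) and tone mapping, omitted here, and the 203-nit diffuse-white level is an assumption taken from common HDR practice (ITU-R BT.2408):

```python
def slog3_to_linear(code: float) -> float:
    """Sony S-Log3 decode: normalized code value -> scene-linear reflectance
    (18% grey sits at code 420/1023)."""
    if code >= 171.2102946929 / 1023.0:
        return 10.0 ** ((code * 1023.0 - 420.0) / 261.5) * (0.18 + 0.01) - 0.01
    return (code * 1023.0 - 95.0) * 0.01125000 / (171.2102946929 - 95.0)

def linear_to_pq(y: float) -> float:
    """SMPTE ST 2084 (PQ) inverse EOTF: display-linear luminance, normalized
    so 1.0 = 10,000 cd/m^2, -> PQ signal value."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    yp = y ** m1
    return ((c1 + c2 * yp) / (1 + c3 * yp)) ** m2

# Decode 18% grey from the S-Log3 master, map diffuse white to an assumed
# 203 cd/m^2, then re-encode for the PQ deliverable. A real pipeline inserts
# gamut conversion and tone mapping between these two steps.
grey = slog3_to_linear(420 / 1023)   # ~0.18 scene-linear
nits = grey / 0.18 * 203             # assumed diffuse-white mapping
print(f"18% grey -> PQ code {linear_to_pq(nits / 10000):.3f}")  # ~0.58
```

Cutting an HLG version "as and when" means swapping only the final encode for the HLG OETF; the S-Log3 master upstream is untouched, which is the flexibility Beale is describing.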
Mindful of the need to deliver an SDR output from the same source, BT Sport operates what Beale calls a “closed loop” system. “We still need to deliver superb HD SDR pictures for our other platforms and we are mindful that all of our live sport deliverables are not just for ourselves but for third parties. Consequently, we always measure the performance of our HDR system against the relative performance of the HD SDR that will be derived from it.”
BT Sport supplies copies of every angle to VAR and other rights holders, most of which are still in SDR. "We shade on the SDR output as normal and we've introduced a new role of HDR supervisor who looks after all HDR settings to give us really the most exciting and realistic wide color gamut pictures. The closed loop is that we also make sure those pictures color match completely with SDR ISOs." That's fine when BT Sport is in end-to-end control of the workflow but presents a wrinkle when taking in third party feeds, as it has begun doing this season with German Bundesliga games.
"Every broadcaster will have their own decision about the offset in gain between SDR and HDR pictures and it can vary massively from country to country," Beale says. "You can't just flick an HDR switch. You have to have a conversation with the content provider and then build that gain into your workflow so it matches your feed."
Due to Covid-19 restrictions, BT Sport is now mixing this HDR workflow from its base in Stratford regardless of whether presentation is in its studio or at the venue.
What's more, it has already upped the ante to 8K HDR. BT Sport screened the UK's first public live 8K HDR sports broadcast (a UEFA Europa League clash between Arsenal and Olympiacos) just before lockdown and, were it not for Covid, would have pressed go on the world's first regular-season 8K UHD live broadcasts from September.
“Because of Covid we are restricted from numbers in the stadia and we’re awaiting camera inventory from Japan,” he says. “This is still the plan, it’s just on pause.”
LEDs Light Virtual Sets for Live Broadcast
The use of LEDs as a digital backlot is transformative in allowing directors, cinematographers and every other department to see, and often manipulate in real-time, the physical set and actors composited with digital images and creatures.
Equally transformative is the ability to use LEDs as the principal light source. DPs are able to photograph and make lighting decisions in real-time, which they can’t when shooting blue or green screen. Most famously used to produce The Mandalorian, the technology has also filtered into live broadcast.
Virtual studio the Cube, designed by Eurosport, debuted at the 2018 Winter Olympics but clunky workflows meant it was never used live. This year the tech was revamped and used for remote presentation of the US Open and Roland Garros tennis majors. Players including Karolina Pliskova were interviewed live at Flushing Meadows and presented as if standing next to presenter Barbara Schett-Eagle in the Cube, which was located in London.
“The key factor was that production needed to be much quicker,” explains Alex Dinnin, Director of Graphics & Innovation at Eurosport. “We needed to run EVS and graphics and not have to convert it to a bespoke codec. It needed to be controlled by a vision mixer just like a normal studio.”
Lighting control is vital and with the new Cube the lighting director is able to synchronize both real and virtual lighting. If they fade up a real light in the studio, the system will turn a version of that light on in the virtual world at the same time. Likewise, if the color temperature of the live feed in the virtual world changes, the lighting director can change the studio light to match.
“The LD needs to get involved early in the virtual studio production to make sure the lights are in the right place in both worlds,” explains Andy Hook, lead technical specialist at Cube co-designers White Light. “You need to plan shots around what will and won’t be seen in the real world and in the virtual world.”
BlackTrax tracking automatically lights the presenter as they move. “Rather than having lots of different lit positions depending on where [Barbara] stands in the space, the lighting automatically maintains a perfect light on her,” Hook explains. “She creates a natural shadow on the floor because she is being lit properly with a key, fill and back light. You also get realistic reflections on the set because the virtual environment and video sources reflect onto the walls, an interactive table, onto glasses, jewelry. It’s those little things that add believability and cause people to question whether what they are seeing is real.”
R&D continues at pace. The Cube’s next outing is the Australian Open in January, where Eurosport will hope to achieve greater control over the lighting and camera positioning of green screen studios on site. Then comes the postponed 2020 Olympics.
Hook reveals, “There are definitely tricks we haven’t shown yet which will show we have moved the goalposts even further on.”
Spatial Computing Enters Production
The way we interact with screens is changing. 3D is entering the mainstream for both consumers and commercial applications through VR and AR. Our interaction with virtual objects won’t remain two-dimensional for long.
Witness Sony’s launch of a desktop monitor that enables users to view 3D content with the naked eye. The SR Display is a $5,000 4K panel intended to assist in the creation of 3D CGI.
“We’re focusing the product on film production and animators for CG design, modelling and virtual production,” explains Nick Colsey, VP Business Development, Sony Electronics. “Creative industries have been trying to visualize a 3D object using only 2D tools or designing VR and AR with headsets. It makes sense to see the content without needing a headset.”
Sony Pictures has used the SR Display to pre-viz characters and concepts in 3D models for Ghostbusters: Afterlife. Technicolor-owned facility The Mill has also tested the display. "This isn't just a clearer screen moving more pixels around — this is something new," affirms Andrew Proctor, The Mill's creative director. "You're not defining a set frame but you're given a window you find yourself leaning around."
Like other autostereoscopic screens, such as one from Looking Glass, Sony's display is compatible with the leading games engines Unity and Unreal. These render CG assets in realtime and display them in accordance with the viewer's head position.
“Eye tracking is the unique part of the product,” says Colsey. “A high-speed vision sensor follows the exact eye position in space, on vertical, horizontal and depth axes simultaneously. Combined with our machine learning algorithm the stereoscopic image is displayed in real-time even if the viewer moves around.”
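In games engines, this kind of head-tracked rendering is commonly implemented as an off-axis (asymmetric-frustum) projection: the tracked eye position skews the view frustum so the panel behaves like a window onto the scene. A generic sketch of that geometry, not Sony's actual implementation:

```python
def off_axis_frustum(eye, screen_w, screen_h, near):
    """Asymmetric frustum bounds (left, right, bottom, top at the near plane)
    for an eye at (x, y, z) relative to the centre of a screen lying in the
    z=0 plane. As the eye moves, the frustum skews, so the screen acts like
    a window onto the 3D scene."""
    ex, ey, ez = eye  # ez = distance from the screen plane, must be > 0
    scale = near / ez
    left = (-screen_w / 2 - ex) * scale
    right = (screen_w / 2 - ex) * scale
    bottom = (-screen_h / 2 - ey) * scale
    top = (screen_h / 2 - ey) * scale
    return left, right, bottom, top

# Eye centred 1 m from a 0.35 m-wide panel: a symmetric frustum.
print(off_axis_frustum((0.0, 0.0, 1.0), 0.35, 0.19, 0.1))
# Eye shifted right: the frustum skews, producing the look-around parallax.
print(off_axis_frustum((0.1, 0.0, 1.0), 0.35, 0.19, 0.1))
```

A stereoscopic display would compute one such frustum per tracked eye, with the panel's optics steering each rendered view to the correct eye.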
Making a display larger than the current 15.6 inches would mean reduced pixel density rather than additional cost. Plus, the display's 3D images can only be viewed by one person at a time. Nonetheless, the development is a marker on the path toward fully interacting with 3D objects, for content creation and consumption alike.
“We’re already seeing the early stages of spatial computing in our demos,” says Colsey. “The Ghostbusters team used gesture commands to build their content. Paired with the Leap Motion controller creators are able to manipulate the image on the screen using hand gestures such as pinch and zoom. Users can create a virtual button to change the attributes on a 3D model.”
He adds, “You can imagine multiple displays like this connected with colleagues in different parts of the world. You will be able to share and collaborate on work in 3D in realtime. You would need enough bandwidth for this but it is definitely coming.”
The future of imaging is one in which every screen is touch-enabled. By the time we get to that point, it's likely that spatial video will have birthed a new visual language.
As architect Keisuke Toyota says of the SR Display's potential: "This really feels like a step toward remotely communicating in shapes."