READ MORE: How AI is Changing Post-Production for the Better (Media Silo)
AI and ML tools are already having an impact on post-production, with software largely aimed at delivering more efficient workflows. This article takes a closer look at applications including logging and discovery, color grading and high frame rate smoothing.
Broadly speaking, we’re talking about machine learning rather than artificial intelligence, though many tools get marketed as AI.
As reported by journalist Jonny Elwyn, writing at the Mediasilo blog, ML is a branch of AI and far more constrained in its scope and ability:
Given a large enough dataset of images (e.g. people smiling), a computer program would be able to iteratively learn what a smile looks like, based on patterns of pixels, and so eventually, predictively and accurately find and tag photos of people smiling.
This same approach allows the ‘Neural Engine’ in DaVinci Resolve Studio to power such features as “facial recognition, object detection, smart reframing, speed warp retiming, super scale upscaling, auto color, color matching and more,” according to Blackmagic Design.
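The pattern-learning idea Elwyn describes can be sketched in a few lines. The toy example below (synthetic pixel vectors and a nearest-centroid rule, nothing like Resolve’s actual Neural Engine) shows how a program can iteratively “learn” a class from labeled examples and then tag new images:

```python
import numpy as np

# Toy illustration of learning from labeled examples: each "image" is a
# flattened pixel vector; one set represents smiles, the other neutral faces.
rng = np.random.default_rng(0)
smiles = rng.normal(loc=0.8, scale=0.1, size=(50, 16))   # synthetic "smile" pixels
neutral = rng.normal(loc=0.2, scale=0.1, size=(50, 16))  # synthetic "neutral" pixels

# "Training": compute the average pattern for each class.
smile_centroid = smiles.mean(axis=0)
neutral_centroid = neutral.mean(axis=0)

def tag_image(pixels):
    """Tag a new image by distance to the nearest learned pattern."""
    d_smile = np.linalg.norm(pixels - smile_centroid)
    d_neutral = np.linalg.norm(pixels - neutral_centroid)
    return "smile" if d_smile < d_neutral else "neutral"

print(tag_image(rng.normal(0.8, 0.1, 16)))  # a new image close to the smile pattern
```

Real systems use deep networks rather than centroids, but the principle is the same: enough labeled data lets the program find the pattern on its own.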
EXPLORING ARTIFICIAL INTELLIGENCE:
With nearly half of all media and media tech companies incorporating Artificial Intelligence into their operations or product lines, AI and machine learning tools are rapidly transforming content creation, delivery and consumption. Find out what you need to know with these essential insights curated from the NAB Amplify archives:
- This Will Be Your 2032: Quantum Sensors, AI With Feeling, and Life Beyond Glass
- Learn How Data, AI and Automation Will Shape Your Future
- Where Are We With AI and ML in M&E?
- How Creativity and Data Are a Match Made in Hollywood/Heaven
- How to Process the Difference Between AI and Machine Learning
Similarly, Adobe’s Sensei powers features across the Creative Cloud suite of applications, including Content-Aware Fill in After Effects and Auto Ducking, Morph Cut, Color Match and Auto Reframe in Premiere Pro.
Elwyn highlights the painstaking technique of rotoscoping as ripe for an intelligently automated overhaul.
Runway is a browser-based video editing tool which offers AI-powered rotoscoping as well as other useful tools, such as automatic background removal, removal of objects from a shot with smart ‘in-painting’ and automatic beat detection described as ‘Pre-edit to the beat.’
In Elwyn’s estimation, Runway seems to be pushing the boundaries of what is possible in an online video editor that you can use for free, “and are definitely ones to watch.”
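For a sense of what these learned tools replace, here is the crudest non-ML baseline: a green-screen chroma key. Runway’s rotoscoping and background removal are trained models that work on arbitrary footage; the function and threshold below are purely illustrative:

```python
import numpy as np

def chroma_key_mask(frame, green_thresh=0.4):
    """Return a boolean foreground mask for an RGB frame with values in [0, 1].

    A pixel is treated as background when its green channel dominates
    red and blue by more than `green_thresh` (a hypothetical cutoff).
    """
    r, g, b = frame[..., 0], frame[..., 1], frame[..., 2]
    background = (g - np.maximum(r, b)) > green_thresh
    return ~background  # True where the subject is kept

# A 2x2 test frame: top row is green screen, bottom row a red subject.
frame = np.array([[[0.1, 0.9, 0.1], [0.1, 0.9, 0.1]],
                  [[0.8, 0.2, 0.2], [0.8, 0.2, 0.2]]])
mask = chroma_key_mask(frame)  # keeps only the bottom (subject) row
```

The leap ML makes is doing this without a green screen, on any background, frame after frame.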
Other AI/ML-driven tools include the ability to generate synthetic voices. The text-to-speech engine of Editingtools.io can make creating and re-creating temp voiceovers quick and easy.
Color Grading in the Cloud
Making every shot in a final sequence share the same look and feel, regardless of how varied their starting points are, is one of the reasons professional colorists exist. It takes real skill and a practiced eye to make everything look consistent, Elwyn writes, though colorists might take issue with his summation that this “also tends to be the less creative part of the process compared to the stylistic look creation.”
That’s because the colorist will often work with a DP to set the look in prep, and is often credited with saving scenes in post that didn’t benefit from ideal lighting conditions on set.
That said, AI is making inroads into what is often a time-consuming part of the post process, where a DP will want to supervise the grade but typically lacks the time, having moved on to another project.
Elwyn picks out fylm.ai, which is an in-browser grading application equipped with the ability to replicate the look of any reference image and apply it to footage. It can also match the sensor characteristics of any two cameras and recreate LUTs based on matching a before and after image.
Another tool is Colourlab Ai, co-developed by colorist Dado Valentic, the founder of post house Mytherapy. The software is capable of automatically matching the grade of an entire sequence of shots to a reference shot or image to a quality described in early reviews as equivalent to the first pass by a human colorist.
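Neither fylm.ai nor Colourlab Ai publishes its algorithms, but a classical baseline for matching a shot to a reference image is per-channel mean and standard-deviation transfer (Reinhard-style color transfer). A minimal sketch, with synthetic frames standing in for footage:

```python
import numpy as np

def match_color_stats(source, reference):
    """Shift and scale each channel of `source` so its mean and standard
    deviation match `reference` (a classical color-transfer baseline,
    not either product's actual method)."""
    src_mean = source.mean(axis=(0, 1))
    src_std = source.std(axis=(0, 1)) + 1e-8   # avoid divide-by-zero
    ref_mean = reference.mean(axis=(0, 1))
    ref_std = reference.std(axis=(0, 1))
    graded = (source - src_mean) / src_std * ref_std + ref_mean
    return np.clip(graded, 0.0, 1.0)

# Example: a dark, flat shot matched to a brighter reference frame.
rng = np.random.default_rng(1)
shot = rng.uniform(0.0, 0.4, size=(8, 8, 3))       # underexposed footage
reference = rng.uniform(0.4, 0.9, size=(8, 8, 3))  # the target look
graded = match_color_stats(shot, reference)
```

ML-based graders go much further, accounting for skin tones, scene content and camera characteristics, which is why the results get compared to a colorist’s first pass rather than to a global transform like this.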
READ MORE: AI Finally Comes to Grading With Colourlab (No Film School)
One of the traditional tasks for an assistant editor is the “grunt work” of ingesting, organizing, categorizing, labeling and generally preparing the footage for the creative editorial phase.
This usually takes a long time and a lot of repetitive actions, both of which can be eliminated thanks to machine learning techniques for recognizing objects and adding the associated metadata.
Nova.ai has a huge range of AI-driven post-production tools that can do everything from visual searches of your footage to automatic transcription and subtitle translation into 30 other languages.
For example, you can search a video to find and tag a range of facial emotions, objects and activities as well as celebrity faces. You can also upload an image of a person and then get Nova to search your footage for that person.
“This could make traditionally time consuming editorial tasks much faster such as finding all of the close ups of a particular actor for a trailer or all of the goals in a football match for a highlights reel,” says Elwyn. “The days of trawling through archives for shots of particular people or places would be a thing of the past.”
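Under the hood, this kind of search reduces to metadata: a detector emits labels for each frame, and an index maps every label to the timecodes where it appears. A sketch of that indexing step (the detector output here is invented for illustration, not Nova.ai’s API):

```python
from collections import defaultdict

def build_tag_index(frame_labels, fps=24):
    """Map each detected label to the timecodes (seconds) where it appears.

    `frame_labels` is a list of label sets, one per analyzed frame,
    as an object-recognition model might emit.
    """
    index = defaultdict(list)
    for frame_no, labels in enumerate(frame_labels):
        for label in labels:
            index[label].append(round(frame_no / fps, 2))
    return dict(index)

# Hypothetical detector output for five keyframes of a football match.
detections = [{"crowd"}, {"crowd", "goal"}, {"goal"}, {"crowd"}, {"player"}]
index = build_tag_index(detections, fps=1)  # 1 fps keyframes for brevity
print(index["goal"])  # timecodes where a goal is visible
```

Once footage carries metadata like this, “find all the goals” becomes a dictionary lookup instead of a trawl through the archive.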
NAB Show Announces Call for Proposals for 2023 BEIT Conference
By Amplify Content Team
Proposals for technical papers and panel sessions are now being accepted for the NAB Show Broadcast Engineering and IT (BEIT) Conference, taking place April 15-19, 2023 at the Las Vegas Convention Center. Selected papers will be presented during the program and included in the official “Proceedings of the 2023 Broadcast Engineering and IT Conference.”
The BEIT Conference at NAB Show is designed for broadcast engineers and technicians, media technology managers, contract engineers, broadcast equipment manufacturers and distributors, engineering consultants and R&D engineers. The conference focuses on technical issues facing today’s media professionals with an emphasis on the evolution of next-generation content delivery systems for radio, television and the broader media and IT ecosystem. Paper and session proposals may be submitted online through the 2023 BEIT Call for Papers and Panels portal.
Proposals explaining the underlying technologies used in new broadcast products or services will be considered. Proposals submitted for the purpose of promoting company products or services will not be accepted.
The deadline for proposals is November 4, 2022. Authors of accepted submissions will be notified by January 20, 2023.
Proposal and abstract submissions for conference sessions, panels, exhibition floor theaters, speakers and other events are also being accepted for multiple NAB Show programs through November 4, 2022. For general session ideas, visit the NAB Show general Call for Speakers.
You should also take a look at the motion grading technology TrueCut Motion from Pixelworks. The software technology allows filmmakers to dial in the judder and motion blur, with any source frame rate, shot-by-shot in post-production. It then ensures that these creative choices are delivered consistently across every screen, whether in the theatre or the home.
James Cameron is using the technique to re-release Avatar and Titanic in 48 fps and 4K HDR. High frame rate titles like The Hobbit: An Unexpected Journey were not universally well received, but TrueCut Motion applies the conversion in a way that maintains the cinematic look and feel of the director’s intent.
As an example of how to work with HFR capture, at CinemaCon earlier this year, Pixelworks demonstrated a clip from The Hobbit from the original release captured at 48fps and contrasted with a TrueCut motion grade still at 48fps but now with a cinematic 24fps look.
“Motion grading is a creative choice,” explains Miguel Casillas, senior director of ecosystem marketing at Pixelworks. “The process is led by the director and/or DP and can be as granular as they like. We create a series of motion grades for review and the filmmaker selects which they prefer shot by shot. After a while, the process becomes intuitive.”
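To illustrate just one ingredient of a frame-rate-aware motion look (and emphatically not TrueCut’s proprietary, per-shot process), blending adjacent 48fps frames approximates the motion blur a 24fps, 180-degree shutter would have captured:

```python
import numpy as np

def blend_to_24fps_look(frames_48):
    """Average consecutive pairs of 48 fps frames to approximate the motion
    blur of a 24 fps, 180-degree-shutter capture. A crude stand-in for real
    motion grading, which adjusts judder and blur per shot without discarding
    the high frame rate."""
    frames = np.asarray(frames_48, dtype=float)
    assert len(frames) % 2 == 0, "expects an even number of frames"
    return (frames[0::2] + frames[1::2]) / 2.0

# Four 48 fps "frames" of a bright dot moving across a 1x4 pixel strip.
clip = [[1.0, 0.0, 0.0, 0.0],
        [0.0, 1.0, 0.0, 0.0],
        [0.0, 0.0, 1.0, 0.0],
        [0.0, 0.0, 0.0, 1.0]]
blurred = blend_to_24fps_look(clip)  # two frames, each with motion smear
```

The sharp single-pixel dot becomes a two-pixel smear in each output frame, which is exactly the blur a slower shutter would record; tools like TrueCut make that trade-off tunable rather than fixed.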