October 26, 2023

Motion Capture Makes a Move Into Mainstream

By Adrian Pennington


TL;DR

  • In a recent webinar, Performit examines the current state of mocap, emphasizing the importance of capturing high-quality RAW mocap data.


Motion capture may have brought the Na’vi in Avatar to life for multi-billion dollar success, but creating realistic motion remains a challenge and perfect data is a myth. What’s more, no one outside of Marvel or James Cameron has the budget for the most high-end systems or the time to work with them.

As Jon Dalzell, co-founder of the British-based developer Performit Live, explains in a webinar, traditionally there are two ways to capture motion.

You can use generic motion capture libraries, which involves searching for assets and paying a royalty fee for use. You then would need to adjust every animation for cohesion and manually keyframe any missing motion.

Or you can originate a location-based shoot, which entails everything from sourcing talent to hiring technical support, then typically waiting days for the capture data to be processed.

All techniques, including top-of-the-range studio-based models using advanced camera tracking and markers, generate imperfect or noisy data that needs to be cleaned up.
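Much of that cleanup amounts to low-pass filtering jittery joint trajectories. As a minimal, purely illustrative sketch (the sample data and window size are invented, not taken from any vendor's pipeline), a centered moving average over per-joint positions looks like this:

```python
# Illustrative mocap cleanup: smooth noisy per-joint (x, y, z) samples with a
# centered moving average. Data and window size are hypothetical examples.

def smooth_positions(frames, window=5):
    """Return a moving-average-smoothed copy of a list of (x, y, z) samples."""
    half = window // 2
    smoothed = []
    for i in range(len(frames)):
        lo = max(0, i - half)                  # clamp the window at the ends
        hi = min(len(frames), i + half + 1)
        chunk = frames[lo:hi]
        smoothed.append(tuple(
            sum(sample[axis] for sample in chunk) / len(chunk)
            for axis in range(3)
        ))
    return smoothed

# A wrist marker that should sit still but jitters along x:
noisy = [(0.0, 0.0, 0.0), (0.1, 0.0, 0.0), (0.05, 0.0, 0.0),
         (0.2, 0.0, 0.0), (0.15, 0.0, 0.0)]
clean = smooth_positions(noisy, window=3)
```

Real pipelines use more sophisticated filters (Savitzky-Golay, Kalman) that preserve sharp motion while removing jitter, but the principle of trading high-frequency noise for smoothness is the same.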

Foot slide, for example, is a common issue where an animated character’s feet appear to slide or glide across the floor, instead of having a firm, realistic contact. This problem occurs due to inaccurate capture or translation of motion data onto the character model. It can also result from inadequate synchronization between the captured motion data and the animation rig, due to imprecise calibration.
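One common remedy is a foot-locking pass: detect frames where the foot is nearly stationary (a ground contact) and pin it in place until it lifts. Here is a minimal sketch of that idea, with a hypothetical speed threshold and invented data:

```python
# Illustrative foot-lock pass: when the foot's frame-to-frame speed falls below
# a threshold, treat it as planted and hold it at the contact position.
# The threshold and sample data are hypothetical, for demonstration only.

def lock_feet(foot_positions, speed_threshold=0.02):
    """foot_positions: list of per-frame (x, y, z). Returns a corrected list."""
    corrected = [foot_positions[0]]
    pinned = None
    for prev, cur in zip(foot_positions, foot_positions[1:]):
        speed = sum((c - p) ** 2 for c, p in zip(cur, prev)) ** 0.5
        if speed < speed_threshold:
            if pinned is None:
                pinned = corrected[-1]   # contact begins: remember the plant spot
            corrected.append(pinned)     # hold the foot in place during contact
        else:
            pinned = None                # foot lifted: follow the captured motion
            corrected.append(cur)
    return corrected

# A foot that should be planted but drifts 5 mm per frame, then swings away:
drifting = [(0.0, 0.0, 0.0), (0.005, 0.0, 0.0),
            (0.010, 0.0, 0.0), (0.5, 0.2, 0.0)]
locked = lock_feet(drifting)
```

A production solver would blend into and out of the lock and re-solve the leg with inverse kinematics rather than snapping positions, but the contact-detection idea is the same.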

Facial capture technology in mocap studios involves tracking dots on the actor’s face using infrared light. Dalzell notes that the number of cameras and their proximity to the actor affect the accuracy of the subtle movements captured in 3D space.

Studios must procure specialized cameras to detect markers on the motion capture suits, converting physical movements into digital data. Commercially available camera systems can cost upwards of $250,000 — custom systems used by large studios likely cost even more.

Those suits are also expensive. They are embedded with sensors, crucial for capturing the essence of human motion. A $2,500 suit is considered cheap, with many camera-based options costing more than $15,000 each.

Alongside these, there’s the software to process and convert raw motion data for animators to work with.

“To capture your vision, you need to be able to create authentic motion,” Dalzell says. “Researching, organizing and producing motion capture takes time and money, often leaving you working against the clock.”

He cites the industry’s high attrition rate, with 90% of animators reporting high stress and burnout, as evidence of the need for more efficient and effective working processes and methods.

(That’s where Performit Live comes in. “Our platform digitally connects you seamlessly with highly skilled professional performers wearing our smart motion capture system enabling you to direct the performer through remote rehearsal, capturing the exact moves you need and downloaded to you in seconds.”)


“Wearable technology will have a renaissance with advancements in fabrics and electrical integration,” Dalzell says. “You will be able to capture motion data in any location without cables or limitations.”

Wearable technology like Performit’s can store motion data locally and upload it to the cloud when a connection is available, “allowing for unique and nuanced captures of elite performers in their real environments.”

Performit reports that they are developing technology for multi-performer remote production.



