TL;DR
- In a recent webinar, Performit examines the current state of mocap, emphasizing the importance of capturing high-quality raw mocap data.
Motion capture may have brought the Na’vi in Avatar to life for multi-billion-dollar success, but creating realistic motion is always a challenge, and perfect data is a myth. What’s more, no one outside of Marvel or James Cameron has the budget for the most high-end systems or the time to work with them.
As Jon Dalzell, co-founder of the UK-based developer Performit Live, explains in the webinar, there are traditionally two ways to capture motion.
You can use generic motion capture libraries, which involves searching for assets and paying a royalty fee for use. You would then need to adjust every animation for cohesion and manually keyframe any missing motion.
Or you can mount a location-based shoot, which entails everything from sourcing talent to hiring technical support, and then typically waiting days for the capture data to be processed.
All techniques, including top-of-the-range studio systems using advanced camera tracking and markers, generate imperfect or noisy data that needs to be cleaned up.
Foot slide, for example, is a common issue where an animated character’s feet appear to slide or glide across the floor instead of making firm, realistic contact. This problem occurs due to inaccurate capture or translation of motion data onto the character model. It can also result from inadequate synchronization between the captured motion data and the animation rig, often because of imprecise calibration.
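To give a sense of what that cleanup involves, here is a minimal sketch of one common fix, assuming per-frame foot-joint positions and illustrative thresholds (this is a generic post-process, not Performit’s pipeline): whenever the foot is near the floor and nearly stationary, pin it in place.

```python
import numpy as np

def fix_foot_slide(foot_positions, height_thresh=0.03, speed_thresh=0.05, fps=60):
    """Pin a foot in place during ground contact to remove slide.

    foot_positions: (N, 3) array of per-frame foot-joint positions in metres,
    Y axis up. The thresholds are illustrative assumptions: a frame counts as
    a contact when the foot is within ~3 cm of the floor and moving slower
    than ~5 cm/s.
    """
    fixed = foot_positions.copy()
    velocities = np.gradient(foot_positions, 1.0 / fps, axis=0)
    speeds = np.linalg.norm(velocities, axis=1)

    pinned = None  # position the foot is locked to during the current contact
    for i in range(len(fixed)):
        in_contact = fixed[i, 1] < height_thresh and speeds[i] < speed_thresh
        if in_contact:
            if pinned is None:
                pinned = fixed[i].copy()  # start of a contact: remember where it landed
            fixed[i, 0], fixed[i, 2] = pinned[0], pinned[2]  # hold X/Z, keep vertical motion
        else:
            pinned = None  # foot lifted: stop pinning
    return fixed
```

In practice the corrected foot positions would be fed back through inverse kinematics so the rest of the leg follows without popping.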
Facial capture technology in mocap studios involves tracking dots on the actor’s face using infrared light. Dalzell notes that the number of cameras and their proximity to the actor affect the accuracy of the subtle movements captured in 3D space.
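The underlying geometry is ordinary triangulation: every camera that sees a dot contributes constraints on its 3D position, so more cameras, placed closer to the performer, leave less room for noise in any single view to distort the result. A minimal sketch of that idea, assuming calibrated 3x4 projection matrices rather than any particular studio’s software:

```python
import numpy as np

def triangulate(projection_matrices, pixel_observations):
    """Estimate a marker's 3D position from its 2D detections in several cameras.

    projection_matrices: list of 3x4 camera projection matrices (numpy arrays).
    pixel_observations:  list of (u, v) pixel coordinates of the same marker,
                         one per camera. Two cameras give a bare minimum of
                         constraints; each extra camera adds rows that average
                         out detection noise.
    """
    rows = []
    for P, (u, v) in zip(projection_matrices, pixel_observations):
        rows.append(u * P[2] - P[0])  # standard direct-linear-transform constraints
        rows.append(v * P[2] - P[1])
    A = np.stack(rows)
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]  # homogeneous 3D point with the smallest residual
    return X[:3] / X[3]
```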
Studios must procure specialized cameras that detect markers on the motion capture suits and convert physical movements into digital data. Commercially available camera systems can cost upwards of $250,000; custom systems used by large studios likely cost even more.
Those suits are also expensive. Embedded with sensors, they are crucial for capturing the essence of human motion. A $2,500 suit is considered cheap, and many camera-based options cost more than $15,000 each.
Alongside these, there’s the software needed to process and convert raw motion data into something animators can work with.
“To capture your vision, you need to be able to create authentic motion,” Dalzell says. “Researching, organizing and producing motion capture takes time and money, often leaving you working against the clock.”
He points to the industry’s high attrition rate, with 90% of animators citing high stress and burnout, as evidence that more efficient and effective working processes and methods are needed.
(That’s where Performit Live comes in. “Our platform digitally connects you seamlessly with highly skilled professional performers wearing our smart motion capture system enabling you to direct the performer through remote rehearsal, capturing the exact moves you need and downloaded to you in seconds.”)
“Wearable technology will have a renaissance with advancements in fabrics and electrical integration,” Dalzell says. “You will be able to capture motion data in any location without cables or limitations.”
Wearable technology like Performit’s can store motion data locally and upload it to the cloud when a connection is available, “allowing for unique and nuanced captures of elite performers in their real environments.”
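That store-and-forward pattern is simple to picture. As a rough sketch, assuming a local spool directory and a placeholder upload endpoint (neither of which is Performit’s actual API), a capture device would always write each take to disk first and flush the queue whenever a connection succeeds:

```python
import pathlib
import urllib.request

QUEUE_DIR = pathlib.Path("capture_queue")   # hypothetical local spool directory
UPLOAD_URL = "https://example.com/upload"   # placeholder endpoint, not a real Performit API

def save_take(take_id: str, data: bytes) -> pathlib.Path:
    """Write the capture to local storage first, whether or not we are online."""
    QUEUE_DIR.mkdir(exist_ok=True)
    path = QUEUE_DIR / f"{take_id}.mocap"
    path.write_bytes(data)
    return path

def flush_queue() -> None:
    """Upload every queued take; delete local copies only after a successful upload."""
    for path in sorted(QUEUE_DIR.glob("*.mocap")):
        try:
            req = urllib.request.Request(UPLOAD_URL, data=path.read_bytes(), method="POST")
            with urllib.request.urlopen(req, timeout=10) as resp:
                if resp.status == 200:
                    path.unlink()  # uploaded successfully, safe to remove locally
        except OSError:
            break  # still offline; keep the files and retry later
```

The important property is that losing connectivity never loses a take: the local copy remains the source of truth until the cloud confirms receipt.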
Performit reports that they are developing technology for multi-performer remote production.