Producers and industry marketers have long aspired to make video experiences “immersive” without ever being challenged on what this means.
Glimpses into research at Dolby Labs shed light on this, and particularly on how new technologies such as VR headgear and interactive storytelling techniques need to be tuned to how our brains actually work.
Scott Daly, a bioengineer on the Dolby Laboratories technical staff, has given an interview to SMPTE where he suggests that immersive experiences, such as virtual reality systems, should find ways to mimic how human brains become emotionally impacted by visual experiences.
His starting point is that the term "immersion," as used in entertainment, is a pretty ineffective guide for producing tech like VR gear or the content to match.
The term “engagement” is no better. He argues that this often refers to how long you’re willing to wait for streaming content to buffer, or how often you’re willing to return to the same content.
“So really, we have to sidestep simple terms and look at the psychology of immersive media, relating immersiveness to neuroscience and psychophysics,” Daly says.
Along those lines, the science of mirror neurons is of particular interest to Dolby. Mirror neurons are described as specialized brain circuits or systems that connect different sensory portions of the brain to the “premotor cortex” which helps control the body.
“Mirror neurons are what allow us to step into the shoes of somebody else,” Daly states. “That is a true aspect of immersion. In immersive media, having that experience would allow you to be literally ‘immersed’ in what is happening to another person.”
If virtual reality systems could replicate such an experience, Daly suggests, they could have a greater emotional and communicative impact, which is, after all, the goal of any personalized entertainment experience.
The tech industry is making strides in this area. Qualcomm, Apple, Facebook, Microsoft and others are iterating new versions of VR/AR wearables to not only feel more comfortable on the head — but more compatible with how we actually experience the world.
This goes so far as smart contact lenses, an area where Daly says progress has been “startling.”
A significant roadblock to this pursuit, however, is that some problems in the brain's perception system still stand in the way. The nausea and motion sickness that some people experienced viewing stereoscopic 3D a decade ago were never solved, and they pose an even bigger challenge in the evolution of virtual reality systems.
The problem revolves around vergence-accommodation conflict (VAC), which has been plaguing the VR industry for years.
As Michael Goldman at SMPTE puts it, VAC is essentially a strain on the brain caused by mismatched cues between the apparent depth of a 3D object (vergence) and the focal distance (accommodation) the eyes require to focus properly on it. The eyes, in other words, are asked to converge at one distance while focusing at another: the fixed focal distance of the display.
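The mismatch Goldman describes can be made concrete with a toy calculation. Optometry expresses focus in dioptres (the reciprocal of distance in metres), and VAC is the gap between the dioptric distance the eyes converge to and the headset's fixed focal plane. The 2 m focal plane and the roughly 0.5-dioptre comfort threshold below are illustrative assumptions, not specs of any real device:

```python
def vac_mismatch_diopters(vergence_dist_m: float, focal_dist_m: float) -> float:
    """Gap between where the eyes converge (the virtual object) and
    where they must focus (the fixed display plane), in dioptres."""
    return abs(1.0 / vergence_dist_m - 1.0 / focal_dist_m)

FOCAL_PLANE_M = 2.0     # hypothetical headset focal plane
COMFORT_LIMIT_D = 0.5   # rough, commonly cited comfort zone -- an assumption

for obj_dist in (0.5, 1.0, 2.0, 10.0):
    d = vac_mismatch_diopters(obj_dist, FOCAL_PLANE_M)
    print(f"object at {obj_dist:4.1f} m -> mismatch {d:.2f} D, "
          f"comfortable={d <= COMFORT_LIMIT_D}")
```

Note that the mismatch blows up for close-up virtual objects (a 0.5 m object against a 2 m focal plane is 1.5 D of conflict), which is why near-field VR interactions are the hardest on viewers.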
Solutions to this are in the lab. One avenue of enquiry is to incorporate code into the VR system that tunes the experience down for users experiencing discomfort.
Personalize the Experience
Daly uses the analogy of amusement park rides. "Some people are very aggressive and enjoy dramatic rides like steep rollercoasters where they go through rapid changes in acceleration or ones with complex rotations," he says. "Others get queasy with such things. So, you run into the problem with playing back VR content involving motion of how to satisfy these ranges of people's experiences."
Toning everything down has the upside that no one feels ill, but the downside of a bland experience for many.
Daly suggests that a VR system apply a kind of brake for certain types of acceleration or rotation, so that every person can navigate themselves through a VR environment.
“If you start to feel queasy, you push a button indicating you are feeling queasy, and the system remembers that and tries to avoid it in the future.”
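Daly's "brake" plus remembered feedback could be sketched as a small governor object: it clamps whatever acceleration the content requests to a per-user cap, and tightens that cap each time the user reports queasiness. The class, its halving rule, and the units are all hypothetical, a minimal illustration rather than any actual Dolby design:

```python
class ComfortGovernor:
    """Per-user motion 'brake' (hypothetical sketch): caps acceleration
    and tightens the cap each time the user reports queasiness."""

    def __init__(self, max_accel: float = 10.0, floor: float = 1.0):
        self.max_accel = max_accel   # current cap, m/s^2 (illustrative units)
        self.floor = floor           # never throttle below this

    def limit(self, requested_accel: float) -> float:
        """Clamp the content's requested acceleration to the user's cap."""
        return max(-self.max_accel, min(requested_accel, self.max_accel))

    def report_queasy(self) -> None:
        """User pressed the 'I feel queasy' button: remember it by
        halving the cap for all future playback."""
        self.max_accel = max(self.floor, self.max_accel * 0.5)

gov = ComfortGovernor()
print(gov.limit(25.0))   # clamped to the default cap of 10.0
gov.report_queasy()      # cap halves to 5.0 and persists
print(gov.limit(25.0))   # now clamped to 5.0
```

The point of the design is that the adjustment outlives the moment of discomfort: the system "remembers that and tries to avoid it in the future," as Daly puts it.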
Another approach to the same end could be to use bio-sensors, such as pulse-rate monitors, to allow the VR software to automatically tune itself to a user's physiological responses.
“The idea is to catch physical reactions that might indicate nausea at a very early stage, maybe even before the subject is aware of it.”
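One naive way to "catch physical reactions at a very early stage" is to compare each heart-rate sample against a rolling baseline and flag a sharp rise. The window size and the 12 bpm threshold below are illustrative guesses with no clinical grounding, and heart rate alone is only a crude proxy for nausea; this is a sketch of the feedback loop, not a detection algorithm:

```python
from collections import deque
from statistics import mean

class NauseaEarlyWarning:
    """Crude sketch: track heart rate against a rolling baseline and
    flag a sudden rise, which the VR app could use to soften motion.
    Window and threshold are illustrative, not clinically grounded."""

    def __init__(self, window: int = 30, rise_bpm: float = 12.0):
        self.samples = deque(maxlen=window)  # recent baseline samples
        self.rise_bpm = rise_bpm             # rise that triggers an alert

    def update(self, bpm: float) -> bool:
        """Feed one heart-rate sample; return True if it sits well
        above the recent baseline (possible early discomfort)."""
        alert = bool(self.samples) and bpm - mean(self.samples) > self.rise_bpm
        self.samples.append(bpm)
        return alert

warn = NauseaEarlyWarning()
for bpm in [70, 71, 70, 72, 71, 88]:   # sudden jump on the last sample
    flagged = warn.update(bpm)
print(flagged)  # True: 88 bpm is more than 12 bpm above the ~71 bpm baseline
```

A real system would fuse several signals (skin conductance, posture sway, respiration) and act before the user consciously notices, which is exactly the ambition Daly describes.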
The teething problems begin to stack up. Another area needing work: perception and comprehension of 360-degree imagery can be altered by a subject's head movements, potentially interfering with the desired impact.
“In many systems, the center of projection won’t [match up with] your head movements, even the smallest head movements,” Daly relates. “So that tends to make you see a world where there is something of an unstable VR environment — some people might say it looks rubbery or not as solid as it feels like it should.”
Can We Take Hyper-Realism?
Some of this may be solved by simple changes in human behavior, or what Daly calls "habituation." The more accustomed we are to experiencing something a certain way, the more natural that way feels. The classic example in film is the (some say arbitrary) use of 24 frames per second, a rate introduced over 90 years ago to sync film reels with soundtracks on projection. Our understanding of what constitutes a "film look" is innately connected to the perceived flicker that this frame rate entails. Experiments by filmmakers of the caliber of Ang Lee at speeds up to 120fps produce a more photoreal visual experience that has not yet found favor among audiences, in part because it doesn't look the way we expect it to.
Our relationship to hyper-real imagery will change over time thanks to the experience of younger generations growing up with video gaming and moves over the next few years to increase the frame rates of TV.
New Stories in a New Language
“Basically, our habituated sensitivity to more realistic imagery makes us more demanding customers as we get used to better and better quality,” Daly adds. “These are issues that, originally, were considered more aesthetic or awareness issues, but which have now become physiological issues.”
Directors and DPs working in VR need to change the grammar with which they tell conventional stories. For example, filmmakers have long played with objective and subjective points of view but how well they can do that with immersive media, and whether it even makes any sense to do it, is another question.
“VR content creators are essentially conducting informal psychological experiments over time with their audiences as subjects, and they get feedback from the success or failure of those attempts, and try again,” says Daly. “All of those questions with VR are yet to be determined.”