TL;DR
- More than half of Gen Z and millennial media consumers prefer subtitles, according to new survey results from YPulse and Preply.
- While subtitles haven’t always been seen as a first choice, they’ve grown in ubiquity — especially with the rise of online videos that include automatic captioning.
- Captions help viewers keep up with murmured dialogue, make out thick accents and get a head start on a scene.
READ MORE: Survey: Why America is obsessed with subtitles (Preply)
Closed captions aren’t just for the hearing impaired: the rise in their popularity is being driven by younger viewers, who are making subtitles the norm while watching television.
In a new “TV and Entertainment report,” YPulse found that more than half of 13-39-year-olds prefer to use subtitles.
And it’s not just because they need them; younger viewers read along while watching movies and TV to keep up with mumbled dialogue, to parse less familiar accents and, some say, just to get a head start on a scene before going back to looking at their phones.
Per the report, 59% of Gen Z survey respondents and 52% of millennials said they use subtitles. Among Gen Z males, 61% said they prefer to use them.
READ MORE: The Majority of Young People Are Using Subtitles When They Watch TV (YPulse)
These findings are no outliers. A 2023 report by Preply found Gen Z to be by far the generation most likely to turn on subtitles: 70% of Gen Z respondents said so, compared to 53% of millennials and just 35% of baby boomers.
As to why Gen Z likes to turn on subtitles, part of it, according to Wilson Chapman at IndieWire, is that the generation grew up watching videos on social media, where subtitles are the algorithmically encouraged default.
Sara Fischer at Axios writes that TikTok helped normalize captions for young media consumers, who are now turning regularly to subtitles as part of their streaming habits.
READ MORE: Streaming surpasses cable as top way to consume TV (Axios)
“TikTok has an auto caption feature that a lot of content creators will use,” Axios reporter April Rubin told WGBH Morning Edition co-host Jeremy Siegel. “And so people are just a little bit more used to reading as they watch. Another factor that may play into this is that it has been a little tougher to maintain quality sound in the streaming era. So they could be watching subtitles just because they’re missing some of the dialogue with background noise or changing volumes.”
READ MORE: Turn closed captions on: Why Gen-Z loves subtitles on movies, shows and TikToks (WGBH)
Younger kids actively need subtitles to enjoy the content they are watching, according to a Kids Industries survey of US and UK parents with children aged 5-15. In this case, subtitles add another dimension of understanding to viewing: watching content with closed captions can reportedly improve literacy, vocabulary and reading speed.
“For kids’ media brands, the widespread use of closed captions should be a sign to improve accuracy and make sure subtitles are available for all programs,” suggests YPulse.
But closed captions are being used more by all of us. A 2022 report by Netflix revealed that 40% of its global users have closed captions on all the time, while 80% switch them on at least once a month.
In its survey, Preply determined that half of Americans use closed captions, with the top reason (cited by 72% of respondents) being that subtitles make dialogue easier to understand.
As Chapman lays out in IndieWire, the causes of muddled dialogue are many and can vary from person to person. For some, the problem is the design of modern televisions, the majority of which place internal speakers at the bottom of the set rather than facing the audience, significantly degrading audio quality. Other issues stem from sound designs optimized for theatrical exhibition, which can result in compressed audio when translated to the home.
“A lot of people struggle to hear dialogue now, so turning on closed captioning to decipher what people are saying has become a no brainer move,” he says.
READ MORE: Everyone Watches TV with Subtitles Now. How’d That Happen? (IndieWire)
An article in British broadsheet The Guardian also focuses on hard-to-hear dialogue, a known issue in the industry according to sound mixer Guntis Sics (Thor: Ragnarok), who is quoted in the piece.
Where once actors had to project loudly towards a fixed microphone on set, more portable mics have allowed a shift towards a more intimate and naturalistic style of performance, in which actors can speak more softly or, some might say, mumble.
“Anthony Hopkins on Thor spoke like a normal human being, whereas on a lot of other films, there’s a new style with young actors — it’s like they just talk to themselves. That might work in a cinema, but not necessarily when it gets into people’s lounge rooms,” Sics says.
The Guardian’s Katie Cunningham also suggests sound mixes have become more complicated — fine for the 22.2 speakers of Dolby Atmos in a theater but indistinct when played back through a TV’s tiny and tinny speakers.
“When sound is mixed with the best possible audio experience in mind much of that detail can be lost when it’s folded down to laptop speakers, or even your television. It’s often the dialogue that suffers most.”
If you haven’t invested in an expensive speaker setup at home, then relying on the TV’s speaker output alone “could leave you with a subpar experience.”
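To make the fold-down point concrete, here is a minimal sketch (illustrative only, not drawn from any of the cited articles) of a conventional 5.1-to-stereo downmix. Under the common ITU-R BS.775 convention, the center channel that typically carries dialogue is scaled by roughly -3 dB (a factor of about 0.707) when it is folded into the left and right outputs, which helps explain why dialogue is often what suffers most on small speakers:

```python
import numpy as np

# Illustrative sketch only: a conventional 5.1 -> stereo fold-down.
# Coefficients follow the common ITU-R BS.775 convention; real playback
# chains vary. Dialogue usually lives in the center channel, which is
# scaled by ~0.707 (-3 dB) when folded into left and right.
def downmix_to_stereo(left, right, center, lfe, ls, rs):
    c = 1 / np.sqrt(2)  # -3 dB
    out_l = left + c * center + c * ls
    out_r = right + c * center + c * rs
    # The LFE channel is typically discarded in a simple stereo fold-down
    return out_l, out_r

# Toy example: dialogue-only content in the center channel arrives in the
# stereo mix at ~0.707 of its original level, relative to music and effects
# that were already in the left/right channels.
silence = np.zeros(4)
dialogue = np.ones(4)
out_l, out_r = downmix_to_stereo(silence, silence, dialogue, silence, silence, silence)
print(out_l)  # ~[0.707, 0.707, 0.707, 0.707]
```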
READ MORE: Mumbling actors, bad speakers or lazy listeners? Why everyone is watching TV with subtitles on (The Guardian)
Of course, the sheer volume of foreign-language shows and the phenomenal popularity of some of them, from Squid Game to Money Heist, demand subtitles, but even English-language shows seem too hard for many Americans to understand.
British comedies and dramas that aren’t the usual period pieces like The Crown are often acted in authentic local accents: Peaky Blinders (Birmingham), Derry Girls (Northern Ireland) and even contestants on reality TV shows like Love Island are frequently cited, as is the Oscar-nominated Irish drama The Banshees of Inisherin.
“If people get used to using subtitles where it’s basically required, it becomes a matter of habit to keep them in use even when watching American productions,” says Chapman.
Naturally there’s a whole industry geared up to meet this demand. The Hollywood Reporter’s Kirsten Chuba has done the homework on this.
In her article, she notes that the US captioning services market was valued at nearly $170 million in 2022. Studios, however, often outsource the work to companies like Rev, which in turn has 75,000 international freelancers on its books for transcription work.
Some studios issue very specific subtitle requirements. Netflix’s style guide includes rules like a limit of 42 characters per line, a set reading speed of up to 20 characters per second for adult shows (up to 17 for children’s programs) and an emphasis that “dialogue must never be censored.”
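As an illustration of how mechanical such rules are to check, here is a minimal sketch (hypothetical helper code, not Netflix’s actual tooling) that validates subtitle cues against a 42-character line limit and the reading-speed caps described above:

```python
# Illustrative only: a minimal validator for two of the style-guide rules
# described above (42 characters per line; reading speed of 20 characters
# per second for adult titles, 17 for children's titles).
from dataclasses import dataclass

MAX_LINE_LENGTH = 42                              # characters per subtitle line
READING_SPEED = {"adult": 20, "children": 17}     # characters per second

@dataclass
class Cue:
    start: float      # seconds
    end: float        # seconds
    lines: list[str]  # one or two lines of on-screen text

def validate_cue(cue: Cue, audience: str = "adult") -> list[str]:
    """Return a list of rule violations for a single subtitle cue."""
    problems = []
    for line in cue.lines:
        if len(line) > MAX_LINE_LENGTH:
            problems.append(f"line exceeds {MAX_LINE_LENGTH} chars: {line!r}")
    duration = cue.end - cue.start
    chars = sum(len(line) for line in cue.lines)
    if duration > 0 and chars / duration > READING_SPEED[audience]:
        problems.append(
            f"reading speed {chars / duration:.1f} cps exceeds "
            f"{READING_SPEED[audience]} cps limit"
        )
    return problems

# Example: a two-line cue displayed for only 1.5 seconds trips the speed cap
print(validate_cue(Cue(0.0, 1.5, ["I can't believe you said that.", "Neither can I."])))
```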
To prepare for live events like awards shows, captioners are given an advance script of everything on the teleprompter, except for the names of the winners. When people ad-lib or give their acceptance speeches, the captioners are working from scratch.
“The person gets up and thanks someone with a very complicated name. We take a guess at it, but we’re going to spell it wrong. That’s bound to happen,” says Heather York, VP of marketing for captioning company Vitac.
Streamers often ask for subtitles in up to nine languages before their shows drop, creating a new challenge for service providers.
“We’ve got to pivot with our workflows, with our resources,” says Deluxe senior VP Magda Jagucka. “That process to bring non-English original content to global audiences requires multiple translation and adaptation steps.”
AI is already being used to provide a first pass at transcription, with human editors then going through to make corrections, but the technology still has limitations.
“There’s a lot of nuance, and the audio-visual translation isn’t really just based on text,” says Jagucka. “When you’re thinking about AI, it goes through that textual base, but translators get our cues from the sound, from the visual, from the picture, from the tonality of the dialogue and the actors acting, as well.”
It is another instance where AI is a tool to assist rather than replace humans, at least at this stage.
Pat Krouse, VP of operations at Rev, tells THR, “AI is really helpful where it speeds up … moving from a pure typist to an editor and a proofreader, and eventually a summarizer. It makes humans focus on higher value things, as opposed to just pure typing work.”
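As a rough sketch of that human-in-the-loop pattern (illustrative only; the transcript strings and the machine first pass below are hypothetical, not Rev’s or Deluxe’s tooling), a machine-generated transcript can simply be diffed against the editor’s corrected version to see exactly what the human changed:

```python
import difflib

# Illustrative only: a hypothetical machine-generated first pass and the
# version after a human editor has corrected a name, casing and punctuation.
machine_first_pass = [
    "thank you to my producer yon lindqvist",
    "and everyone at the studio",
]
human_corrected = [
    "Thank you to my producer, Jan Lindqvist,",
    "and everyone at the studio.",
]

# unified_diff shows the corrections the editor layered on top of the AI pass
for line in difflib.unified_diff(
    machine_first_pass, human_corrected, fromfile="ai_pass", tofile="edited", lineterm=""
):
    print(line)
```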