robertogreco : subtitles (9)

LLN [Language Learning with Netflix]
"LLN is a Chrome extension that gives you superpowers over Netflix. It makes studying languages with films/series more effective and enjoyable."
languages  learning  netflix  chrome  extensions  subtitles  srg  onlinetoolkit  glvo 
may 2019 by robertogreco
The UX design case of closed captions for everyone // Sebastian Greger
"Are video subtitles really chiefly for users who cannot hear or lack an audio device? A recent Twitter thread on “closed captions for the hearing” triggered a brief qualitative exploration and thought experiment – there may well be a growing group of users being forgotten in the design of closed captions.

Most commonly perceived as an auxiliary means for the hearing impaired, video subtitles, a.k.a. closed captions (CC), have only recently started to be widely considered as an affordance for users in situations with no audio available/possible (think mobile devices in public settings, libraries, shared office spaces); the latter to the extent that contemporary “social media marketing guidelines” strongly recommend subtitling video clips uploaded to Facebook, Twitter et al.

So: subtitles are for those who cannot hear, or for those with muted devices?

Who else uses closed captions?

I’m personally a great fan of closed captions, for various reasons unrelated to either of the above, and have often noticed certain limitations in their design. Hence, the user researcher inside me just did a somersault as I randomly encountered a Twitter thread [https://twitter.com/jkottke/status/1091338252475396097 ] following Jason Kottke asking his 247,000 followers:
After seeing several photos my (English-speaking, non-deaf) friends have taken of their TV screens over the past week, I’m realizing that many of you watch TV with closed captions (or subtitles) on?! Is this a thing? And if so, why?

The 150+ replies (I guess this qualifies as a reasonable sample for a qualitative analysis of sorts?) are a wonderful example of “accessibility features” benefiting everybody (I wrote about another instance recently [https://sebastiangreger.net/2018/11/twitter-alt-texts-on-db-trains/ ]). The reasons why people watch TV with closed captions on, despite having good hearing abilities and not being constrained by having to watch muted video, are manifold and go far beyond those two most commonly anticipated use cases.

[image: Close-up image of a video with subtitles (caption: "Closed captions are used by people with good hearing and audio playback turned on. An overlooked use case?")]

Even applying a rather shallow, ex-tempore categorisation exercise based on the replies on Twitter, I end up with an impressive list to start with:

• Permanent difficulties with audio content
◦ audio processing disorders
◦ short attention span (incl., but not limited to clinical conditions)
◦ hard of hearing, irrespective of age
• Temporary impairments of hearing or perception
◦ watching under the influence of alcohol
◦ noise from eating chips while watching
• Environmental/contextual factors
◦ environment noise from others in the room (or a snoring dog)
◦ distractions and multitasking (working out, child care, web browsing, working, phone calls)
• Reasons related to the media itself
◦ bad audio levels of voice vs. music
• Enabler for improved understanding
◦ easier to follow dialogue
◦ annoyance with missing dialogue
◦ avoidance of misinterpretations
◦ better appreciation of dialogue
• Better access to details
◦ able to take note of titles of songs played
◦ ability to understand song lyrics
◦ re-watching to catch missed details
• Language-related reasons
◦ strong accents
◦ fast talking, mumbling
◦ unable to understand foreign language
◦ insecurity with non-native language
• Educational goals, learning and understanding
◦ language learning
◦ literacy development for children
◦ seeing the spelling of unknown words/names
◦ easier memorability of content read (retainability)
• Social reasons
◦ courtesy to others, either in need of silence or with a need/preference for subtitles
◦ presence of pets or sleeping children
◦ avoiding social conflict over sound level or distractions (“CC = family peace”)
• Media habits
◦ ability to share screen photos with text online
• Personal preferences
◦ preference for reading
◦ acquired habit
• Limitations of technology skills
◦ lack of knowledge of how to turn them off

An attempt at designerly analysis

The reasons range from the commonsensical to the surprising, such as the examples of closed captions used to avoid family conflict or the two respondents explicitly mentioning “eating chips” as a source of disturbing noise. Motivations mentioned repeatedly refer to learning and/or understanding, but also to such apparently banal reasons as not knowing how to turn them off (a usability issue?). Most importantly, though, it becomes apparent that using CC is more often than not related to choice/preference, rather than to impairment or constraints on using audio.

At the same time, it becomes very clear that not everybody likes them, especially when forced to watch with subtitles by another person. The desire/need of some may negatively affect the experience of others present. A repeat complaint that, particularly with comedy, CC can kill the jokes may also hint at the fact that subtitles and their timing could perhaps be improved by considering them as more than an accessibility aid for those who would not hear the audio? (It appears as if the scenario of audio and CC consumed simultaneously is not something considered when subtitles are created and implemented; are we looking at another case for “exclusive design”?)

And while perceived as distracting when new – this was the starting point of Kottke’s Tweet – many of the comments share the view that it becomes less obtrusive over time; people from countries where TV is not dubbed, in particular, are so used to it they barely notice it (“becomes second nature”). Yet, there are even such interesting behaviours as people skipping back to re-read a dialogue they only listened to at first, as well as skipping back to pay better attention to the picture on a second viewing (e.g. details of expression) after reading the subtitles initially.

Last but not least, it is interesting how people may even feel shame over using CC. Only a conversation like the cited Twitter thread may help them realise that it is much more common than they thought. And most importantly that it has nothing to do with a perceived stigmatisation of being “hard of hearing”.

CC as part of video content design

The phenomenon is obviously not new. Some articles on the topic suggest that it is a generational habit [https://medium.com/s/the-upgrade/why-gen-z-loves-closed-captioning-ec4e44b8d02f ] of generation Z (though Kottke’s little survey proves the contrary), or even see [https://www.wired.com/story/closed-captions-everywhere/ ] it as paranoid and obsessive-compulsive behaviour of “postmodern completists” as facilitated by new technological possibilities. Research on the benefits of CC for language learning, on the other hand, reaches back [https://www.tandfonline.com/doi/abs/10.1080/19388078909557984 ] several decades.

No matter what – the phenomenon in itself is interesting enough to make this a theme for deeper consideration in any design project that contains video material. Because, after all, one thing is for sure: closed captions are not for those with hearing impairments or with muted devices alone – and to deliver great UX, these users should be considered as well."

[See also: https://kottke.org/19/04/why-everyone-is-watching-tv-with-closed-captioning-on-these-days ]
closedcaptioning  subtitles  closedcaptions  text  reading  genz  generationz  audio  video  tv  film  dialogue  listening  howweread  2019  sebastiangreger  literacy  language  languages  ux  ui  television  ocd  attention  adhd  languagelearning  learning  howwelearn  processing  hearing  sound  environment  parenting  media  multimedia  clarity  accents  memory  memorization  children  distractions  technology  classideas 
march 2019 by robertogreco
Why Gen Z Loves Closed Captioning – The Upgrade – Medium
"Old technology finds a surprising new application

“Everyone does it.”

These were the words from my college-aged daughter when I caught her lounging on our couch, streaming Friends with 24-point closed captioning on. She has no hearing impairment, and I wanted to know what she was up to.

Does “everyone” do it? My wife and I turned to Facebook and a private, nationwide group for parents with near-adult children. “Anyone else’s college student (without a hearing disability) watch TV with the closed captioning on and insist that everyone does it?” my wife posted. Seven hundred responses (and counting) later, we had our answer.

Many parents expressed similar confusion with the TV-watching habits of their millennial and Gen Z children, often followed with, “I thought it was just us.”

I returned to my daughter, who had now switched to the creepy Lifetime import You.

“Why do you have captions on?” I asked.

“It helps me with my ADHD: I can focus on the words, I catch things I missed, and I never have to go back,” she replied. “And I can text while I watch.”

My multitasking daughter used to watch TV while working on her laptop and texting or FaceTiming on her phone. She kept rewinding the DVR to catch the last few minutes she’d missed because she either zoned out or was distracted by another screen.

Her response turned out to be even more insightful than I realized at first. A number of mental health experts I spoke with — and even one study I found — supported the notion that watching with closed captioning serves a valuable role for those who struggle with focus and listening.

“I do see this a lot in my practice,” said Dr. Andrew Kent, an adolescent psychiatrist practicing in New York and Medical Director of New York START, Long Island. “I believe auditory processing is more easily impacted upon by distractions, and that they need to read [captions] to stay focused.”

Closed captioning is a relatively recent development in the history of broadcasting, and it was designed with the hearing impaired in mind. According to a useful history on the National Captioning Institute’s (NCI) website, the technology dates back to the early 1970s, when Julia Child’s The French Chef “made history as the first television program accessible to deaf and hard-of-hearing viewers.” Real-time captioning arrived later, with stenographers typing at a blazing 250 words-per-minute to keep up with live news and sporting events.

If it wasn’t for the Twenty-First Century Communications and Video Accessibility Act of 2010 and additional rules adopted by the FCC in 2012, it’s unlikely my daughter’s IP-based Netflix streaming content would even have closed captioning options today.

While the NCI doesn’t explicitly acknowledge the growing use of closed captioning by those without hearing impairments, it does note that “closed captioning has grown from an experimental service intended only for people who are deaf to a truly global communications service that touches the lives of millions of people every day in vital ways.”

It’s certainly not just a phenomenon for young people. There are many people my age who admit to using them because they have some middle-aged hearing loss or simply need help understanding what the characters on Luther or Peaky Blinders are saying. They use captions to focus more intently on the content.

The need to read captions for what you can hear might even have a biological base. According to Dr. Sudeepta Varma, a psychiatrist at New York University’s Langone Medical Center, some people may have trouble processing the audio from television.

“I believe that there are a number of individuals who have ADHD who may also suffer from undiagnosed auditory processing disorder (APD), and for these individuals… this may be very helpful,” Dr. Varma told me via email. Closed captioning can provide the visual cues that APD sufferers need to overcome their issues with listening and comprehension, she added.

APD refers to how the brain processes auditory information, and though it supposedly only affects around 5 percent of school-age children, there’s reportedly been a significant uptick in overall awareness. As Dr. Varma pointed out, there may be a lot of people who don’t realize they have APD, but are aware of some of the symptoms, which include being bothered by loud noises, difficulty focusing in loud environments, and forgetfulness.

There may be applications in the classroom, too. In a 2015 study of 2,800 college-age students on the impact of closed captioning on video learning, 75 percent of respondents mentioned that they struggle with paying attention in class. “The most common reasons students used captions… was to help them focus,” Dr. Katie Linder, the research director at Oregon State University who led the study, told me.

And even four years ago, there were hints that the use of closed captioning as a focusing tool would bleed outside the classroom.

As a report on the study put it, “Several people in this study also mentioned that they use captions all the time, not just for their learning experience. Captions with Netflix was mentioned multiple times. So, we know that students are engaging with them outside of the classroom.”

When the NCI first co-developed closed captioning technology some 50 years ago, they called it “words worth watching,” and it did transform millions of lives. Today, we may be witnessing — or reading — a similar revolution."
closedcaptioning  subtitles  closedcaptions  text  reading  genz  generationz  audio  video  tv  film  dialogue  listening  howweread  2019  lanceulnoff  television  adhd  attention  classideas 
march 2019 by robertogreco
Why the Spanish Dialogue in 'Spider-Verse' Doesn't Have Subtitles
"While watching the new animated feature Spider-Man: Into the Spider-Verse – featuring Miles Morales’ big screen debut as the arachnid superhero – it’s reassuring to notice the subtle, yet transcendent details through which the creators ensured both parts of his cultural identity are present.

Miles (voiced by Shameik Moore), an Afro-Latino teen who lives in Brooklyn and first appeared in Marvel’s comics back in 2011, is the son of a Puerto Rican mother and an African-American father. The protagonist’s significance – when it comes to representation – cannot be overstated, making the fact that he and his mother (Rio Morales who’s voiced by Nuyorican actress Luna Lauren Velez) speak Spanish throughout the action-packed narrative truly momentous.

Although brief, the Spanish phrases and words we hear connote the genuine colloquialisms that arise in bilingual homes as opposed to the artificiality that sometimes peppers US-produced movies and feels like the result of lines being fed through Google Translate. It might come as a surprise for some that Phil Lord, known for writing and directing The Lego Movie and 21 Jump Street with his close collaborator Christopher Miller, was not only one of the main scribes and a producer on Spider-Verse, but also the person in charge of the Spanish-language dialogue.

“I grew up in a bilingual household in the bilingual city of Miami where you hear Spanish all over the place, and it’s not particularly remarkable,” he told Remezcla at the film’s premiere in Los Angeles. Lord’s mother is from Cuba and his father is from the States. As part of a Cuban-American family, the filmmaker empathized with Miles’ duality: “I certainly understand what it’s like to feel like you’re half one thing and half something else,” he noted.

[image]

Despite the massive success of Pixar’s Coco, including Spanish-language dialogue in a major studio’s animated release is still rare – doing so without adding subtitles, even for some of the longer lines, is outright daring. “It was important for us to hear Spanish and not necessarily have it subtitled,” said Lord. “It’s just part of the fabric of Miles’ community and family life.”

For Luna Lauren Velez, whose character speaks mostly in Spanish to Miles, Lord and the directors’ decision to not translate her text in any way helped validate the Latino experience on screen. “That was really bold, because if you use subtitles all of a sudden we are outside, and we are not part of this world anymore. It was brilliant that they just allowed for it to exist,” she told Remezcla. Her role as Rio Morales also benefited from the production’s adherence to specificity in the source material: she is not portrayed as just generically Latina but as a Puerto Rican woman from Brooklyn.

With the help of a dialect coach, Velez and Lord were also partially responsible for getting Shameik Moore (who has roots in Jamaica) to learn the handful of Spanish-language expressions Miles uses during the opening sequence where he walks around his neighborhood. “[Luna] has been getting on me! I need to go to Puerto Rico, and really learn Spanish for real,” Moore candidly told Remezcla on the red carpet.

Aside from Rio and Miles, the only other Spanish-speaking character is a villain named Scorpion. The insect-like bad guy who speaks only in Spanish is voiced by famed Mexican performer Joaquín Cosio. “He is an actor from Mexico City who was using slang that we had to look up because we didn’t understand it! I had never heard some of the words he used,” explained Lord.

[video: "Spider-Man: Into the Spider-Verse - "Gotta Go" Clip"
https://www.youtube.com/watch?v=9Q9foLtQidk ]

For Lord, having different Spanish accents represented is one of the parts of Into the Spider-Verse he’s the most proud of. He wanted to make sure Miles and Rio didn’t sound alike to indicate how language changes through different generations. Being himself the child of a Cuban immigrant, the parallels were very direct. “Miles is second-generation, so he speaks different than his mother.”

Velez, who like Miles was born in New York, identifies with what it’s like to communicate in both tongues. “Growing up, my parents spoke to us in Spanish and we responded in English. Now this happens with my nieces and nephews,” she said. “You want to make sure kids remember their culture and where they come from.” In playing Rio, she thought of her mother, who instilled in her not only the language but also an appreciation for her Latinidad.

Clearly, casting Velez was essential to upholding the diversity and authenticity embedded into Miles Morales’ heroic adventure since not doing so would have been a disservice to an iteration of an iconic figure that is so meaningful for many. “If Spider-Man’s Puerto Rican mom had been played by somebody who isn’t Latino I’d have a problem with that,” Velez stated emphatically."
language  translation  spanish  español  bilingualism  bilingual  srg  edg  glvo  carlosaguilar  2018  spider-verse  spiderman  miami  losangeles  nyc  coco  subtitles  specificity  puertorico  cuba  immigration  via:tealtan  accents  change  adaptation  latinidad 
february 2019 by robertogreco
Language Log: Autour-du-mondegreens: bunkum unbound
"One lesson to learn from these subtitling efforts is how easy it is to find non-systematic phonetic similarities across languages, of the sort that Daniel Cassidy has used to see the relationship between bunkum and Buanchumadh and more broadly to argue t
language  translation  humor  internet  online  trends  video  foreign  subtitles 
november 2007 by robertogreco
veotag :: home
"Place clickable tags and comments anywhere within a video or audio file; Divide video and audio files into chapters and segments; Let your audience see what's coming up -- and jump to the parts that interest them most"
subtitles  annotation  video  tools  online  presentations  remix  editing  digital  socialsoftware  comments  tags  taxonomy  folksonomy  text  social 
december 2006 by robertogreco
Mojiti
"Mojiti lets you annotate any moment in any online video. Dive into the experience and tell everyone what you really think."
subtitles  annotation  video  tools  online  presentations  remix  editing  digital  socialsoftware  social 
december 2006 by robertogreco
dotSUB.com
"dotSUB is a resource and gathering place for subtitling films from one language into many languages using our unique subtitling tools. These tools expand the power and reach of films by making it possible for people to view and enjoy films in their nativ
subtitles  film  tools  usability  video  translation  language 
june 2006 by robertogreco
