Why do television shows have ASL interpreters, and not CC?


I’m watching the Apple livestream (yeah, bored). There is an optional large section, about a third of the screen, with someone translating into ASL. Most government livestreams do the same, and many others.

It would *seem* that anyone capable of reading onscreen ASL would also be capable of reading closed captions. I understand that ASL is its own language, so it's not just English, but it's easy to add multi-language CC, fully automated.

So what is the purpose of a separate ASL stream, as opposed to just offering multilingual CC?


4 Answers

Anonymous

Others have mentioned that ASL communicates more effectively than bare written language to folks who sign. To elaborate on that: watch an ASL interpreter and notice that they'll often exaggerate their mouthing of the spoken words to make it easier for folks who read lips to follow along. They also sometimes exaggerate emotive words or forceful speech through facial expressions and body language; we pick up a lot of information from nonverbal cues, and ASL interpreters try to incorporate as much of that information as possible in their interpretation.

Also, note that they're called "interpreters." Closed captioning is an incomplete transcription of a spoken language; it doesn't include information about the emotional content of a speech or performance, or other contextual cues. ASL interpreters can translate from idiomatic English to idiomatic ASL through a skilled performance of the speech. It's not a skill that every hearing person who knows ASL can master; there's a lot of art to interpreting.
