Why do television shows have ASL interpreters, and not CC?

I’m watching the Apple livestream (yeah, bored). There is an optional large section, about a third of the screen, with someone interpreting into ASL. Most government livestreams do the same, as do many others.

It would *seem* that anyone who can follow onscreen ASL could also read closed captions. I understand that ASL is its own language, not just signed English, but it’s easy to add fully automated multi-language CC.

So what is the purpose of a separate ASL stream, as opposed to just having multi-lingual CC?

4 Answers

Anonymous 0 Comments

CC is great for prerecorded content, but ASL interpreters are way more accurate than auto-generated live closed captions. It’s better to have an interpreter onscreen than to assume the speech-recognition software is conveying the correct message.

Anonymous 0 Comments

ASL interpreters are more inclusive for people who rely on sign language as their primary means of communication. Closed captioning doesn’t capture the full essence of spoken language for those who are culturally Deaf or hard of hearing.

So, first off, ASL (American Sign Language) is more than just a set of hand gestures representing English words; it’s a full-fledged language with its own syntax, idioms, and nuances. For someone who’s been using ASL their whole life, reading English closed captions can be like reading in a second language. For real, imagine you had to constantly translate stuff in your head while trying to keep up with a live event. That’s a mental workout, and not everyone’s down for that.

Plus, you gotta realize that closed captioning isn’t always reliable. It’s often delayed, or the text can get jumbled up. Ever watch something with CC on and go, “Wait, what?” because the captions just butchered what was actually said? ASL interpreters are trained to convey not just the words, but the tone and context as well.

Government livestreams, Apple events, and the like are all trying to be more accessible. It’s not just about meeting legal requirements but actually making sure the info is comprehensible to as many folks as possible. Deaf culture puts a big emphasis on visual and physical interaction, and ASL interpreters fit that bill.

So yeah, multilingual CC could work for some people, but it’s not a one-size-fits-all solution. Adding an ASL interpreter makes the content more accessible and inclusive for a community that might find captions insufficient.

Anonymous 0 Comments

This is similar to saying that Spanish speakers should just read English closed captions. They’re literally two different languages.

But even further:

In the USA, there is a powerful law called the Americans with Disabilities Act, which requires governments and many other organizations to provide effective communication for people with disabilities. Under that standard, Deaf people are entitled to receive communications in their own language.

Anonymous 0 Comments

Others have mentioned that ASL communicates more effectively than bare written language to folks who sign. To elaborate on that: watch an ASL interpreter and notice that they’ll often exaggerate their mouthing of spoken words to make it easier for folks who read lips to follow along. They also sometimes emphasize emotive words or forceful speech through facial expressions and body language. We pick up a lot of information from nonverbal cues, and ASL interpreters try to incorporate as much of that information as possible into their interpretation.

Also, note that they’re called “interpreters”. Closed captioning is an incomplete transcription of spoken language; it doesn’t include information about the emotional content of a speech or performance, or other contextual cues. ASL interpreters can translate from idiomatic English to idiomatic ASL via a skilled performance of the speech. It’s not a skill that every hearing person who knows ASL can master; there’s a lot of art to interpreting.