Cued Speech for the Deaf

Jun 12 2013 David Titmus

Most profoundly deaf individuals benefit in some way from lip-reading. Though the skill is sometimes viewed as an ability that one either possesses or does not, nearly every Deaf, hard-of-hearing, and hearing individual can tell the difference between understanding a boss speaking to them face-to-face and talking with someone in the dark. Like all human perception, speech recognition draws on a range of senses, not hearing alone.

Many advocates of Deaf rights incorrectly state that only 30% of English words can be discerned through lip-reading (the study in question states that 30% of English phonemes cannot be distinguished, and does not address speech in context), but the skill of lip-reading is just that: a capability learned through practice. Lip-reading, also called “speech-reading,” is an important link to the hearing world for Deaf and hard-of-hearing individuals, but also one that can be difficult and frustrating. Many consonants look alike on the lips, such as p and b, s and z, and f and v, and while context provides clues to the speaker’s meaning, it can only go so far.

To improve understanding for individuals who rely on lip-reading, Dr. R. Orin Cornett of Gallaudet University, the prominent American Deaf college, invented cued speech in 1966. Cued speech is a system of eight handshapes, used in four positions around the face, that distinguish phonemes that look alike to a lip-reader. The system is meant to supplement a speaker’s words with critical information in order to improve comprehension in the listener.
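
To make that structure concrete, here is a minimal, purely illustrative sketch in Python. The handshape assignments below are hypothetical placeholders, not the official Cued Speech chart; the point is only that phonemes which look identical on the lips receive different hand cues, while phonemes from different mouth-shape groups can safely share one.

# Illustrative sketch: hand cues disambiguate phonemes that look the same
# on the lips. Groupings and cue assignments are hypothetical examples.

# Phonemes that share a mouth shape and are visually ambiguous to a lip-reader
LOOKS_ALIKE = {
    "bilabial": ["p", "b", "m"],        # lips pressed together
    "labiodental": ["f", "v"],          # lower lip against upper teeth
    "alveolar_fricative": ["s", "z"],   # narrow gap behind the teeth
}

# Hypothetical cue assignments: within each ambiguous group, every phoneme
# gets a different handshape, so mouth shape plus handshape is unique.
CUE_FOR = {
    "p": "handshape_1", "b": "handshape_4", "m": "handshape_5",
    "f": "handshape_5", "v": "handshape_2",
    "s": "handshape_3", "z": "handshape_2",
}

def is_disambiguated(group):
    # A look-alike group is resolved when no two of its phonemes share a cue.
    cues = [CUE_FOR[p] for p in LOOKS_ALIKE[group]]
    return len(cues) == len(set(cues))

for group in LOOKS_ALIKE:
    print(group, "resolved by cues:", is_disambiguated(group))

Because a cue reader sees both the mouth shape and the handshape at once, the combination identifies the phoneme even when the mouth shape alone is ambiguous, which is why a small set of handshapes and positions is enough.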

Though it may look like sign language to some, it is not: cued speech has a much smaller vocabulary of signs and relies more on mouth shapes than hand gestures. It is especially effective among Deaf or hard-of-hearing individuals who go on to receive cochlear implants, as it gives them a better understanding of the different phonemes before they are able to hear them effectively. Though the cued speech method originated as an educational tool for deaf, hard-of-hearing, and Autistic students (the NCSA also recommends it as a learning tool for early readers who can hear), it has evolved into a communication method of its own and is a primary mode of communication among some cuers.

Read more about cued speech on the National Cued Speech Association website, or see a video of what cued speech looks like.

by Carlin Twedt