EDITOR'S AWARD
Research Article  |   December 01, 2016
Experiments on Auditory-Visual Perception of Sentences by Users of Unilateral, Bimodal, and Bilateral Cochlear Implants
 
Author Affiliations & Notes
  • Michael F. Dorman
    Department of Speech and Hearing Science, Arizona State University, Tempe
  • Julie Liss
    Arizona State University, Tempe
  • Shuai Wang
    Arizona State University, Tempe
  • Visar Berisha
    Arizona State University, Tempe
  • Cimarron Ludwig
    Arizona State University, Tempe
  • Sarah Cook Natale
    Arizona State University, Tempe
  • Disclosure: The authors have declared that no competing interests existed at the time of publication.
  • Correspondence to Michael Dorman: mdorman@asu.edu
  • Editor: Nancy Tye-Murray
  • Associate Editor: Richard Dowell
Article Information
Journal of Speech, Language, and Hearing Research, December 2016, Vol. 59, 1505-1519. doi:10.1044/2016_JSLHR-H-15-0312
History: Received September 8, 2015; Revised February 12, 2016; Accepted April 4, 2016

Purpose Five experiments probed auditory-visual (AV) understanding of sentences by users of cochlear implants (CIs).

Method Sentence material was presented in auditory (A), visual (V), and AV test conditions to listeners with normal hearing and CI users.

Results (a) Most CI users report that most of the time, they have access to both A and V information when listening to speech. (b) CI users did not achieve better scores on a speechreading task than did listeners with normal hearing. (c) Sentences that were easy to speechread provided 12 percentage points more gain in speech understanding than did sentences that were difficult to speechread. (d) Ease of speechreading for sentences is related to phrase familiarity. (e) Users of bimodal CIs benefit from low-frequency acoustic hearing even when V cues are available, and a second CI adds to the benefit of a single CI when V cues are available. (f) V information facilitates lexical segmentation by improving recognition of the number of syllables produced and the relative strength of those syllables.

Conclusions Our data are consistent with the view that V information improves CI users' ability to identify syllables in the acoustic stream and to recognize the relative strengths of adjacent syllables. Enhanced syllable resolution allows better identification of word onsets, which, when combined with place-of-articulation information from visible consonants, improves lexical access.

Acknowledgments
This article reports the results from the PhD dissertation of Shuai Wang and the master's thesis work of Cimarron Ludwig. Experiment 3 was entirely the work of Shuai Wang and Visar Berisha. These projects were supported by National Institute on Deafness and Other Communication Disorders Grants R01 DC010821 and R01 DC010494, awarded to Michael F. Dorman, and R01 DC006859, awarded to Julie Liss.