Research Article  |   October 01, 1994
A Measure of the Contribution of a Gesture to the Perception of Speech in Listeners With Aphasia
 
Author Affiliations & Notes
  • Nancy L. Records
    The University of Iowa, Iowa City
  • Contact author: Nancy L. Records, PhD, Department of Communication Disorders, The Pennsylvania State University, 219 Moore Building, University Park, PA 16802.
  • Currently affiliated with Pennsylvania State University, University Park.
Article Information
Journal of Speech, Language, and Hearing Research, October 1994, Vol. 37, 1086-1099. doi:10.1044/jshr.3705.1086
History: Received June 1, 1993; Accepted March 1, 1994
 

The contribution of a visual source of contextual information to speech perception was measured in 12 listeners with aphasia. The three experimental conditions were: Visual-Only (referential gesture), Auditory-Only (computer-edited speech), and Audio-Visual. In a two-alternative forced-choice task, subjects indicated which picture had been requested. The stimuli were first validated with listeners without brain damage. The listeners with aphasia were subgrouped as having high or low language comprehension based on standardized test scores. Results showed a significantly larger contribution of gestural information to the responses of the lower-comprehension subgroup. The contribution of gesture was significantly correlated with the amount of ambiguity experienced with the auditory-only information. These results show that as the auditory information becomes more ambiguous, individuals with impaired language comprehension make greater use of the visual information. The results support clinical observations that speech information received without visual context is perceived differently than when received with visual context.

Acknowledgments
This research was completed in partial fulfillment of the PhD program requirements. The author gratefully acknowledges the guidance of Drs. Bruce Tomblin and Linda Jordan during this project. Ning Li provided valuable assistance with the statistical analyses. Appreciation is expressed to the individuals who participated in this research and to the referral sources (Mercy Hospital of both Davenport and Des Moines; Mercy Hospital Stroke Club, Iowa City; University of Iowa Hospitals and Clinics; Yonkers Rehabilitation Center, Des Moines; Methodist Medical Center of Illinois, Peoria, IL). This research was supported by grants from Sigma Xi, the University of Iowa’s Video Production Center, and the University of Iowa’s Department of Speech Pathology and Audiology. Portions of this paper were presented at the ASHA Convention in Atlanta, November 1991.