Jeffery A. Jones
Neural processes underlying perceptual enhancement by visual speech gestures.
NeuroReport, 14, 2213-2218.
Callan, D. E., Jones, J. A., Munhall, K. G., Kroos, C., Callan, A. M. & Vatikiotis-Bateson, E.
published: 2003 | Research publication | Jones Lab
This fMRI study explores brain regions involved with the perceptual enhancement
afforded by observation of visual speech gesture information.
Subjects passively identified words presented in the
following conditions: audio-only, audiovisual, audio-only with
noise, audiovisual with noise, and visual-only. The brain may use
concordant audio and visual information to enhance perception by
integrating the information in a converging multisensory site. Consistent with
the response properties of multisensory integration sites,
enhanced activity in the middle and superior temporal gyrus/sulcus
was greatest when concordant audiovisual stimuli were presented
with acoustic noise. Activity found in brain regions involved with
the planning and execution of speech production, in response to visual
speech presented with degraded or absent auditory stimulation, is
consistent with the use of an additional pathway through which
speech perception is facilitated by a process of internally simulating
the intended speech act of the observed speaker.
Download: PDF (242k) Callan_et_alNeuroReport03.pdf
revised Feb 12/08