Looking at the ventriloquist: visual outcome of eye movements calibrates sound localization.
Abstract
A general problem in learning is how the brain determines what lesson to learn (and
what lessons not to learn). For example, sound localization is a behavior that is
partially learned with the aid of vision. This process requires correctly matching
a visual location to that of a sound. This is an intrinsically circular problem when
sound location is itself uncertain and the visual scene is rife with possible visual
matches. Here, we develop a simple paradigm using visual guidance of sound localization
to gain insight into how the brain confronts this type of circularity. We tested two
competing hypotheses: (1) the brain guides sound-location learning based on the synchrony
or simultaneity of auditory-visual stimuli, potentially involving a Hebbian associative
mechanism; or (2) the brain uses a 'guess and check' heuristic in which visual feedback
obtained after an eye movement to a sound alters future performance, perhaps
by recruiting the brain's reward-related circuitry. We assessed the effects of exposure
to visual stimuli spatially mismatched from sounds on performance of an interleaved
auditory-only saccade task. We found that when humans and monkeys were provided the
visual stimulus asynchronously with the sound but as feedback to an auditory-guided
saccade, they shifted their subsequent auditory-only performance toward the direction
of the visual cue by 1.3-1.7 degrees, or 22-28% of the original 6-degree visual-auditory
mismatch. In contrast, when the visual stimulus was presented synchronously with the
sound but extinguished too quickly to provide this feedback, there was little change
in subsequent auditory-only performance. Our results suggest that the outcome of our
own actions is vital to localizing sounds correctly. Contrary to previous expectations,
visual calibration of auditory space does not appear to require visual-auditory associations
based on synchrony/simultaneity.
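The reported 22-28% adaptation figures follow directly from the observed shift magnitudes relative to the 6-degree visual-auditory mismatch. A minimal sketch of that arithmetic (variable names are illustrative, not from the paper's analysis code):

```python
# Sketch: derive the reported adaptation percentages from the shift
# magnitudes given in the abstract. Assumes simple rounding to the
# nearest whole percent.

MISMATCH_DEG = 6.0          # imposed visual-auditory offset
shifts_deg = [1.3, 1.7]     # observed shifts in auditory-only saccades

# Fraction of the imposed mismatch that auditory localization adapted by
percentages = [round(100 * s / MISMATCH_DEG) for s in shifts_deg]
print(percentages)  # -> [22, 28]
```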
Type
Journal article
Subject
Animals
Macaca mulatta
Humans
Eye Movements
Acoustic Stimulation
Sound Localization
Space Perception
Visual Perception
Psychomotor Performance
Reaction Time
Saccades
Adolescent
Adult
Middle Aged
Female
Male
Young Adult
Permalink
https://hdl.handle.net/10161/17890
Published Version (Please cite this version)
10.1371/journal.pone.0072562
Publication Info
Pages, Daniel S., & Groh, Jennifer M. (2013). Looking at the ventriloquist: visual outcome of eye movements calibrates sound localization. PLoS ONE, 8(8), e72562. doi:10.1371/journal.pone.0072562. Retrieved from https://hdl.handle.net/10161/17890.
This citation is constructed from limited available data and may be imprecise. To cite this article, please review and use the official citation provided by the journal.
Scholars@Duke
Jennifer M. Groh
Professor of Psychology and Neuroscience
Research in my laboratory concerns how sensory and motor systems work together, and
how neural representations play a combined role in sensorimotor and cognitive processing
(embodied cognition).
Most of our work concerns the interactions between vision and hearing. We frequently
perceive visual and auditory stimuli as being bound together if they seem likely to
have arisen from a common source. That's why we tend not to notice that the speakers
on TV sets or in movie theatres are located bes
