Looking at the ventriloquist: visual outcome of eye movements calibrates sound localization.


Date

2013-01

Abstract

A general problem in learning is how the brain determines what lesson to learn (and what lessons not to learn). For example, sound localization is a behavior that is partially learned with the aid of vision. This process requires correctly matching a visual location to that of a sound. This is an intrinsically circular problem when sound location is itself uncertain and the visual scene is rife with possible visual matches. Here, we develop a simple paradigm using visual guidance of sound localization to gain insight into how the brain confronts this type of circularity. We tested two competing hypotheses. (1) The brain guides sound-location learning based on the synchrony or simultaneity of auditory-visual stimuli, potentially involving a Hebbian associative mechanism. (2) The brain uses a 'guess and check' heuristic in which visual feedback that is obtained after an eye movement to a sound alters future performance, perhaps by recruiting the brain's reward-related circuitry. We assessed the effects of exposure to visual stimuli spatially mismatched from sounds on performance of an interleaved auditory-only saccade task. We found that when humans and monkeys were provided the visual stimulus asynchronously with the sound, but as feedback to an auditory-guided saccade, they shifted their subsequent auditory-only performance toward the direction of the visual cue by 1.3-1.7 degrees, or 22-28% of the original 6-degree visual-auditory mismatch. In contrast, when the visual stimulus was presented synchronously with the sound but extinguished too quickly to provide this feedback, there was little change in subsequent auditory-only performance. Our results suggest that the outcome of our own actions is vital to localizing sounds correctly. Contrary to previous expectations, visual calibration of auditory space does not appear to require visual-auditory associations based on synchrony/simultaneity.
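As a rough illustration of the numbers quoted above (not part of the published analysis), the reported shift can be expressed as a fraction of the imposed visual-auditory mismatch. The variable names below are assumptions used only to reproduce the 22-28% figure; a minimal sketch:

# Illustrative arithmetic only: expresses the reported 1.3-1.7 degree shift in
# auditory-only saccades as a percentage of the 6-degree visual-auditory mismatch.
mismatch_deg = 6.0           # imposed visual-auditory offset (degrees)
shifts_deg = (1.3, 1.7)      # reported range of auditory saccade shifts (degrees)

for shift in shifts_deg:
    pct = 100.0 * shift / mismatch_deg
    print(f"{shift:.1f} deg shift -> {pct:.0f}% of the mismatch")
# Prints roughly 22% and 28%, matching the range quoted in the abstract.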


Published Version (Please cite this version)

10.1371/journal.pone.0072562

Publication Info

Pages, Daniel S, and Jennifer M Groh (2013). Looking at the ventriloquist: visual outcome of eye movements calibrates sound localization. PLoS ONE, 8(8), e72562. doi:10.1371/journal.pone.0072562. Retrieved from https://hdl.handle.net/10161/17890.

This is constructed from limited available data and may be imprecise. To cite this article, please review & use the official citation provided by the journal.

Scholars@Duke

Jennifer M. Groh

Professor of Psychology and Neuroscience

Research in my laboratory concerns how sensory and motor systems work together, and how neural representations play a combined role in sensorimotor and cognitive processing (embodied cognition).

Most of our work concerns the interactions between vision and hearing. We frequently perceive visual and auditory stimuli as being bound together if they seem likely to have arisen from a common source. That's why we tend not to notice that the speakers on TV sets or in movie theatres are located beside, and not behind, the screen. Research in my laboratory is devoted to investigating the question of how the brain coordinates the information arising from the ears and eyes. Our findings challenge the historical view of the brain's sensory processing as being automatic, autonomous, and immune from outside influence. We have recently established that neurons in the auditory pathway (inferior colliculus, auditory cortex) alter their responses to sound depending on where the eyes are pointing. This finding suggests that the different sensory pathways meddle in one another's supposedly private affairs, making their respective influences felt even at very early stages of processing. The process of bringing the signals from two different sensory pathways into a common frame of reference begins at a surprisingly early point along the primary sensory pathways.


Unless otherwise indicated, scholarly articles published by Duke faculty members are made available here with a CC-BY-NC (Creative Commons Attribution Non-Commercial) license, as enabled by the Duke Open Access Policy. If you wish to use the materials in ways not already permitted under CC-BY-NC, please consult the copyright owner. Other materials are made available here through the author’s grant of a non-exclusive license to make their work openly accessible.