Different stimuli, different spatial codes: a visual map and an auditory rate code for oculomotor space in the primate superior colliculus.
Abstract
Maps are a mainstay of visual, somatosensory, and motor coding in many species. However,
auditory maps of space have not been reported in the primate brain. Instead, recent
studies have suggested that sound location may be encoded via broadly responsive neurons
whose firing rates vary roughly proportionately with sound azimuth. Within frontal
space, maps and such rate codes involve different response patterns at the level of
individual neurons. Maps consist of neurons exhibiting circumscribed receptive fields,
whereas rate codes involve open-ended response patterns that peak in the periphery.
This coding format discrepancy therefore poses a potential problem for brain regions
responsible for representing both visual and auditory information. Here, we investigated
the coding of auditory space in the primate superior colliculus (SC), a structure known
to contain visual and oculomotor maps for guiding saccades. We report that, for visual
stimuli, neurons showed circumscribed receptive fields consistent with a map, but
for auditory stimuli, they had open-ended response patterns consistent with a rate
or level-of-activity code for location. The discrepant response patterns were not
segregated into different neural populations but occurred in the same neurons. We
show that a read-out algorithm in which the site and level of SC activity both contribute
to the computation of stimulus location is successful at evaluating the discrepant
visual and auditory codes, and can account for subtle but systematic differences in
the accuracy of auditory compared to visual saccades. This suggests that a given population
of neurons can use different codes to support appropriate multimodal behavior.
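The read-out described above — in which both the site and the level of SC activity contribute to the location estimate — can be illustrated with an activity-weighted average across a population. The code below is a minimal sketch under assumed values, not the authors' actual algorithm: the neuron count, preferred locations, and firing rates are all hypothetical, chosen only to contrast a circumscribed (map-like) response profile with an open-ended (rate-like) one.

```python
import numpy as np

# Hypothetical SC population: each neuron has a preferred location
# (its site on the map), in degrees of azimuth.
preferred_locations = np.array([-20.0, -10.0, 0.0, 10.0, 20.0])

def decode_location(rates, preferred_locations):
    """Activity-weighted average: each neuron's site contributes
    in proportion to its level of activity."""
    rates = np.asarray(rates, dtype=float)
    total = rates.sum()
    if total == 0:
        return None  # no activity, no estimate
    return float(np.dot(rates, preferred_locations) / total)

# Map-like (circumscribed) visual response: activity peaks at one site.
visual_rates = [2, 10, 50, 12, 3]
# Rate-like (open-ended) auditory response: activity grows with azimuth.
auditory_rates = [1, 5, 15, 30, 60]

print(decode_location(visual_rates, preferred_locations))    # ≈ 0.52 deg
print(decode_location(auditory_rates, preferred_locations))  # ≈ 12.88 deg
```

The same decoder handles both profiles: a circumscribed peak yields an estimate near the peak's site, while a monotonic rate profile pulls the estimate toward the periphery in proportion to overall activity, consistent with the idea that one population can support both coding formats.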
Type
Journal article
Permalink
https://hdl.handle.net/10161/8313
Published Version (Please cite this version)
10.1371/journal.pone.0085017
Publication Info
Groh, J. M., & Lee, J. (2014). Different stimuli, different spatial codes: a visual map and an auditory rate code for oculomotor space in the primate superior colliculus. PLoS One, 9(1), e85017. doi:10.1371/journal.pone.0085017. Retrieved from https://hdl.handle.net/10161/8313.
This citation is constructed from limited available data and may be imprecise. To cite this article, please review and use the official citation provided by the journal.
Scholars@Duke
Jennifer M. Groh
Professor of Psychology and Neuroscience
Research in my laboratory concerns how sensory and motor systems work together, and how neural representations play a combined role in sensorimotor and cognitive processing (embodied cognition).
Most of our work concerns the interactions between vision and hearing. We frequently
perceive visual and auditory stimuli as being bound together if they seem likely to
have arisen from a common source. That's why we tend not to notice that the speakers
on TV sets or in movie theatres are located bes

Articles written by Duke faculty are made available through the campus open access policy. For more information see: Duke Open Access Policy
Rights for Collection: Scholarly Articles
Works are deposited here by their authors, and represent their research and opinions, not that of Duke University. Some materials and descriptions may include offensive content. More info