Duke University Libraries
DukeSpace Scholarship by Duke Authors

Auditory signals evolve from hybrid- to eye-centered coordinates in the primate superior colliculus.

Date
2012-07
Authors
Lee, Jungah
Groh, Jennifer M
Repository Usage Stats
114 views, 20 downloads
Abstract
Visual and auditory spatial signals initially arise in different reference frames. It has been postulated that auditory signals are translated from a head-centered to an eye-centered frame of reference compatible with the visual spatial maps, but, to date, only various forms of hybrid reference frames for sound have been identified. Here, we show that the auditory representation of space in the superior colliculus involves a hybrid reference frame immediately after the sound onset but evolves to become predominantly eye centered, and more similar to the visual representation, by the time of a saccade to that sound. Specifically, during the first 500 ms after the sound onset, auditory response patterns (N = 103) were usually neither head nor eye centered: 64% of neurons showed such a hybrid pattern, whereas 29% were more eye centered and 8% were more head centered. This differed from the pattern observed for visual targets (N = 156): 86% were eye centered, <1% were head centered, and only 13% exhibited a hybrid of both reference frames. For auditory-evoked activity observed within 20 ms of the saccade (N = 154), the proportion of eye-centered response patterns increased to 69%, whereas the hybrid and head-centered response patterns dropped to 30% and <1%, respectively. This pattern approached, although did not quite reach, that observed for saccade-related activity for visual targets: 89% were eye centered, 11% were hybrid, and <1% were head centered (N = 162). The plainly eye-centered visual response patterns and predominantly eye-centered auditory motor response patterns lie in marked contrast to our previous study of the intraparietal cortex, where both visual and auditory sensory and motor-related activity used a predominantly hybrid reference frame (Mullette-Gillman et al. 2005, 2009). Our present findings indicate that auditory signals are ultimately translated into a reference frame roughly similar to that used for vision, but suggest that such signals might emerge only in motor areas responsible for directing gaze to visual and auditory stimuli.
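To make the coordinate-frame distinction concrete: a sound's head-centered location is fixed by the speaker position, while its eye-centered location shifts with every change in fixation (eye-centered azimuth = head-centered azimuth minus eye position). The Python sketch below is not the authors' analysis code; the function names, Gaussian tuning parameters, classification margin, and the correlation-based rule are illustrative assumptions. It simulates a purely eye-centered neuron and shows how comparing tuning quality in the two coordinate frames can label a response pattern as eye centered, head centered, or hybrid, in the spirit of the reference-frame comparison described in the abstract.

# Minimal sketch (illustrative assumptions, not the authors' method).
import numpy as np

def eye_centered_azimuth(head_centered_az_deg, eye_position_deg):
    """Convert a head-centered target azimuth (deg) to eye-centered coordinates."""
    return head_centered_az_deg - eye_position_deg

def classify_reference_frame(responses, target_az, eye_pos, margin=0.2):
    """Toy classifier: compare how well a Gaussian tuning curve describes the
    responses when targets are expressed in head- vs. eye-centered coordinates.

    responses : firing rates, one per trial
    target_az : head-centered target azimuths (deg), one per trial
    eye_pos   : fixation positions (deg), one per trial
    """
    def tuning_fit(x):
        # Correlation between responses and a Gaussian tuning curve centered on
        # the response-weighted mean location (a crude stand-in for a
        # quantitative reference-frame index).
        center = np.average(x, weights=responses)
        curve = np.exp(-0.5 * ((x - center) / 15.0) ** 2)
        return np.corrcoef(responses, curve)[0, 1]

    r_head = tuning_fit(target_az)
    r_eye = tuning_fit(eye_centered_azimuth(target_az, eye_pos))
    if r_eye - r_head > margin:
        return "eye-centered"
    if r_head - r_eye > margin:
        return "head-centered"
    return "hybrid"

# Simulated example: a purely eye-centered neuron tuned to ~10 deg in
# eye-centered space, tested at three fixation positions and five speakers.
rng = np.random.default_rng(0)
eye_pos = rng.choice([-12.0, 0.0, 12.0], size=300)                 # fixation (deg)
target_az = rng.choice([-24.0, -12.0, 0.0, 12.0, 24.0], size=300)  # speaker (deg)
ec_az = eye_centered_azimuth(target_az, eye_pos)
responses = 40.0 * np.exp(-0.5 * ((ec_az - 10.0) / 15.0) ** 2) + rng.normal(0.0, 2.0, 300)

print(classify_reference_frame(responses, target_az, eye_pos))  # expected: "eye-centered"

For a head-centered cell (responses driven by target_az alone), the same toy rule would favor the head-centered frame, and a cell whose responses blend both variables would fall into the "hybrid" category, mirroring the three classes reported in the abstract.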
Type
Journal article
Subject
Neurons
Animals
Macaca mulatta
Acoustic Stimulation
Analysis of Variance
Photic Stimulation
Sound Localization
Space Perception
Psychomotor Performance
Action Potentials
Head Movements
Saccades
Time Factors
Female
Male
Statistics as Topic
Superior Colliculi
Permalink
https://hdl.handle.net/10161/17893
Published Version (Please cite this version)
10.1152/jn.00706.2011
Publication Info
Lee, Jungah, & Groh, Jennifer M. (2012). Auditory signals evolve from hybrid- to eye-centered coordinates in the primate superior colliculus. Journal of Neurophysiology, 108(1), 227-242. doi:10.1152/jn.00706.2011. Retrieved from https://hdl.handle.net/10161/17893.
This citation is constructed from limited available data and may be imprecise. To cite this article, please review and use the official citation provided by the journal.
Collections
  • Scholarly Articles

Scholars@Duke

Jennifer M. Groh

Professor of Psychology and Neuroscience
Research in my laboratory concerns how sensory and motor systems work together, and how neural representations play a combined role in sensorimotor and cognitive processing (embodied cognition). Most of our work concerns the interactions between vision and hearing. We frequently perceive visual and auditory stimuli as being bound together if they seem likely to have arisen from a common source. That's why we tend not to notice that the speakers on TV sets or in movie theatres are located beside the screen rather than on it.
Open Access

Articles written by Duke faculty are made available through the campus open access policy. For more information see: Duke Open Access Policy

Rights for Collection: Scholarly Articles


Works are deposited here by their authors and represent their research and opinions, not those of Duke University. Some materials and descriptions may include offensive content.
