Browsing by Author "Lee, Jungah"
Now showing 1–2 of 2
Item (Open Access): Auditory signals evolve from hybrid- to eye-centered coordinates in the primate superior colliculus. (Journal of Neurophysiology, 2012-07) Lee, Jungah; Groh, Jennifer M.

Visual and auditory spatial signals initially arise in different reference frames. It has been postulated that auditory signals are translated from a head-centered to an eye-centered frame of reference compatible with the visual spatial maps, but, to date, only various forms of hybrid reference frames for sound have been identified. Here, we show that the auditory representation of space in the superior colliculus involves a hybrid reference frame immediately after sound onset but evolves to become predominantly eye centered, and more similar to the visual representation, by the time of a saccade to that sound. Specifically, during the first 500 ms after sound onset, auditory response patterns (N = 103) were usually neither head nor eye centered: 64% of neurons showed such a hybrid pattern, whereas 29% were more eye centered and 8% were more head centered. This differed from the pattern observed for visual targets (N = 156): 86% were eye centered, <1% were head centered, and only 13% exhibited a hybrid of both reference frames. For auditory-evoked activity observed within 20 ms of the saccade (N = 154), the proportion of eye-centered response patterns increased to 69%, whereas the hybrid and head-centered response patterns dropped to 30% and <1%, respectively. This pattern approached, although did not quite reach, that observed for saccade-related activity for visual targets: 89% were eye centered, 11% were hybrid, and <1% were head centered (N = 162). The plainly eye-centered visual response patterns and predominantly eye-centered auditory motor response patterns lie in marked contrast to our previous study of the intraparietal cortex, where both visual and auditory sensory and motor-related activity used a predominantly hybrid reference frame (Mullette-Gillman et al. 2005, 2009). Our present findings indicate that auditory signals are ultimately translated into a reference frame roughly similar to that used for vision, but suggest that such signals might emerge only in motor areas responsible for directing gaze to visual and auditory stimuli.

Item (Open Access): Single neurons may encode simultaneous stimuli by switching between activity patterns. (Nature Communications, 2018-07-13) Caruso, Valeria C; Mohl, Jeff T; Glynn, Christopher; Lee, Jungah; Willett, Shawn M; Zaman, Azeem; Ebihara, Akinori F; Estrada, Rolando; Freiwald, Winrich A; Tokdar, Surya T; Groh, Jennifer M.

How the brain preserves information about multiple simultaneous items is poorly understood. We report that single neurons can represent multiple stimuli by interleaving signals across time. We record single units in an auditory region, the inferior colliculus, while monkeys localize one or two simultaneous sounds. During dual-sound trials, we find that some neurons fluctuate between the firing rates observed for each single sound, either on a whole-trial or on a sub-trial timescale. These fluctuations are correlated in pairs of neurons, can be predicted by the state of local field potentials prior to sound onset, and, in one monkey, can predict which sound will be reported first. We find corroborating evidence of fluctuating activity patterns in a separate dataset involving responses of inferotemporal cortex neurons to multiple visual stimuli. Alternation between activity patterns corresponding to each of multiple items may therefore be a general strategy to enhance the brain's processing capacity, potentially linking such disparate phenomena as variable neural firing, neural oscillations, and limits in attentional/memory capacity.
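
The first item above turns on the relation between head- and eye-centered coordinates: for horizontal azimuths, a target's eye-centered location is its head-centered location minus the current eye-in-head position, and a "hybrid" frame shifts with the eyes by only a fraction of that displacement. Below is a minimal Python sketch of this relation; the function name, the gain parameter k, and all numeric values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def eye_centered(target_head_deg, eye_pos_deg, k=1.0):
    """Toy coordinate transform for horizontal azimuths.

    k = 1.0  -> fully eye-centered (shifts one-for-one with the eyes)
    k = 0.0  -> fully head-centered (ignores eye position)
    0 < k < 1 -> a 'hybrid' frame that shifts only partially
    """
    return target_head_deg - k * eye_pos_deg

# A sound fixed 10 deg right of the head, viewed from three fixations:
for eye_pos in np.array([-12.0, 0.0, 12.0]):
    print(f"eye at {eye_pos:+.0f} deg: "
          f"eye-centered {eye_centered(10.0, eye_pos):+.0f}, "
          f"hybrid(k=0.5) {eye_centered(10.0, eye_pos, 0.5):+.0f}")
```

In this toy picture, the paper's finding corresponds to k starting near intermediate values at sound onset and approaching 1 by the time of the saccade.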
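The second item's central claim is that on dual-sound trials a neuron may commit to one single-sound firing rate rather than responding at the average. A hedged whole-trial simulation of that idea follows; the Poisson rates, window length, trial count, and 50/50 switching probability are assumptions chosen for illustration, not the paper's fitted values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed toy values (not the paper's data): Poisson rates for each
# sound presented alone, a 500 ms counting window, and 200 dual trials.
RATE_A, RATE_B = 40.0, 10.0      # spikes/s for sound A alone, B alone
WINDOW_S, N_TRIALS = 0.5, 200

# Whole-trial switching: each dual-sound trial commits to ONE of the
# two single-sound rates instead of responding at their average.
pick_a = rng.random(N_TRIALS) < 0.5
rates = np.where(pick_a, RATE_A, RATE_B)
counts = rng.poisson(rates * WINDOW_S)

# A switching neuron yields a bimodal spike-count distribution; an
# averaging neuron would cluster near (RATE_A + RATE_B)/2 * WINDOW_S = 12.5.
print("trials resembling A alone (count > 12):", int(np.sum(counts > 12)))
print("trials resembling B alone (count <= 12):", int(np.sum(counts <= 12)))
print("overall mean count:", counts.mean())
```

The sketch only shows why whole-trial switching produces a bimodal spike-count distribution; distinguishing switching from averaging or sub-trial interleaving in real data is what the paper's statistical models address.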