Single neurons may encode simultaneous stimuli by switching between activity patterns.


How the brain preserves information about multiple simultaneous items is poorly understood. We report that single neurons can represent multiple stimuli by interleaving signals across time. We record single units in an auditory region, the inferior colliculus, while monkeys localize 1 or 2 simultaneous sounds. During dual-sound trials, we find that some neurons fluctuate between firing rates observed for each single sound, either on a whole-trial or on a sub-trial timescale. These fluctuations are correlated in pairs of neurons, can be predicted by the state of local field potentials prior to sound onset, and, in one monkey, can predict which sound will be reported first. We find corroborating evidence of fluctuating activity patterns in a separate dataset involving responses of inferotemporal cortex neurons to multiple visual stimuli. Alternation between activity patterns corresponding to each of multiple items may therefore be a general strategy to enhance the brain's processing capacity, potentially linking such disparate phenomena as variable neural firing, neural oscillations, and limits in attentional/memory capacity.







Publication Info

Caruso, Valeria C., Jeff T. Mohl, Christopher Glynn, Jungah Lee, Shawn M. Willett, Azeem Zaman, Akinori F. Ebihara, Rolando Estrada, et al. (2018). Single neurons may encode simultaneous stimuli by switching between activity patterns. Nature Communications, 9(1), p. 2715. doi:10.1038/s41467-018-05121-8

This citation is constructed from limited available data and may be imprecise. To cite this article, please review and use the official citation provided by the journal.



Surya Tapas Tokdar

Professor of Statistical Science

Jennifer M. Groh

Professor of Psychology and Neuroscience

Research in my laboratory concerns how sensory and motor systems work together, and how neural representations play a combined role in sensorimotor and cognitive processing (embodied cognition).

Most of our work concerns the interactions between vision and hearing. We frequently perceive visual and auditory stimuli as being bound together if they seem likely to have arisen from a common source. That's why we tend not to notice that the speakers on TV sets or in movie theatres are located beside, and not behind, the screen. Research in my laboratory is devoted to investigating the question of how the brain coordinates the information arising from the ears and eyes. Our findings challenge the historical view of the brain's sensory processing as being automatic, autonomous, and immune from outside influence. We have recently established that neurons in the auditory pathway (inferior colliculus, auditory cortex) alter their responses to sound depending on where the eyes are pointing. This finding suggests that the different sensory pathways meddle in one another's supposedly private affairs, making their respective influences felt even at very early stages of processing. The process of bringing the signals from two different sensory pathways into a common frame of reference begins at a surprisingly early point along the primary sensory pathways.

Unless otherwise indicated, scholarly articles published by Duke faculty members are made available here with a CC-BY-NC (Creative Commons Attribution Non-Commercial) license, as enabled by the Duke Open Access Policy. If you wish to use the materials in ways not already permitted under CC-BY-NC, please consult the copyright owner. Other materials are made available here through the author’s grant of a non-exclusive license to make their work openly accessible.