Browsing by Author "Groh, Jennifer M"
Item Open Access
Auditory signals evolve from hybrid- to eye-centered coordinates in the primate superior colliculus. (Journal of neurophysiology, 2012-07) Lee, Jungah; Groh, Jennifer M.
Visual and auditory spatial signals initially arise in different reference frames. It has been postulated that auditory signals are translated from a head-centered to an eye-centered frame of reference compatible with the visual spatial maps, but, to date, only various forms of hybrid reference frames for sound have been identified. Here, we show that the auditory representation of space in the superior colliculus involves a hybrid reference frame immediately after the sound onset but evolves to become predominantly eye centered, and more similar to the visual representation, by the time of a saccade to that sound. Specifically, during the first 500 ms after the sound onset, auditory response patterns (N = 103) were usually neither head nor eye centered: 64% of neurons showed such a hybrid pattern, whereas 29% were more eye centered and 8% were more head centered. This differed from the pattern observed for visual targets (N = 156): 86% were eye centered, <1% were head centered, and only 13% exhibited a hybrid of both reference frames. For auditory-evoked activity observed within 20 ms of the saccade (N = 154), the proportion of eye-centered response patterns increased to 69%, whereas the hybrid and head-centered response patterns dropped to 30% and <1%, respectively. This pattern approached, although did not quite reach, that observed for saccade-related activity for visual targets: 89% were eye centered, 11% were hybrid, and <1% were head centered (N = 162). The plainly eye-centered visual response patterns and predominantly eye-centered auditory motor response patterns lie in marked contrast to our previous study of the intraparietal cortex, where both visual and auditory sensory and motor-related activity used a predominantly hybrid reference frame (Mullette-Gillman et al. 2005, 2009). Our present findings indicate that auditory signals are ultimately translated into a reference frame roughly similar to that used for vision, but suggest that such signals might emerge only in motor areas responsible for directing gaze to visual and auditory stimuli.

Item Open Access
Comparison of gain-like properties of eye position signals in inferior colliculus versus auditory cortex of primates. (Frontiers in integrative neuroscience, 2010-01) Maier, Joost X; Groh, Jennifer M.
We evaluated to what extent the influence of eye position in the auditory pathway of primates can be described as a gain field. We compared single unit activity in the inferior colliculus (IC), core auditory cortex (A1) and the caudomedial belt (CM) region of auditory cortex (AC) in primates, and found stronger evidence for gain field-like interactions in the IC than in AC. In the IC, eye position signals showed both multiplicative and additive interactions with auditory responses, whereas in AC the effects were not as well predicted by a gain field model.
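To make the distinction in the preceding abstract concrete, a gain-field analysis asks whether eye position scales an auditory response (a multiplicative interaction) or merely shifts it (an additive offset). The sketch below fits both terms to simulated data; all rates, positions, and parameter values are invented for illustration and are not taken from the study itself.

```python
# Toy gain-field analysis on simulated data; names and values are illustrative only.
import numpy as np

rng = np.random.default_rng(0)

sound_loc = rng.uniform(-24, 24, 500)            # sound azimuth (deg), hypothetical
eye_pos = rng.choice([-12.0, 0.0, 12.0], 500)    # fixation position (deg), hypothetical

# Simulated neuron: Gaussian spatial tuning that is both scaled (multiplicative)
# and shifted (additive) by eye position.
tuning = 40 * np.exp(-0.5 * ((sound_loc - 10) / 15) ** 2)
rate = tuning * (1 + 0.02 * eye_pos) + 0.5 * eye_pos + rng.normal(0, 2, 500)

# Regression with an interaction term: rate ~ tuning + eye_pos + tuning*eye_pos.
# A reliable tuning*eye_pos coefficient indicates a gain-like (multiplicative)
# interaction; a reliable eye_pos coefficient alone indicates an additive shift.
X = np.column_stack([np.ones_like(rate), tuning, eye_pos, tuning * eye_pos])
beta, *_ = np.linalg.lstsq(X, rate, rcond=None)
print("intercept, tuning, additive eye term, multiplicative eye term:")
print(np.round(beta, 3))
```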
Item Open Access
Coordinated multiplexing of information about separate objects in visual cortex. (eLife, 2022-11) Jun, Na Young; Ruff, Douglas A; Kramer, Lily E; Bowes, Brittany; Tokdar, Surya T; Cohen, Marlene R; Groh, Jennifer M.
Sensory receptive fields are large enough that they can contain more than one perceptible stimulus. How, then, can the brain encode information about each of the stimuli that may be present at a given moment? We recently showed that when more than one stimulus is present, single neurons can fluctuate between coding one vs. the other(s) across some time period, suggesting a form of neural multiplexing of different stimuli (Caruso et al., 2018). Here, we investigate (a) whether such coding fluctuations occur in early visual cortical areas; (b) how coding fluctuations are coordinated across the neural population; and (c) how coordinated coding fluctuations depend on the parsing of stimuli into separate vs. fused objects. We found coding fluctuations do occur in macaque V1 but only when the two stimuli form separate objects. Such separate objects evoked a novel pattern of V1 spike count ('noise') correlations involving distinct distributions of positive and negative values. This bimodal correlation pattern was most pronounced among pairs of neurons showing the strongest evidence for coding fluctuations or multiplexing. Whether a given pair of neurons exhibited positive or negative correlations depended on whether the two neurons both responded better to the same object or had different object preferences. Distinct distributions of spike count correlations based on stimulus preferences were also seen in V4 for separate objects but not when two stimuli fused to form one object. These findings suggest multiple objects evoke different response dynamics than those evoked by single stimuli, lending support to the multiplexing hypothesis and suggesting a means by which information about multiple objects can be preserved despite the apparent coarseness of sensory coding.
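The link between shared coding fluctuations and the sign of spike count ('noise') correlations can be illustrated with a small simulation: two neurons that prefer the same object fluctuate together and show positive correlations, while a pair with opposite preferences fluctuates in counterphase and shows negative correlations. The rates and trial counts below are invented; this is not the paper's statistical model.

```python
# Simulated illustration of how trial-by-trial switching between two objects
# yields positive or negative spike count correlations in pairs of neurons.
import numpy as np

rng = np.random.default_rng(1)
n_trials = 200
state = rng.integers(0, 2, n_trials)   # object coded on each trial: 0 = A, 1 = B

def spike_counts(rate_when_A, rate_when_B, state):
    # Poisson counts whose mean depends on which object is coded that trial.
    return rng.poisson(np.where(state == 0, rate_when_A, rate_when_B))

n1 = spike_counts(30, 10, state)   # prefers object A
n2 = spike_counts(25, 8, state)    # also prefers object A
n3 = spike_counts(8, 25, state)    # prefers object B

print("same preference:     r =", round(np.corrcoef(n1, n2)[0, 1], 2))  # positive
print("opposite preference: r =", round(np.corrcoef(n1, n3)[0, 1], 2))  # negative
```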
Item Open Access
Distribution of eye position information in the monkey inferior colliculus. (Journal of neurophysiology, 2012-02) Bulkin, David A; Groh, Jennifer M.
The inferior colliculus (IC) is thought to have two main subdivisions, a central region that forms an important stop on the ascending auditory pathway and a surrounding shell region that may play a more modulatory role. In this study, we investigated whether eye position affects activity in both the central and shell regions. Accordingly, we mapped the location of eye position-sensitive neurons in six monkeys making spontaneous eye movements by sampling multiunit activity at regularly spaced intervals throughout the IC. We used a functional map based on auditory response patterns to estimate the anatomical location of recordings, in conjunction with structural MRI and histology. We found eye position-sensitive sites throughout the IC, including at 27% of sites in tonotopically organized recording penetrations (putatively the central nucleus). Recordings from surrounding tissue showed a larger proportion of sites indicating an influence of eye position (33-43%). When present, the magnitude of the change in activity due to eye position was often comparable to that seen for sound frequency. Our results indicate that the primary ascending auditory pathway is influenced by the position of the eyes. Because eye position is essential for visual-auditory integration, our findings suggest that computations underlying visual-auditory integration begin early in the ascending auditory pathway.

Item Open Access
Distribution of visual and saccade related information in the monkey inferior colliculus. (Frontiers in neural circuits, 2012-01) Bulkin, David A; Groh, Jennifer M.
The inferior colliculus (IC) is an essential stop early in the ascending auditory pathway. Though normally thought of as a predominantly auditory structure, recent work has uncovered a variety of non-auditory influences on firing rate in the IC. Here, we map the location within the IC of neurons that respond to the onset of a fixation-guiding visual stimulus. Visual/visuomotor associated activity was found throughout the IC (overall, 84 of 199 sites tested or 42%), but with a far reduced prevalence and strength along recording penetrations passing through the tonotopically organized region of the IC, putatively the central nucleus (11 of 42 sites tested, or 26%). These results suggest that visual information has only a weak effect on early auditory processing in core regions, but more strongly targets the modulatory shell regions of the IC.

Item Open Access
Effects of Electrical Stimulation in the Inferior Colliculus on Frequency Discrimination by Rhesus Monkeys and Implications for the Auditory Midbrain Implant. (The Journal of neuroscience : the official journal of the Society for Neuroscience, 2016-05) Pages, Daniel S; Ross, Deborah A; Puñal, Vanessa M; Agashe, Shruti; Dweck, Isaac; Mueller, Jerel; Grill, Warren M; Wilson, Blake S; Groh, Jennifer M.
Understanding the relationship between the auditory selectivity of neurons and their contribution to perception is critical to the design of effective auditory brain prosthetics. These prosthetics seek to mimic natural activity patterns to achieve desired perceptual outcomes. We measured the contribution of inferior colliculus (IC) sites to perception using combined recording and electrical stimulation. Monkeys performed a frequency-based discrimination task, reporting whether a probe sound was higher or lower in frequency than a reference sound. Stimulation pulses were paired with the probe sound on 50% of trials (0.5-80 μA, 100-300 Hz, n = 172 IC locations in 3 rhesus monkeys). Electrical stimulation tended to bias the animals' judgments in a fashion that was coarsely but significantly correlated with the best frequency of the stimulation site compared with the reference frequency used in the task. Although there was considerable variability in the effects of stimulation (including impairments in performance and shifts in performance away from the direction predicted based on the site's response properties), the results indicate that stimulation of the IC can evoke percepts correlated with the frequency-tuning properties of the IC. Consistent with the implications of recent human studies, the main avenue for improvement for the auditory midbrain implant suggested by our findings is to increase the number and spatial extent of electrodes, to increase the size of the region that can be electrically activated, and to provide a greater range of evoked percepts.

Patients with hearing loss stemming from causes that interrupt the auditory pathway after the cochlea need a brain prosthetic to restore hearing. Recently, prosthetic stimulation in the human inferior colliculus (IC) was evaluated in a clinical trial. Thus far, speech understanding was limited for the subjects and this limitation is thought to be partly due to challenges in harnessing the sound frequency representation in the IC. Here, we tested the effects of IC stimulation in monkeys trained to report the sound frequencies they heard. Our results indicate that the IC can be used to introduce a range of frequency percepts and suggest that placement of a greater number of electrode contacts may improve the effectiveness of such implants.
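One way to quantify the stimulation-induced bias described above is to fit psychometric functions to the "probe higher than reference" judgments with and without stimulation and compare their points of subjective equality (PSE). The sketch below does this on simulated choices; the task parameters, shift size, and fitting choices are invented and are not the paper's analysis.

```python
# Simulated psychometric comparison: does pairing stimulation with the probe
# shift the point of subjective equality (PSE) in a higher/lower judgment?
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(2)

def logistic(f, pse, slope):
    # Probability of reporting "probe higher than reference".
    return 1.0 / (1.0 + np.exp(-(f - pse) / slope))

levels = np.linspace(-1.0, 1.0, 9)        # probe minus reference (octaves)
probe = np.repeat(levels, 40)             # 40 trials per level per condition

resp_ctrl = rng.binomial(1, logistic(probe, 0.0, 0.2))     # no-stimulation trials
resp_stim = rng.binomial(1, logistic(probe, -0.15, 0.2))   # stimulation shifts the PSE

def fit_pse(responses):
    prop = np.array([responses[probe == f].mean() for f in levels])
    (pse, slope), _ = curve_fit(logistic, levels, prop, p0=[0.0, 0.2])
    return pse

print(f"PSE without stimulation: {fit_pse(resp_ctrl):+.3f} octaves")
print(f"PSE with stimulation:    {fit_pse(resp_stim):+.3f} octaves")
```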
Item Open Access
Effects of Initial Eye Position on Saccades Evoked by Microstimulation in the Primate Superior Colliculus: Implications for Models of the SC Read-Out Process. (Frontiers in integrative neuroscience, 2011-01-19) Groh, Jennifer M.
The motor layers of the superior colliculus (SC) are thought to specify saccade amplitude and direction, independent of initial eye position. However, recent evidence suggests that eye position can modulate the level of activity of SC motor neurons. In this study, we tested whether initial eye position has an effect on microstimulation-evoked saccade amplitude. High (>300 Hz) and low (<300 Hz) frequency microstimulation was applied to 30 sites in the rostral part of the SC of two monkeys while they fixated one of six different locations. We found that the amplitude of the evoked saccades decreased with more contralateral initial eye positions. This effect was more pronounced in low frequency- compared to high frequency-evoked saccades, although it was present for both. Replication of these findings in head-free experiments showed that the effect of initial eye position was not due to physical constraints imposed by the oculomotor range. In addition to the effect of eye position on saccade amplitude, we also observed an increase in saccade latency and a decrease in the probability that microstimulation would evoke a saccade for low frequency stimulation at more contralateral eye positions. These findings suggest that an eye position signal can contribute to the read-out of the SC. Models of the saccadic pulse-step generator may need revision to incorporate an eye position modulation at the input stage.

Item Open Access
Encoding of Concurrent Sounds in the Monkey Inferior Colliculus. (2020) Willett, Shawn M.
The inferior colliculus (IC) is an auditory midbrain nucleus essential to the perception of sound frequency and the localization of sound sources; yet it remains unclear how the firing rates of primate IC neurons contribute to the localization of concurrent sounds of variable sound frequencies. In this work, I extracellularly recorded the activity of 105 IC neurons while two adult macaque monkeys reported the location(s) of either a single bandpass filtered sound or two concurrent bandpass filtered sounds spatially separated by 24° and separated in sound frequency by 0.25-2 octaves. Monkeys performed this task well, with an accuracy of about 80% on single sound trials and about 90% on dual sound trials. The improvement in performance on dual sound trials was not explained by dual sound modulations of IC neural response functions. On dual sound trials, IC neuron receptive fields broadened, and sound frequency accounted for less variance in the dual sound response; these changes decreased the performance of a maximum-likelihood decoder in correctly labeling the condition of a held-out dual sound trial by about 20%. Overall, these results suggest that changes to the IC neural response functions elicited by the presence of a second, concurrent sound should impair rather than facilitate the IC encoding of concurrent sounds, and that an alternative explanation is required to account for monkey performance.
I next investigated whether recently discovered response alternations, suggested to underlie the encoding of concurrent sounds, were present in the recorded populations. These response alternations occur when an IC neuron alternates its firing rate between the rates corresponding to each component sound of a dual sound pair. Such alternations were observed in about 60% of IC neurons, and their contribution to the population response remained stable across the full 2-octave range of frequency separations tested. Thus, response alternations are a general mechanism used by the IC to potentially facilitate the encoding of multiple sounds, and these results add to a growing body of work observing response alternations across brain areas. The measurements I performed clearly indicate that neurons in the primate IC are sensitive not only to sound frequency and location but also to the number of sounds in the environment. Future empirical and theoretical work is needed to elucidate how exactly these response alternations arise and are read out by downstream neurons to allow for the perception of concurrent sounds.
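A simple way to picture the whole-trial version of these response alternations is to compare each dual-sound spike count against the rates expected for each sound alone and for their average. The sketch below does this with Poisson likelihoods on simulated counts; the rates and trial numbers are invented, and the actual analyses in this line of work use more formal model comparison (e.g., Caruso et al., 2018).

```python
# Simulated check for whole-trial alternation: are dual-sound spike counts
# better explained by the "A alone" rate, the "B alone" rate, or their average?
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(3)
rate_A, rate_B = 20.0, 5.0                 # hypothetical single-sound mean counts

# Simulated dual-sound trials that switch between A-like and B-like rates.
switch = rng.integers(0, 2, 100)
dual_counts = rng.poisson(np.where(switch == 0, rate_A, rate_B))

hypotheses = {"A-like": rate_A, "B-like": rate_B, "average": (rate_A + rate_B) / 2}
loglik = np.vstack([poisson.logpmf(dual_counts, mu) for mu in hypotheses.values()])

# Many A-like and B-like wins, with few "average" wins, is the signature of
# alternation rather than averaging or summation of the two responses.
winners = np.array(list(hypotheses))[np.argmax(loglik, axis=0)]
for name in hypotheses:
    print(f"{name:8s} best explains {np.sum(winners == name):3d} of 100 trials")
```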
Item Open Access
Functional Mapping of the Macaque Inferior Colliculus. (2010) Bulkin, David.
The study of neural phenomena often depends critically on functional maps. Physiologists use topographic organization to identify the location of neural recordings, and in turn to relate activity patterns to underlying anatomy. In this work I investigated the functional architecture of the inferior colliculus (IC) in rhesus monkeys. The IC is an important site of convergence along the mammalian auditory pathway, composed of a number of anatomically and functionally defined regions. I mapped responses to tonal stimuli and acoustic noise in the IC and surrounding tissue using a systematic method, collecting extracellular electrophysiological data at evenly placed locations. I found three well organized regions: a central core that showed a topographic organization of sound tuning properties, a surrounding shell that contained neurons tuned to low frequency sounds, and a peripheral area that showed little tuning to frequency. The parcellation of the IC based on tuning properties was confirmed using measures of temporal properties of responses. I then put the map to use, to ask how two non-auditory signals are distributed within the structure. I found that neurons sensitive to eye position are spread throughout the IC, including in the central core region associated with the primary ascending auditory stream. This result has an important implication for models of sensory integration, namely that information from multiple senses meets early, in an area previously thought to be unisensory. Neurons that showed changes in activity directly linked to the presentation of a visual stimulus were less evenly distributed, and only weak responses were found in the core region. Nonetheless, visual sensitivity was not confined to a small subregion of the IC. The results challenge the notion that the senses are combined in nuclei specialized for this purpose.
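The parcellation described above rests on site-by-site decisions about whether multiunit responses depend on tone frequency and, if so, where the best frequency lies. A minimal, hypothetical version of that decision (not the thesis's actual criteria) might look like this:

```python
# Toy site classification: is this recording site frequency-tuned, and if so,
# what is its best frequency? Simulated responses; thresholds are arbitrary.
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(4)
freqs_hz = np.array([250, 500, 1000, 2000, 4000, 8000, 16000])

def classify_site(mean_rates, trials_per_freq=15, noise_sd=3.0, alpha=0.01):
    """Simulate repeated tone presentations, then ANOVA across frequencies."""
    responses = [rng.normal(m, noise_sd, trials_per_freq) for m in mean_rates]
    _, p = f_oneway(*responses)
    if p >= alpha:
        return ("untuned", None)
    return ("tuned", int(freqs_hz[np.argmax(mean_rates)]))

print(classify_site(np.array([30, 28, 20, 12, 8, 6, 5])))  # low-frequency-preferring site
print(classify_site(np.full(7, 15.0)))                      # frequency-insensitive site
```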
Item Open Access
Looking at the ventriloquist: visual outcome of eye movements calibrates sound localization. (PloS one, 2013-01) Pages, Daniel S; Groh, Jennifer M.
A general problem in learning is how the brain determines what lesson to learn (and what lessons not to learn). For example, sound localization is a behavior that is partially learned with the aid of vision. This process requires correctly matching a visual location to that of a sound. This is an intrinsically circular problem when sound location is itself uncertain and the visual scene is rife with possible visual matches. Here, we develop a simple paradigm using visual guidance of sound localization to gain insight into how the brain confronts this type of circularity. We tested two competing hypotheses. 1: The brain guides sound location learning based on the synchrony or simultaneity of auditory-visual stimuli, potentially involving a Hebbian associative mechanism. 2: The brain uses a 'guess and check' heuristic in which visual feedback that is obtained after an eye movement to a sound alters future performance, perhaps by recruiting the brain's reward-related circuitry. We assessed the effects of exposure to visual stimuli spatially mismatched from sounds on performance of an interleaved auditory-only saccade task. We found that when humans and monkeys were provided the visual stimulus asynchronously with the sound but as feedback to an auditory-guided saccade, they shifted their subsequent auditory-only performance toward the direction of the visual cue by 1.3-1.7 degrees, or 22-28% of the original 6 degree visual-auditory mismatch. In contrast, when the visual stimulus was presented synchronously with the sound but extinguished too quickly to provide this feedback, there was little change in subsequent auditory-only performance. Our results suggest that the outcome of our own actions is vital to localizing sounds correctly. Contrary to previous expectations, visual calibration of auditory space does not appear to require visual-auditory associations based on synchrony/simultaneity.

Item Open Access
Multisensory Integration, Segregation, and Causal Inference in the Superior Colliculus. (2020) Mohl, Jeffrey Thomas.
The environment is sampled by multiple senses, which are woven together to produce a unified perceptual state. However, unifying these senses requires assigning particular signals to the same or different underlying objects or events. Sensory signals originating from the same source should be integrated together, while signals originating from separate sources should be segregated from one another. Each of these computations is associated with different neural encoding strategies, and it is unknown how these strategies interact. Here, we begin to characterize how this problem is solved in the primate brain. First, we developed a behavioral paradigm and applied a computational modeling approach to demonstrate that monkeys, like humans, implement a form of Bayesian causal inference to decide whether two stimuli (one auditory and one visual) originated from the same source. We then recorded single unit neural activity from a representative multisensory brain region, the superior colliculus (SC), while monkeys performed this task. We found that SC neurons encoded either segregated unisensory or integrated multisensory target representations in separate sub-populations of neurons.
These responses were well described by a weighted linear combination of unisensory responses which did not account for spatial separation between targets, suggesting that SC sensory responses did not immediately discriminate between common cause and separate cause conditions as predicted by Bayesian causal inference. These responses became less linear as the trial progressed, hinting that such a causal inference may evolve over time. Finally, we implemented a single trial analysis method to determine whether the observed linearity was indicative of true weighted combinations on each trial, or whether this observation was an artifact of pooling data across trials. We found that initial sensory responses (0-150 ms) were well described by linear models even at the single trial level, but that later sustained (150-600 ms) and saccade period responses were instead better described as fluctuating between encoding either the auditory or visual stimulus alone. We also found that these fluctuations were correlated with behavior, suggesting that they may reflect a convergence from the SC encoding all potential targets to preferentially encoding only a specific target on a given trial. Together, these results demonstrate that non-human primates (like humans) perform an idealized version of Bayesian causal inference, that this inference may depend on separate sub-populations of neurons maintaining either integrated or segregated stimulus representations, and that these responses then evolve over time to reflect more complex encoding rules.
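The Bayesian causal inference mentioned in this abstract has a standard textbook form (after Kording et al., 2007): given noisy auditory and visual location estimates, compute the posterior probability that they arose from a single source. The sketch below implements that generic computation with illustrative noise and prior parameters; it is not the specific model or parameter fit reported in the thesis.

```python
# Generic Bayesian causal-inference computation for one audio-visual trial.
# Parameter values are illustrative, not fit to the data described above.
import numpy as np

def p_common(x_a, x_v, sigma_a=8.0, sigma_v=2.0, sigma_p=15.0, prior_common=0.5):
    """Posterior probability that auditory sample x_a and visual sample x_v
    arose from a single source, assuming Gaussian sensory noise and a
    zero-centered Gaussian prior over source location (all in degrees)."""
    va, vv, vp = sigma_a**2, sigma_v**2, sigma_p**2
    # Likelihood of the pair under a common source (source location integrated out).
    var_c = va * vv + va * vp + vv * vp
    like_c = np.exp(-0.5 * ((x_a - x_v)**2 * vp + x_a**2 * vv + x_v**2 * va) / var_c) \
             / (2 * np.pi * np.sqrt(var_c))
    # Likelihood under two independent sources.
    like_s = np.exp(-0.5 * (x_a**2 / (va + vp) + x_v**2 / (vv + vp))) \
             / (2 * np.pi * np.sqrt((va + vp) * (vv + vp)))
    return like_c * prior_common / (like_c * prior_common + like_s * (1 - prior_common))

print(p_common(x_a=5.0, x_v=3.0))    # nearby cues: favors a common source
print(p_common(x_a=5.0, x_v=-20.0))  # discrepant cues: favors separate sources
```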
Item Open Access
Non-auditory Influences on the Auditory Periphery. (2016) Gruters, Kurtis G.
Once thought to be predominantly the domain of cortex, multisensory integration has now been found at numerous sub-cortical locations in the auditory pathway. Prominent ascending and descending connections within the pathway suggest that the system may utilize non-auditory activity to help filter incoming sounds as they first enter the ear. Active mechanisms in the periphery, particularly the outer hair cells (OHCs) of the cochlea and middle ear muscles (MEMs), are capable of modulating the sensitivity of other peripheral mechanisms involved in the transduction of sound into the system. Through indirect mechanical coupling of the OHCs and MEMs to the eardrum, motion of these mechanisms can be recorded as acoustic signals in the ear canal. Here, we utilize this recording technique to describe three different experiments that demonstrate novel multisensory interactions occurring at the level of the eardrum. 1) In the first experiment, measurements in humans and monkeys performing a saccadic eye movement task to visual targets indicate that the eardrum oscillates in conjunction with eye movements. The amplitude and phase of the eardrum movement, which we dub the Oscillatory Saccadic Eardrum Associated Response or OSEAR, depended on the direction and horizontal amplitude of the saccade and occurred in the absence of any externally delivered sounds. 2) For the second experiment, we use an audiovisual cueing task to demonstrate a dynamic change to pressure levels in the ear when a sound is expected versus when one is not. Specifically, we observe a drop in frequency power and variability from 0.1 to 4 kHz around the time when the sound is expected to occur, in contrast to a slight increase in power at both lower and higher frequencies. 3) For the third experiment, we show that seeing a speaker say a syllable that is incongruent with the accompanying audio can alter the response patterns of the auditory periphery, particularly during the most relevant moments in the speech stream. These visually influenced changes may contribute to the altered percept of the speech sound. Collectively, we presume that these findings represent the combined effect of OHCs and MEMs acting in tandem in response to various non-auditory signals in order to manipulate the receptive properties of the auditory system. These influences may have a profound, and previously unrecognized, impact on how the auditory system processes sounds from initial sensory transduction all the way to perception and behavior. Moreover, we demonstrate that the entire auditory system is, fundamentally, a multisensory system.
Item Open Access
Oculomotor Influence on the Mechanics of Hearing: Eye Movement-Related Eardrum Oscillations and Their Potential Role in Audio-Visual Spatial Integration. (2020) Murphy, David LK.
After every eye movement, the brain must realign the visual and auditory reference frames in order to co-locate sights and sounds. Exactly where, when, and how such visual-auditory spatial integrations occur is not fully understood. We recently discovered that the eardrum oscillates beginning a few milliseconds before saccades and continuing until well into ensuing periods of fixation (Gruters, Murphy, et al., PNAS 2018). Information about at least the horizontal direction and length of saccades appears to be reflected in the phase and magnitude of these eye movement-related eardrum oscillations (EMREO).
Here, we sought to assess the full spatial characteristics of this signal for saccade parameters in both vertical and horizontal dimensions. Concurrently, we sought to validate that independent estimates of the vertical and horizontal saccade parameter contributions can be linearly combined to predict EMREO waveforms for saccades in all directions, a fundamental assumption of the current analyses.
We found that EMREOs depend on both horizontal and vertical saccade components, varying predominantly with eye displacement, but modulated by absolute (initial or final) position as well. In toto, EMREO appear to represent combinations of these saccade parameters such that any saccade corresponds to a specific eardrum oscillation that contains a linear combination of the vertical and horizontal saccade parameters. Regressions in both the time and frequency domain create a fuller picture of the spatial information contained in EMREO. These results demonstrate that detailed information about the relationship between visual and auditory reference frames is present in the earliest stage of the auditory pathway. They also demonstrate that this information is mapped linearly and can therefore be recovered with a small set of basis components.
Future work delving into the relationship between EMREO and the transduction of incoming sounds will be needed to ascertain their effects on the processing of auditory spatial locations in relation to the visual scene. While the frequency and magnitude of EMREO suggest that they may be related to middle ear muscle contractions, the underlying mechanisms that generate them are unknown.
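The claim that EMREO waveforms are a linear function of saccade parameters, recoverable from a small set of basis components, can be illustrated with a time-point-by-time-point regression. The sketch below simulates ear-canal traces as a linear mixture of invented basis waveforms scaled by horizontal and vertical displacement, then recovers those waveforms by least squares; the waveforms, amplitudes, and noise levels are made up, and the actual regressions described above also include terms for absolute eye position.

```python
# Minimal sketch (simulated data, invented waveforms): recovering EMREO "basis"
# waveforms by regressing the ear-canal signal, time point by time point,
# on horizontal and vertical saccade displacement.
import numpy as np

rng = np.random.default_rng(5)
t = np.linspace(-0.01, 0.06, 140)              # time re: saccade onset (s)
n_saccades = 300

dh = rng.uniform(-18, 18, n_saccades)          # horizontal displacement (deg)
dv = rng.uniform(-12, 12, n_saccades)          # vertical displacement (deg)

# Hypothetical ground-truth basis waveforms (one per regressor).
basis_true = np.vstack([
    0.002 * np.sin(2 * np.pi * 30 * t) * (t > 0),        # constant component
    0.010 * np.sin(2 * np.pi * 30 * t + 0.5) * (t > 0),  # scales with dh
    0.004 * np.cos(2 * np.pi * 30 * t) * (t > 0),        # scales with dv
])

X = np.column_stack([np.ones(n_saccades), dh, dv])       # design matrix
mic = X @ basis_true + rng.normal(0, 0.01, (n_saccades, t.size))  # microphone traces

# Least-squares solve for all time points at once: X @ B ~= mic.
basis_hat, *_ = np.linalg.lstsq(X, mic, rcond=None)

print(f"max abs error in recovered basis waveforms: {np.abs(basis_hat - basis_true).max():.4f}")
# A new saccade's predicted EMREO is then [1, dh, dv] @ basis_hat.
```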
Item Open Access
Single neurons may encode simultaneous stimuli by switching between activity patterns. (Nature communications, 2018-07-13) Caruso, Valeria C; Mohl, Jeff T; Glynn, Christopher; Lee, Jungah; Willett, Shawn M; Zaman, Azeem; Ebihara, Akinori F; Estrada, Rolando; Freiwald, Winrich A; Tokdar, Surya T; Groh, Jennifer M.
How the brain preserves information about multiple simultaneous items is poorly understood. We report that single neurons can represent multiple stimuli by interleaving signals across time. We record single units in an auditory region, the inferior colliculus, while monkeys localize 1 or 2 simultaneous sounds. During dual-sound trials, we find that some neurons fluctuate between firing rates observed for each single sound, either on a whole-trial or on a sub-trial timescale. These fluctuations are correlated in pairs of neurons, can be predicted by the state of local field potentials prior to sound onset, and, in one monkey, can predict which sound will be reported first. We find corroborating evidence of fluctuating activity patterns in a separate dataset involving responses of inferotemporal cortex neurons to multiple visual stimuli. Alternation between activity patterns corresponding to each of multiple items may therefore be a general strategy to enhance the brain processing capacity, potentially linking such disparate phenomena as variable neural firing, neural oscillations, and limits in attentional/memory capacity.

Item Open Access
Sounds and beyond: multisensory and other non-auditory signals in the inferior colliculus. (Frontiers in neural circuits, 2012-01) Gruters, Kurtis G; Groh, Jennifer M.
The inferior colliculus (IC) is a major processing center situated mid-way along both the ascending and descending auditory pathways of the brain stem. Although it is fundamentally an auditory area, the IC also receives anatomical input from non-auditory sources. Neurophysiological studies corroborate that non-auditory stimuli can modulate auditory processing in the IC and even elicit responses independent of coincident auditory stimulation. In this article, we review anatomical and physiological evidence for multisensory and other non-auditory processing in the IC. Specifically, the contributions of signals related to vision, eye movements and position, somatosensation, and behavioral context to neural activity in the IC will be described. These signals are potentially important for localizing sound sources, attending to salient stimuli, distinguishing environmental from self-generated sounds, and perceiving and generating communication sounds. They suggest that the IC should be thought of as a node in a highly interconnected sensory, motor, and cognitive network dedicated to synthesizing a higher-order auditory percept rather than simply reporting patterns of air pressure detected by the cochlea. We highlight some of the potential pitfalls that can arise from experimental manipulations that may disrupt the normal function of this network, such as the use of anesthesia or the severing of connections from cortical structures that project to the IC. Finally, we note that the presence of these signals in the IC has implications for our understanding not just of the IC but also of the multitude of other regions within and beyond the auditory system that are dependent on signals that pass through the IC.
Whatever the IC "hears" would seem to be passed both "upward" to thalamus and thence to auditory cortex and beyond, as well as "downward" via centrifugal connections to earlier areas of the auditory pathway such as the cochlear nucleus.

Item Open Access
Stimulus Integration and Parsing in the Primate Auditory Midbrain. (2016) Pages, Daniel S.
Integrating information from multiple sources is a crucial function of the brain. Examples of such integration include combining multiple stimuli of different modalities, such as visual and auditory; combining multiple stimuli of the same modality, such as two concurrent sounds; and integrating stimuli from the sensory organs (i.e., the ears) with stimuli delivered from brain-machine interfaces.
The overall aim of this body of work is to empirically examine stimulus integration in these three domains to inform our broader understanding of how and when the brain combines information from multiple sources.
First, I examine visually-guided calibration of auditory localization, which bears on the general problem in learning of how the brain determines what lesson to learn (and what lessons not to learn). For example, sound localization is a behavior that is partially learned with the aid of vision. This process requires correctly matching a visual location to that of a sound. This is an intrinsically circular problem when sound location is itself uncertain and the visual scene is rife with possible visual matches. Here, we develop a simple paradigm using visual guidance of sound localization to gain insight into how the brain confronts this type of circularity. We tested two competing hypotheses. 1: The brain guides sound location learning based on the synchrony or simultaneity of auditory-visual stimuli, potentially involving a Hebbian associative mechanism. 2: The brain uses a 'guess and check' heuristic in which visual feedback that is obtained after an eye movement to a sound alters future performance, perhaps by recruiting the brain's reward-related circuitry. We assessed the effects of exposure to visual stimuli spatially mismatched from sounds on performance of an interleaved auditory-only saccade task. We found that when humans and monkeys were provided the visual stimulus asynchronously with the sound but as feedback to an auditory-guided saccade, they shifted their subsequent auditory-only performance toward the direction of the visual cue by 1.3-1.7 degrees, or 22-28% of the original 6 degree visual-auditory mismatch. In contrast, when the visual stimulus was presented synchronously with the sound but extinguished too quickly to provide this feedback, there was little change in subsequent auditory-only performance. Our results suggest that the outcome of our own actions is vital to localizing sounds correctly. Contrary to previous expectations, visual calibration of auditory space does not appear to require visual-auditory associations based on synchrony/simultaneity.
My next line of research examines how electrical stimulation of the inferior colliculus influences the perception of sounds in a nonhuman primate. The central nucleus of the inferior colliculus is the major ascending relay of auditory information: almost all auditory signals pass through it before reaching the forebrain. It is therefore an ideal structure for understanding the format of the inputs to the forebrain and, by extension, the processing of auditory scenes that occurs in the brainstem, which made it an attractive target for studying stimulus integration in the ascending auditory pathway.
Moreover, understanding the relationship between the auditory selectivity of neurons and their contribution to perception is critical to the design of effective auditory brain prosthetics. These prosthetics seek to mimic natural activity patterns to achieve desired perceptual outcomes. We measured the contribution of inferior colliculus (IC) sites to perception using combined recording and electrical stimulation. Monkeys performed a frequency-based discrimination task, reporting whether a probe sound was higher or lower in frequency than a reference sound. Stimulation pulses were paired with the probe sound on 50% of trials (0.5-80 µA, 100-300 Hz, n=172 IC locations in 3 rhesus monkeys). Electrical stimulation tended to bias the animals’ judgments in a fashion that was coarsely but significantly correlated with the best frequency of the stimulation site in comparison to the reference frequency employed in the task. Although there was considerable variability in the effects of stimulation (including impairments in performance and shifts in performance away from the direction predicted based on the site’s response properties), the results indicate that stimulation of the IC can evoke percepts correlated with the frequency tuning properties of the IC. Consistent with the implications of recent human studies, the main avenue for improvement for the auditory midbrain implant suggested by our findings is to increase the number and spatial extent of electrodes, to increase the size of the region that can be electrically activated and provide a greater range of evoked percepts.
My next line of research employs a frequency-tagging approach to examine the extent to which multiple sound sources are combined (or segregated) in the nonhuman primate inferior colliculus. In the single-sound case, most inferior colliculus neurons respond and entrain to sounds in a very broad region of space, and many are entirely spatially insensitive, so it is unknown how the neurons will respond to a situation with more than one sound. I use multiple AM stimuli of different frequencies, which the inferior colliculus represents using a spike timing code. This allows me to measure spike timing in the inferior colliculus to determine which sound source is responsible for neural activity in an auditory scene containing multiple sounds. Using this approach, I find that the same neurons that are tuned to broad regions of space in the single sound condition become dramatically more selective in the dual sound condition, preferentially entraining spikes to stimuli from a smaller region of space. I will examine the possibility that there may be a conceptual linkage between this finding and the finding of receptive field shifts in the visual system.
In chapter 5, I will comment on these findings more generally, compare them to existing theoretical models, and discuss what these results tell us about processing in the central nervous system in a multi-stimulus situation. My results suggest that the brain is flexible in its processing and can adapt its integration schema to fit the available cues and the demands of the task.
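One standard way to quantify the entrainment underlying the frequency-tagging approach described above is vector strength: the degree to which spike times cluster at a consistent phase of a given amplitude-modulation rate. The sketch below computes it for a simulated spike train against two candidate AM rates; the rates, spike counts, and jitter are invented, and the thesis's actual spike-timing analysis may use a different metric.

```python
# Illustrative sketch (simulated spikes): which of two AM rates is a spike
# train phase-locked to? Vector strength near 1 indicates tight entrainment.
import numpy as np

rng = np.random.default_rng(6)
f1, f2 = 20.0, 34.0                 # AM rates of two concurrent sounds (Hz)
dur = 2.0                           # trial duration (s)

# Simulate a neuron entrained to f1: spikes cluster at one phase of each f1 cycle.
n_spikes = 160
cycle_starts = rng.integers(0, int(dur * f1), n_spikes) / f1
spike_times = cycle_starts + rng.normal(0.25 / f1, 0.02 / f1, n_spikes)

def vector_strength(times, freq):
    """Mean resultant length of spike phases relative to a modulation rate."""
    phases = 2 * np.pi * freq * times
    return np.abs(np.mean(np.exp(1j * phases)))

print(f"vector strength at {f1:g} Hz: {vector_strength(spike_times, f1):.2f}")
print(f"vector strength at {f2:g} Hz: {vector_strength(spike_times, f2):.2f}")
```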
Item Open Access
Systematic mapping of the monkey inferior colliculus reveals enhanced low frequency sound representation. (Journal of neurophysiology, 2011-04) Bulkin, David A; Groh, Jennifer M.
We investigated the functional architecture of the inferior colliculus (IC) in rhesus monkeys. We systematically mapped multiunit responses to tonal stimuli and noise in the IC and surrounding tissue of six rhesus macaques, collecting data at evenly placed locations and recording nonresponsive locations to define boundaries. The results show a modest tonotopically organized region (17 of 100 recording penetration locations in 4 of 6 monkeys) surrounded by a large mass of tissue that, although vigorously responsive, showed no clear topographic arrangement (68 of 100 penetration locations). Rather, most cells in these recordings responded best to frequencies at the low end of the macaque auditory range. The remaining 15 (of 100) locations exhibited auditory responses that were not sensitive to sound frequency. Potential anatomical correlates of functionally defined regions and implications for midbrain auditory prosthetic devices are discussed.

Item Open Access
The eardrums move when the eyes move: A multisensory effect on the mechanics of hearing. (Proc Natl Acad Sci U S A, 2018-02-06) Gruters, Kurtis G; Murphy, David LK; Jenson, Cole D; Smith, David W; Shera, Christopher A; Groh, Jennifer M.
Interactions between sensory pathways such as the visual and auditory systems are known to occur in the brain, but where they first occur is uncertain. Here, we show a multimodal interaction evident at the eardrum. Ear canal microphone measurements in humans (n = 19 ears in 16 subjects) and monkeys (n = 5 ears in three subjects) performing a saccadic eye movement task to visual targets indicated that the eardrum moves in conjunction with the eye movement. The eardrum motion was oscillatory and began as early as 10 ms before saccade onset in humans or with saccade onset in monkeys. These eardrum movements, which we dub eye movement-related eardrum oscillations (EMREOs), occurred in the absence of a sound stimulus. The amplitude and phase of the EMREOs depended on the direction and horizontal amplitude of the saccade. They lasted throughout the saccade and well into subsequent periods of steady fixation. We discuss the possibility that the mechanisms underlying EMREOs create eye movement-related binaural cues that may aid the brain in evaluating the relationship between visual and auditory stimulus locations as the eyes move.