Browsing by Subject "Auditory"
Item Open Access: Interactions of Attention, Stimulus Conflict, and Multisensory Processing (2012). Donohue, Sarah Elizabeth.

At every moment in life we are receiving input from multiple sensory modalities. We are limited, however, in the amount of information we can selectively attend to and fully process at any one time. The ability to integrate the relevant corresponding multisensory inputs together and to segregate other sensory information that is conflicting or distracting is therefore fundamental to our ability to successfully navigate through our complex environment. Such multisensory integration and segregation is done on the basis of temporal, spatial, and semantic cues, often aided by selective attention to particular inputs from one or multiple modalities. The precise nature of how attention interacts with multisensory perception, and how this ramifies behaviorally and neurally, has been largely underexplored. Here, in a series of six cognitive experiments in humans using auditory and visual stimuli, along with electroencephalography (EEG) measures of brain activity and behavioral measures of task performance, I examine the interactions between attention, stimulus conflict, and multisensory processing. I demonstrate that attention can spread across modalities in a pattern that closely follows the temporal linking of multisensory stimuli, while also engendering the spatial linking of such multisensory stimuli. When stimulus inputs either within audition or across modalities conflict, I observe an electrophysiological signature of the processing of this conflict that is similar to what had been previously observed within the visual modality. Moreover, using neural measures of attentional distraction, I show that when task-irrelevant stimulus input from one modality conflicts with task-relevant input from another, attention is initially pulled toward the conflicting irrelevant modality, thereby contributing to the observed impairment in task performance.
Finally, I demonstrate that there are individual differences in multisensory temporal processing in the population, in particular between those with extensive action-video-game experience and those with little. However, everyone appears to be susceptible to multisensory distraction, a finding that should be taken into serious consideration in today's complex world of multitasking.
Item Open Access: Non-auditory Influences on the Auditory Periphery (2016). Gruters, Kurtis G.

Once thought to be predominantly the domain of cortex, multisensory integration has now been found at numerous sub-cortical locations in the auditory pathway. Prominent ascending and descending connections within the pathway suggest that the system may utilize non-auditory activity to help filter incoming sounds as they first enter the ear. Active mechanisms in the periphery, particularly the outer hair cells (OHCs) of the cochlea and middle ear muscles (MEMs), are capable of modulating the sensitivity of other peripheral mechanisms involved in the transduction of sound into the system. Through indirect mechanical coupling of the OHCs and MEMs to the eardrum, motion of these mechanisms can be recorded as acoustic signals in the ear canal. Here, we utilize this recording technique to describe three different experiments that demonstrate novel multisensory interactions occurring at the level of the eardrum. 1) In the first experiment, measurements in humans and monkeys performing a saccadic eye movement task to visual targets indicate that the eardrum oscillates in conjunction with eye movements. The amplitude and phase of the eardrum movement, which we dub the Oscillatory Saccadic Eardrum Associated Response or OSEAR, depended on the direction and horizontal amplitude of the saccade and occurred in the absence of any externally delivered sounds. 2) For the second experiment, we use an audiovisual cueing task to demonstrate a dynamic change to pressure levels in the ear when a sound is expected versus when one is not. Specifically, we observe a drop in frequency power and variability from 0.1 to 4 kHz around the time when the sound is expected to occur, in contrast to a slight increase in power at both lower and higher frequencies.
3) For the third experiment, we show that seeing a speaker say a syllable that is incongruent with the accompanying audio can alter the response patterns of the auditory periphery, particularly during the most relevant moments in the speech stream. These visually influenced changes may contribute to the altered percept of the speech sound. Collectively, we presume that these findings represent the combined effect of OHCs and MEMs acting in tandem in response to various non-auditory signals in order to manipulate the receptive properties of the auditory system. These influences may have a profound, and previously unrecognized, impact on how the auditory system processes sounds from initial sensory transduction all the way to perception and behavior. Moreover, we demonstrate that the entire auditory system is, fundamentally, a multisensory system.
Item Open Access: Separable codes for read-out of mouse primary visual cortex across attentional states (2019). Wilson, Ashley Marie.

Attentional modulation of neuronal activity in sensory cortex could alter perception by enhancing the local representation of attended stimuli or their behavioral read-out downstream. We tested these hypotheses using a task in which mice are cued on interleaved trials to attend visual or auditory targets. Neurons in primary visual cortex (V1) that encode task stimuli have larger visually-evoked responses when attention is directed toward vision. To determine whether the attention-dependent changes in V1 reflect changes in representation or read-out, we decoded task stimuli and choices from population activity. Surprisingly, both visual and auditory choices can be decoded from V1, but decoding takes advantage of unique activity patterns across modalities. Furthermore, decoding of choices, but not stimuli, is impaired when attention is directed toward the opposite modality. The specific effect on choice suggests behavioral improvements with attention are largely due to targeted read-out of the most informative V1 neurons.