Browsing by Subject "Otoacoustic emissions"
Item Open Access
Non-auditory Influences on the Auditory Periphery (2016) Gruters, Kurtis G

Once thought to be predominantly the domain of cortex, multisensory integration has now been found at numerous subcortical locations in the auditory pathway. Prominent ascending and descending connections within the pathway suggest that the system may use non-auditory activity to help filter incoming sounds as they first enter the ear. Active mechanisms in the periphery, particularly the outer hair cells (OHCs) of the cochlea and the middle ear muscles (MEMs), can modulate the sensitivity of other peripheral mechanisms involved in the transduction of sound into the system. Because the OHCs and MEMs are indirectly mechanically coupled to the eardrum, their motion can be recorded as acoustic signals in the ear canal. Here, we use this recording technique in three experiments that demonstrate novel multisensory interactions occurring at the level of the eardrum.

1) In the first experiment, measurements in humans and monkeys performing a saccadic eye movement task to visual targets indicate that the eardrum oscillates in conjunction with eye movements. The amplitude and phase of this eardrum movement, which we dub the Oscillatory Saccadic Eardrum Associated Response (OSEAR), depended on the direction and horizontal amplitude of the saccade and occurred in the absence of any externally delivered sounds.

2) In the second experiment, we use an audiovisual cueing task to demonstrate a dynamic change in ear-canal pressure levels when a sound is expected versus when one is not. Specifically, we observe a drop in spectral power and variability from 0.1 to 4 kHz around the time the sound is expected to occur, in contrast to a slight increase in power at both lower and higher frequencies.
3) In the third experiment, we show that seeing a speaker say a syllable that is incongruent with the accompanying audio can alter the response patterns of the auditory periphery, particularly during the most relevant moments in the speech stream. These visually driven changes may contribute to the altered percept of the speech sound. Collectively, we interpret these findings as the combined effect of OHCs and MEMs acting in tandem in response to various non-auditory signals in order to manipulate the receptive properties of the auditory system. These influences may have a profound, and previously unrecognized, impact on how the auditory system processes sounds, from initial sensory transduction all the way to perception and behavior. Moreover, they suggest that the entire auditory system is, fundamentally, a multisensory system.
Item Open Access
Oculomotor Influence on the Mechanics of Hearing: Eye Movement-Related Eardrum Oscillations and Their Potential Role in Audio-Visual Spatial Integration (2020) Murphy, David LK

After every eye movement, the brain must realign the visual and auditory reference frames in order to co-locate sights and sounds. Exactly where, when, and how such visual-auditory spatial integration occurs is not fully understood. We recently discovered that the eardrum oscillates beginning a few milliseconds before saccades and continuing well into the ensuing periods of fixation (Gruters, Murphy et al., PNAS 2018). Information about at least the horizontal direction and length of saccades appears to be reflected in the phase and magnitude of these eye movement-related eardrum oscillations (EMREOs).
Here, we sought to assess the full spatial characteristics of this signal for saccade parameters in both the vertical and horizontal dimensions. Concurrently, we sought to validate that independent estimates of the vertical and horizontal saccade parameter contributions can be linearly combined to predict EMREO waveforms for saccades in all directions, a fundamental assumption of the current analyses.
We found that EMREOs depend on both horizontal and vertical saccade components, varying predominantly with eye displacement but modulated by absolute (initial or final) eye position as well. In toto, EMREOs appear to reflect combinations of these saccade parameters, such that any saccade corresponds to a specific eardrum oscillation composed of a linear combination of the vertical and horizontal components. Regressions in both the time and frequency domains create a fuller picture of the spatial information contained in EMREOs. These results demonstrate that detailed information about the relationship between the visual and auditory reference frames is present at the earliest stage of the auditory pathway. They also demonstrate that this information is mapped linearly and can therefore be recovered with a small set of basis components.
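The linear-combination idea above can be sketched with simulated data. This is purely an illustration of the modeling assumption, not the study's actual analysis pipeline: each trial's waveform is modeled as the saccade's horizontal and vertical components weighted against a small set of basis waveforms, which ordinary least squares can recover per time point.

```python
import numpy as np

rng = np.random.default_rng(0)
n_trials, n_samples = 200, 100

# Simulated saccade parameters per trial (degrees):
# column 0 = horizontal displacement, column 1 = vertical displacement.
X = rng.uniform(-20, 20, size=(n_trials, 2))

# Two fixed "basis" waveforms standing in for the components the
# regression would estimate from real eardrum recordings.
t = np.linspace(0, 2 * np.pi, n_samples)
basis = np.vstack([np.sin(t), np.cos(t)])          # shape (2, n_samples)

# Simulated ear-canal recordings: linear mix of the bases plus noise.
Y = X @ basis + 0.1 * rng.standard_normal((n_trials, n_samples))

# Recover the basis waveforms by least-squares regression at each time point.
coef, *_ = np.linalg.lstsq(X, Y, rcond=None)       # shape (2, n_samples)

# Predict the full waveform for a new saccade from its parameters alone.
saccade = np.array([10.0, -5.0])                   # e.g. 10 deg right, 5 deg down
predicted = saccade @ coef
```

Under this toy model, the recovered `coef` rows converge to the true basis waveforms, and any saccade's waveform is predicted from just two numbers, which is what "mapped linearly and recoverable with a small set of basis components" amounts to.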
Future work delving into the relationship between EMREOs and the transduction of incoming sounds will be needed to ascertain their effects on the processing of auditory spatial location relative to the visual scene. While the frequency and magnitude of EMREOs suggest that they may be related to middle ear muscle contractions, the underlying mechanisms that generate them remain unknown.