Browsing by Subject "Acoustic Stimulation"
Now showing 1 - 20 of 24
Item Open Access
A mutual information analysis of neural coding of speech by low-frequency MEG phase information. (J Neurophysiol, 2011-08)
Cogan, Gregory B; Poeppel, David

Recent work has implicated low-frequency (<20 Hz) neuronal phase information as important for both auditory (<10 Hz) and speech [theta (∼4-8 Hz)] perception. Activity on the timescale of theta corresponds linguistically to the average length of a syllable, suggesting that information within this range has consequences for segmentation of meaningful units of speech. Longer timescales that correspond to lower frequencies [delta (1-3 Hz)] also reflect important linguistic features (prosodic/suprasegmental), but it is unknown whether the patterns of activity in this range are similar to theta. We investigate low-frequency activity with magnetoencephalography (MEG) and mutual information (MI), an analysis that has not yet been applied to noninvasive electrophysiological recordings. We find that during speech perception each frequency subband examined [delta (1-3 Hz), theta-low (3-5 Hz), theta-high (5-7 Hz)] processes independent information from the speech stream. This contrasts with hypotheses that either delta and theta reflect their corresponding linguistic levels of analysis or each band is part of a single holistic onset response that tracks global acoustic transitions in the speech stream. Single-trial template-based classifier results further validate this finding: information from each subband can be used to classify individual sentences, and classifier results that utilize the combination of frequency bands provide better results than single bands alone. Our results suggest that during speech perception the low-frequency phase of the MEG signal corresponds to neither abstract linguistic units nor holistic evoked potentials but rather tracks different aspects of the input signal.
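As an illustrative aside, the core quantity in a mutual-information analysis of this kind is the MI between stimulus identity and the binned phase of a band-limited signal. The sketch below is a minimal, hedged illustration, not the study's code; the function name, bin count, and toy data are all assumptions.

```python
# Illustrative sketch: MI (in bits) between discrete stimulus labels and
# binned phase values of a band-limited neural signal. Not the study's code.
import numpy as np

def mutual_information(stim_ids, phases, n_phase_bins=8):
    """Plug-in MI estimate between stimulus labels and discretized phase."""
    # Bin phase values from [-pi, pi) into discrete states.
    edges = np.linspace(-np.pi, np.pi, n_phase_bins + 1)[1:-1]
    bins = np.digitize(phases, edges)
    mi = 0.0
    for s in np.unique(stim_ids):
        p_s = np.mean(stim_ids == s)
        for b in np.unique(bins):
            p_b = np.mean(bins == b)
            p_sb = np.mean((stim_ids == s) & (bins == b))
            if p_sb > 0:
                mi += p_sb * np.log2(p_sb / (p_s * p_b))
    return mi

# Toy check: phase that depends on the stimulus carries MI; random phase does not.
rng = np.random.default_rng(0)
stim = rng.integers(0, 2, size=2000)
informative = np.where(stim == 0, -np.pi / 2, np.pi / 2) + rng.normal(0, 0.3, 2000)
informative = (informative + np.pi) % (2 * np.pi) - np.pi  # wrap to [-pi, pi)
uninformative = rng.uniform(-np.pi, np.pi, size=2000)
print(mutual_information(stim, informative) > mutual_information(stim, uninformative))
# prints: True
```

In practice such plug-in estimates are biased upward for small trial counts, which is why studies of this kind typically apply bias correction or shuffle-based null distributions before comparing subbands.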
This study also validates a new method of analysis for noninvasive electrophysiological recordings that can be used to formally characterize the information content of neural responses and the interactions between these responses. Furthermore, it bridges results from different levels of neurophysiological study: small-scale multiunit recordings and local field potentials, and macroscopic magneto/electrophysiological noninvasive recordings.

Item Open Access
A neurophysiological study into the foundations of tonal harmony. (Neuroreport, 2009-02-18)
Bergelson, Elika; Idsardi, William J

Our findings provide magnetoencephalographic evidence that the mismatch-negativity response to two-note chords (dyads) is modulated by a combination of abstract cognitive differences and lower-level differences in the auditory signal. Participants were presented with series of simple-ratio sinusoidal dyads (perfect fourths and perfect fifths) in which the difference between the standard and deviant dyad exhibited an interval change, a shift in pitch space, or both. In addition, the standard-deviant pair of dyads either shared one note or both notes were changed. Only the condition that featured both abstract changes (interval change and pitch-space shift) and two novel notes showed a significantly larger magnetoencephalographic mismatch-negativity response than the other conditions in the right hemisphere.
Implications for music and language processing are discussed.

Item Open Access
A pilot investigation of audiovisual processing and multisensory integration in patients with inherited retinal dystrophies. (BMC ophthalmology, 2017-12-07)
Myers, Mark H; Iannaccone, Alessandro; Bidelman, Gavin M

In this study, we examined audiovisual (AV) processing in normal and visually impaired individuals who exhibit partial loss of vision due to inherited retinal dystrophies (IRDs). Two groups were analyzed for this pilot study: Group 1 was composed of IRD participants: two with autosomal dominant retinitis pigmentosa (RP), two with autosomal recessive cone-rod dystrophy (CORD), and two with the related complex disorder, Bardet-Biedl syndrome (BBS); Group 2 was composed of 15 non-IRD participants (controls). Audiovisual looming and receding stimuli (conveying perceptual motion) were used to assess the cortical processing and integration of unimodal (A or V) and multimodal (AV) sensory cues. Electroencephalography (EEG) was used to simultaneously resolve the temporal and spatial characteristics of AV processing and assess differences in neural responses between groups. Measurement of AV integration was accomplished via quantification of the EEG's spectral power and event-related brain potentials (ERPs). Results show that IRD individuals exhibit reduced AV integration for concurrent audio and visual (AV) stimuli but increased brain activity during the unimodal A (but not V) presentation.
This was corroborated in behavioral responses, where IRD patients showed slower and less accurate judgments of AV and V stimuli but more accurate responses in the A-alone condition. Collectively, our findings imply a neural compensation from auditory sensory brain areas due to visual deprivation.

Item Open Access
Age-related effects on the neural correlates of autobiographical memory retrieval. (Neurobiol Aging, 2012-07)
St Jacques, Peggy L; Rubin, David C; Cabeza, Roberto

Older adults recall less episodically rich autobiographical memories (AMs); however, the neural basis of this effect is not clear. Using functional MRI, we examined the effects of age during the search and elaboration phases of AM retrieval. Our results suggest that the age-related attenuation in the episodic richness of AMs is associated with difficulty in the strategic retrieval processes underlying recovery of information during elaboration. First, age effects on AM activity were more pronounced during elaboration than search, with older adults showing less sustained recruitment of the hippocampus and ventrolateral prefrontal cortex (VLPFC) for less episodically rich AMs. Second, there was an age-related reduction in the modulation of top-down coupling of the VLPFC on the hippocampus for episodically rich AMs. In sum, the present study shows that changes in the sustained response and coupling of the hippocampus and prefrontal cortex (PFC) underlie age-related reductions in the episodic richness of the personal past.

Item Open Access
An operant-based detection method for inferring tinnitus in mice. (Journal of neuroscience methods, 2017-11)
Zuo, Hongyan; Lei, Debin; Sivaramakrishnan, Shobhana; Howie, Benjamin; Mulvany, Jessica; Bao, Jianxin

Background
Subjective tinnitus is a hearing disorder in which a person perceives sound when no external sound is present. It can be acute or chronic. Because our current understanding of its pathology is incomplete, no effective cures have yet been established. Mouse models are useful for studying the pathophysiology of tinnitus as well as for developing therapeutic treatments.

New method
We have developed a new method for determining acute and chronic tinnitus in mice, called sound-based avoidance detection (SBAD). The SBAD method utilizes one paradigm to detect tinnitus and another paradigm to monitor possible confounding factors, such as motor impairment, loss of motivation, and deficits in learning and memory.

Results
The SBAD method has succeeded in monitoring both acute and chronic tinnitus in mice. Its detection ability is further validated by functional studies demonstrating an abnormal increase in neuronal activity in the inferior colliculus of mice that had previously been identified as having tinnitus by the SBAD method.

Comparison with existing methods
The SBAD method provides a new means by which investigators can detect tinnitus in a single mouse accurately and with more control over potential confounding factors than existing methods.

Conclusion
This work establishes a new behavioral method for detecting tinnitus in mice. The detection outcome is consistent with functional validation. One key advantage of mouse models is that they provide researchers the opportunity to utilize an extensive array of genetic tools. This new method could lead to a deeper understanding of the molecular pathways underlying tinnitus pathology.

Item Open Access
Auditory signals evolve from hybrid- to eye-centered coordinates in the primate superior colliculus. (Journal of neurophysiology, 2012-07)
Lee, Jungah; Groh, Jennifer M

Visual and auditory spatial signals initially arise in different reference frames. It has been postulated that auditory signals are translated from a head-centered to an eye-centered frame of reference compatible with the visual spatial maps, but, to date, only various forms of hybrid reference frames for sound have been identified. Here, we show that the auditory representation of space in the superior colliculus involves a hybrid reference frame immediately after the sound onset but evolves to become predominantly eye centered, and more similar to the visual representation, by the time of a saccade to that sound. Specifically, during the first 500 ms after the sound onset, auditory response patterns (N = 103) were usually neither head nor eye centered: 64% of neurons showed such a hybrid pattern, whereas 29% were more eye centered and 8% were more head centered. This differed from the pattern observed for visual targets (N = 156): 86% were eye centered, <1% were head centered, and only 13% exhibited a hybrid of both reference frames. For auditory-evoked activity observed within 20 ms of the saccade (N = 154), the proportion of eye-centered response patterns increased to 69%, whereas the hybrid and head-centered response patterns dropped to 30% and <1%, respectively.
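As a purely illustrative aside (not the study's analysis code; the Gaussian tuning model, function names, and parameters below are all assumptions), one common way to classify a neuron's reference frame is to ask whether its tuning curves, measured at several fixation positions, align better in head-centered or eye-centered coordinates:

```python
# Hedged sketch: compare tuning-curve alignment across eye positions in two
# candidate reference frames. A model neuron, not recorded data.
import numpy as np

def frame_alignment(targets_head, eye_positions, responses):
    """Mean pairwise correlation of tuning curves in each candidate frame."""
    def mean_corr(curves):
        pairs = [np.corrcoef(a, b)[0, 1] for i, a in enumerate(curves)
                 for b in curves[i + 1:]]
        return float(np.mean(pairs))
    # Head frame: probe the same head-centered target locations at each fixation.
    head_curves = [responses(targets_head, e) for e in eye_positions]
    # Eye frame: shift targets with the eyes so the same retinal locations are probed.
    eye_curves = [responses(targets_head + e, e) for e in eye_positions]
    return {"head": mean_corr(head_curves), "eye": mean_corr(eye_curves)}

# Model neuron whose response depends on target position relative to the eyes.
def eye_centered_cell(target_head_deg, eye_deg):
    retinal = target_head_deg - eye_deg
    return np.exp(-0.5 * ((retinal - 10.0) / 8.0) ** 2)

targets = np.linspace(-30, 30, 13)      # target azimuths, head coordinates (deg)
eyes = np.array([-12.0, 0.0, 12.0])     # fixation positions (deg)
scores = frame_alignment(targets, eyes, eye_centered_cell)
print(scores["eye"] > scores["head"])   # eye-centered alignment wins
# prints: True
```

Real analyses of this kind (e.g. reference-frame indices) additionally handle hybrid cases, where alignment is intermediate in both frames, which is the pattern the abstract reports for early auditory activity.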
This pattern approached, although did not quite reach, that observed for saccade-related activity for visual targets: 89% were eye centered, 11% were hybrid, and <1% were head centered (N = 162). The plainly eye-centered visual response patterns and predominantly eye-centered auditory motor response patterns lie in marked contrast to our previous study of the intraparietal cortex, where both visual and auditory sensory and motor-related activity used a predominantly hybrid reference frame (Mullette-Gillman et al. 2005, 2009). Our present findings indicate that auditory signals are ultimately translated into a reference frame roughly similar to that used for vision, but suggest that such signals might emerge only in motor areas responsible for directing gaze to visual and auditory stimuli.

Item Open Access
Cross-modal stimulus conflict: the behavioral effects of stimulus input timing in a visual-auditory Stroop task. (PLoS One, 2013)
Donohue, Sarah E; Appelbaum, Lawrence G; Park, Christina J; Roberts, Kenneth C; Woldorff, Marty G

Cross-modal processing depends strongly on the compatibility between different sensory inputs, the relative timing of their arrival to brain processing components, and on how attention is allocated. In this behavioral study, we employed a cross-modal audio-visual Stroop task in which we manipulated the within-trial stimulus-onset asynchronies (SOAs) of the stimulus-component inputs, the grouping of the SOAs (blocked vs. random), the attended modality (auditory or visual), and the congruency of the Stroop color-word stimuli (congruent, incongruent, neutral) to assess how these factors interact within a multisensory context. One main result was that visual distractors produced larger incongruency effects on auditory targets than vice versa.
Moreover, as revealed by both overall shorter response times (RTs) and relative shifts in the psychometric incongruency-effect functions, visual information was processed faster and produced stronger and longer-lasting incongruency effects than auditory information. When attending to either modality, stimulus incongruency from the other modality interacted with SOA, yielding larger effects when the irrelevant distractor occurred prior to the attended target, but there was no interaction with SOA grouping. Finally, relative to neutral stimuli, and across the wide range of SOAs employed, congruency led to substantially more behavioral facilitation than incongruency led to interference, in contrast to findings that within-modality stimulus-compatibility effects tend to be more evenly split between facilitation and interference. In sum, the present findings reveal several key characteristics of how we process the stimulus compatibility of cross-modal sensory inputs, reflecting stimulus-processing patterns that are critical for successfully navigating our complex multisensory world.

Item Open Access
Development of hemispheric specialization for lexical pitch-accent in Japanese infants. (J Cogn Neurosci, 2010-11)
Sato, Yutaka; Sogabe, Yuko; Mazuka, Reiko

Infants' speech perception abilities change through the first year of life, from broad sensitivity to a wide range of speech contrasts to becoming more finely attuned to their native language. What remains unclear, however, is how this perceptual change relates to brain responses to native-language contrasts in terms of the functional specialization of the left and right hemispheres. Here, to elucidate the developmental changes in functional lateralization accompanying this perceptual change, we conducted two experiments on Japanese infants using Japanese lexical pitch-accent, which changes word meanings with the pitch pattern within words.
In the first, behavioral experiment, using visual habituation, we confirmed that infants at both 4 and 10 months are sensitive to the lexical pitch-accent pattern change embedded in disyllabic words. In the second experiment, near-infrared spectroscopy was used to measure cortical hemodynamic responses in the left and right hemispheres to the same lexical pitch-accent pattern changes and their pure-tone counterparts. We found that brain responses to the pitch change within words differed between 4- and 10-month-old infants in terms of functional lateralization: left-hemisphere dominance for the perception of the pitch change embedded in words was seen only in the 10-month-olds. These results suggest that the perceptual change in Japanese lexical pitch-accent may be related to a shift in functional lateralization from bilateral to left-hemisphere dominance.

Item Open Access
Different mechanisms are responsible for dishabituation of electrophysiological auditory responses to a change in acoustic identity than to a change in stimulus location. (Neurobiol Learn Mem, 2013-11)
Smulders, Tom V; Jarvis, Erich D

Repeated exposure to an auditory stimulus leads to habituation of the electrophysiological and immediate-early-gene (IEG) expression response in the auditory system. A novel auditory stimulus reinstates this response in a form of dishabituation. This has been interpreted as the start of new memory formation for the novel stimulus. Changes in the location of an otherwise identical auditory stimulus can also dishabituate the IEG expression response. This has been interpreted as an integration of stimulus identity and stimulus location into a single auditory object, encoded in the firing patterns of the auditory system. In this study, we further tested this hypothesis.
Using chronic multi-electrode arrays to record multi-unit activity from the auditory system of awake, behaving zebra finches, we found that habituation occurs with repeated exposure to the same song and dishabituation with a novel song, similar to what has been described in head-fixed, restrained animals. A large proportion of recording sites also showed dishabituation when the same auditory stimulus was moved to a novel location. However, when the song was randomly moved among 8 interleaved locations, habituation occurred independently of the continuous changes in location. In contrast, when 8 different auditory stimuli were interleaved, all from the same location, a separate habituation occurred to each stimulus. This result suggests that neuronal memories of acoustic identity and spatial location are different, and that the allocentric location of a stimulus is not encoded as part of the memory for an auditory object, while its acoustic properties are. We speculate that, instead, the dishabituation that occurs when a sound moves from a stable location is due to the unexpectedness of the location change, and might arise from different underlying mechanisms than the dishabituation and separate habituations to different acoustic stimuli.

Item Open Access
Different stimuli, different spatial codes: a visual map and an auditory rate code for oculomotor space in the primate superior colliculus. (PLoS One, 2014)
Groh, JM; Lee, J

Maps are a mainstay of visual, somatosensory, and motor coding in many species. However, auditory maps of space have not been reported in the primate brain. Instead, recent studies have suggested that sound location may be encoded via broadly responsive neurons whose firing rates vary roughly proportionately with sound azimuth. Within frontal space, maps and such rate codes involve different response patterns at the level of individual neurons.
Maps consist of neurons exhibiting circumscribed receptive fields, whereas rate codes involve open-ended response patterns that peak in the periphery. This discrepancy in coding format therefore poses a potential problem for brain regions responsible for representing both visual and auditory information. Here, we investigated the coding of auditory space in the primate superior colliculus (SC), a structure known to contain visual and oculomotor maps for guiding saccades. We report that, for visual stimuli, neurons showed circumscribed receptive fields consistent with a map, but for auditory stimuli, they had open-ended response patterns consistent with a rate, or level-of-activity, code for location. The discrepant response patterns were not segregated into different neural populations but occurred in the same neurons. We show that a read-out algorithm in which the site and level of SC activity both contribute to the computation of stimulus location is successful at evaluating the discrepant visual and auditory codes, and can account for subtle but systematic differences in the accuracy of auditory compared to visual saccades. This suggests that a given population of neurons can use different codes to support appropriate multimodal behavior.

Item Open Access
Distress tolerance to auditory feedback and functional connectivity with the auditory cortex. (Psychiatry research. Neuroimaging, 2018-12)
Addicott, Merideth A; Daughters, Stacey B; Strauman, Timothy J; Appelbaum, L Gregory

Distress tolerance is the capacity to withstand negative affective states in pursuit of a goal. Low distress tolerance may bias an individual to avoid or escape experiences that induce affective distress, but the neural mechanisms underlying the bottom-up generation of distress and its relationship to behavioral avoidance are poorly understood.
During a neuroimaging scan, healthy participants completed a mental arithmetic task with easy and distress phases, which differed in their cognitive demands and in positive versus negative auditory feedback. Participants were then given the opportunity to continue playing the distress phase for a financial bonus and were allowed to quit at any time; the duration of persistence served as the measure of distress tolerance. The easy and distress phases activated auditory cortices and fronto-parietal regions. A task-based functional connectivity analysis using the left secondary auditory cortex (i.e., planum temporale) as the seed region revealed stronger connectivity to fronto-parietal regions and the anterior insula during the distress phase. The distress-related connectivity between the seed region and the left anterior insula was negatively correlated with distress tolerance. These results provide initial evidence for the role of the anterior insula as a mediating link between the bottom-up generation of affective distress and the top-down behavioral avoidance of distress.

Item Open Access
Effects of Electrical Stimulation in the Inferior Colliculus on Frequency Discrimination by Rhesus Monkeys and Implications for the Auditory Midbrain Implant. (The Journal of neuroscience : the official journal of the Society for Neuroscience, 2016-05)
Pages, Daniel S; Ross, Deborah A; Puñal, Vanessa M; Agashe, Shruti; Dweck, Isaac; Mueller, Jerel; Grill, Warren M; Wilson, Blake S; Groh, Jennifer M

Understanding the relationship between the auditory selectivity of neurons and their contribution to perception is critical to the design of effective auditory brain prosthetics. These prosthetics seek to mimic natural activity patterns to achieve desired perceptual outcomes. We measured the contribution of inferior colliculus (IC) sites to perception using combined recording and electrical stimulation.
Monkeys performed a frequency-based discrimination task, reporting whether a probe sound was higher or lower in frequency than a reference sound. Stimulation pulses were paired with the probe sound on 50% of trials (0.5-80 μA, 100-300 Hz, n = 172 IC locations in 3 rhesus monkeys). Electrical stimulation tended to bias the animals' judgments in a fashion that was coarsely but significantly correlated with the best frequency of the stimulation site relative to the reference frequency used in the task. Although there was considerable variability in the effects of stimulation (including impairments in performance and shifts in performance away from the direction predicted based on the site's response properties), the results indicate that stimulation of the IC can evoke percepts correlated with the frequency-tuning properties of the IC. Consistent with the implications of recent human studies, the main avenue for improvement of the auditory midbrain implant suggested by our findings is to increase the number and spatial extent of electrodes, in order to increase the size of the region that can be electrically activated and to provide a greater range of evoked percepts.

Patients with hearing loss stemming from causes that interrupt the auditory pathway after the cochlea need a brain prosthetic to restore hearing. Recently, prosthetic stimulation in the human inferior colliculus (IC) was evaluated in a clinical trial. Thus far, speech understanding has been limited for the subjects, and this limitation is thought to be partly due to challenges in harnessing the sound-frequency representation in the IC. Here, we tested the effects of IC stimulation in monkeys trained to report the sound frequencies they heard.
Our results indicate that IC stimulation can introduce a range of frequency percepts and suggest that placement of a greater number of electrode contacts may improve the effectiveness of such implants.

Item Open Access
Factors affecting pitch discrimination performance in a cohort of extensively phenotyped healthy volunteers. (Scientific reports, 2017-11-28)
Smith, Lauren M; Bartholomew, Alex J; Burnham, Lauren E; Tillmann, Barbara; Cirulli, Elizabeth T

Despite efforts to characterize the different aspects of musical abilities in humans, many elements of this complex area remain unknown. Musical abilities are known to be associated with factors like intelligence, training, and sex, but a comprehensive evaluation of the simultaneous impact of multiple factors has not yet been performed. Here, we assessed 918 healthy volunteers for pitch discrimination: the ability to tell apart two tones that are close in pitch. We identified the minimal threshold that participants could detect, and we found that better performance was associated with higher intelligence, East Asian ancestry, male sex, younger age, formal music training (especially before age 6), and English as the native language. All of these factors remained significant when controlling for the others, with general intelligence, musical training, and male sex having the biggest impacts. We also performed a small GWAS and a gene-based collapsing analysis, identifying no significant associations. Future genetic studies of musical abilities should involve large sample sizes and an unbiased genome-wide approach, with the factors highlighted here included as important covariates.

Item Open Access
Imagery and retrieval of auditory and visual information: neural correlates of successful and unsuccessful performance. (Neuropsychologia, 2011-06)
Huijbers, Willem; Pennartz, Cyriel MA; Rubin, David C; Daselaar, Sander M

Remembering past events, or episodic retrieval, consists of several components.
There is evidence that mental imagery plays an important role in retrieval and that the brain regions supporting imagery overlap with those supporting retrieval. An open issue is to what extent these regions support successful vs. unsuccessful imagery and retrieval processes. Previous studies that examined regional overlap between imagery and retrieval used uncontrolled memory conditions, such as autobiographical memory tasks, that cannot distinguish between successful and unsuccessful retrieval. A second issue is that fMRI studies comparing imagery and retrieval have used modality-aspecific cues that are likely to activate auditory and visual processing regions simultaneously. Thus, it is not clear to what extent the identified brain regions support modality-specific or modality-independent imagery and retrieval processes. In the current fMRI study, we addressed these issues by comparing imagery to retrieval under controlled memory conditions in both auditory and visual modalities. We also obtained subjective measures of imagery quality, allowing us to dissociate regions contributing to successful vs. unsuccessful imagery. Results indicated that auditory and visual regions contribute to both imagery and retrieval in a modality-specific fashion. In addition, we identified four sets of brain regions with distinct patterns of activity that contributed to imagery and retrieval in a modality-independent fashion. The first set of regions, including the hippocampus, posterior cingulate cortex, medial prefrontal cortex, and angular gyrus, showed a pattern common to imagery/retrieval and consistent with successful performance regardless of task. The second set of regions, including the dorsal precuneus, anterior cingulate, and dorsolateral prefrontal cortex, also showed a pattern common to imagery and retrieval, but consistent with unsuccessful performance during both tasks.
Third, the left ventrolateral prefrontal cortex showed an interaction between task and performance and was associated with successful imagery but unsuccessful retrieval. Finally, the fourth set of regions, including the ventral precuneus, midcingulate cortex, and supramarginal gyrus, showed the opposite interaction, supporting unsuccessful imagery but successful retrieval performance. Results are discussed in relation to reconstructive, attentional, semantic-memory, and working-memory processes. This is the first study to separate the neural correlates of successful and unsuccessful performance for both imagery and retrieval and for both auditory and visual modalities.

Item Open Access
Looking at the ventriloquist: visual outcome of eye movements calibrates sound localization. (PloS one, 2013-01)
Pages, Daniel S; Groh, Jennifer M

A general problem in learning is how the brain determines what lesson to learn (and what lessons not to learn). For example, sound localization is a behavior that is partially learned with the aid of vision. This process requires correctly matching a visual location to that of a sound. This is an intrinsically circular problem when sound location is itself uncertain and the visual scene is rife with possible visual matches. Here, we develop a simple paradigm using visual guidance of sound localization to gain insight into how the brain confronts this type of circularity. We tested two competing hypotheses: (1) the brain guides sound-location learning based on the synchrony or simultaneity of auditory-visual stimuli, potentially involving a Hebbian associative mechanism; or (2) the brain uses a "guess and check" heuristic in which visual feedback obtained after an eye movement to a sound alters future performance, perhaps by recruiting the brain's reward-related circuitry. We assessed the effects of exposure to visual stimuli spatially mismatched from sounds on performance of an interleaved auditory-only saccade task.
We found that when humans and monkeys were provided the visual stimulus asynchronously with the sound, but as feedback to an auditory-guided saccade, they shifted their subsequent auditory-only performance toward the direction of the visual cue by 1.3-1.7 degrees, or 22-28% of the original 6-degree visual-auditory mismatch. In contrast, when the visual stimulus was presented synchronously with the sound but extinguished too quickly to provide this feedback, there was little change in subsequent auditory-only performance. Our results suggest that the outcome of our own actions is vital to localizing sounds correctly. Contrary to previous expectations, visual calibration of auditory space does not appear to require visual-auditory associations based on synchrony/simultaneity.

Item Open Access
Major and minor music compared to excited and subdued speech. (J Acoust Soc Am, 2010-01)
Bowling, Daniel L; Gill, Kamraan; Choi, Jonathan D; Prinz, Joseph; Purves, Dale

The affective impact of music arises from a variety of factors, including intensity, tempo, rhythm, and tonal relationships. The emotional coloring evoked by intensity, tempo, and rhythm appears to arise from association with the characteristics of human behavior in the corresponding condition; however, how and why particular tonal relationships in music convey distinct emotional effects are not clear. The hypothesis examined here is that major and minor tone collections elicit different affective reactions because their spectra are similar to the spectra of voiced speech uttered in different emotional states. To evaluate this possibility, the spectra of the intervals that distinguish major and minor music were compared to the spectra of voiced segments in excited and subdued speech, using fundamental frequency and frequency ratios as measures.
Consistent with the hypothesis, the spectra of major intervals are more similar to the spectra found in excited speech, whereas the spectra of particular minor intervals are more similar to the spectra of subdued speech. These results suggest that the characteristic affective impact of major and minor tone collections arises from associations routinely made between particular musical intervals and voiced speech.

Item Open Access
Meclizine enhancement of sensorimotor gating in healthy male subjects with high startle responses and low prepulse inhibition. (Neuropsychopharmacology : official publication of the American College of Neuropsychopharmacology, 2014-02)
Larrauri, José A; Kelley, Lisalynn D; Jenkins, Mason R; Westman, Eric C; Schmajuk, Nestor A; Rosenthal, M Zachary; Levin, Edward D

Histamine H1 receptor systems have been shown in animal studies to have important roles in the reversal of sensorimotor gating deficits, as measured by prepulse inhibition (PPI). H1-antagonist treatment attenuates the PPI impairments caused by either blockade of NMDA glutamate receptors or facilitation of dopamine transmission. The current experiment extended the investigation of H1 effects on sensorimotor gating to human studies. The effects of the histamine H1 antagonist meclizine on the startle response and PPI were investigated in healthy male subjects with high baseline startle responses and low PPI levels. Meclizine was administered to participants (n = 24) using a within-subjects design, with each participant receiving 0, 12.5, and 25 mg of meclizine in counterbalanced order. Startle response, PPI, heart-rate response, galvanic skin response, and changes in self-reported ratings of alertness levels and affective states (arousal and valence) were assessed. When compared with the control (placebo) condition, the two doses of meclizine analyzed (12.5 and 25 mg) produced significant increases in PPI without affecting the magnitude of the startle response or the other physiological variables.
Meclizine also caused a significant increase in overall self-reported arousal levels, which was not correlated with the observed increase in PPI. These results are in agreement with previous reports in the animal literature and suggest that H1 antagonists may have beneficial effects in the treatment of subjects with compromised sensorimotor gating and enhanced motor responses to sensory stimuli.

Item Open Access
Neural correlates of categorical perception in learned vocal communication. (Nat Neurosci, 2009-02) Prather, JF; Nowicki, S; Anderson, RC; Peters, S; Mooney, RA
The division of continuously variable acoustic signals into discrete perceptual categories is a fundamental feature of vocal communication, including human speech. Despite the importance of categorical perception to learned vocal communication, the neural correlates underlying this phenomenon await identification. We found that individual sensorimotor neurons in freely behaving swamp sparrows expressed categorical auditory responses to changes in note duration, a learned feature of their songs, and that the neural response boundary accurately predicted the categorical perceptual boundary measured in field studies of the same sparrow population. Furthermore, swamp sparrow populations that learned different song dialects showed different categorical perceptual boundaries that were consistent with the boundary being learned.
Our results extend the analysis of the neural basis of perceptual categorization into the realm of vocal communication and advance the learned vocalizations of songbirds as a model for investigating how experience shapes categorical perception and the activity of categorically responsive neurons.

Item Open Access
Profiling of experience-regulated proteins in the songbird auditory forebrain using quantitative proteomics. (Eur J Neurosci, 2008-03) Pinaud, Raphael; Osorio, Cristina; Alzate, Oscar; Jarvis, Erich D
Auditory and perceptual processing of songs are required for a number of behaviors in songbirds such as vocal learning, territorial defense, mate selection and individual recognition. These neural processes are accompanied by increased expression of a few transcription factors, particularly in the caudomedial nidopallium (NCM), an auditory forebrain area believed to play a key role in auditory learning and song discrimination. However, these molecular changes are presumably part of a larger, yet uncharacterized, protein regulatory network. In order to gain further insight into this network, we performed two-dimensional differential in-gel expression (2D-DIGE) experiments, extensive protein quantification analyses, and tandem mass spectrometry in the NCM of adult songbirds hearing novel songs. A subset of proteins was selected for immunocytochemistry in NCM sections to confirm the 2D-DIGE findings and to provide additional quantitative and anatomical information. Using these methodologies, we found that stimulation of freely behaving birds with conspecific songs did not significantly impact the NCM proteome 5 min after stimulus onset. However, following 1 and 3 h of stimulation, a significant number of proteins were consistently regulated in NCM. These proteins spanned a range of functional categories that included metabolic enzymes, cytoskeletal molecules, and proteins involved in neurotransmitter secretion and calcium binding.
Our findings suggest that auditory processing of vocal communication signals in freely behaving songbirds triggers a cascade of protein regulatory events that are dynamically regulated through activity-dependent changes in calcium levels.

Item Open Access
Role of nicotinic receptors in the lateral habenula in the attenuation of amphetamine-induced prepulse inhibition deficits of the acoustic startle response in rats. (Psychopharmacology, 2015-08) Larrauri, José A; Burke, Dennis A; Hall, Brandon J; Levin, Edward D

Rationale
Prepulse inhibition (PPI) refers to the reduction of the startle response magnitude when a startling stimulus is closely preceded by a weak stimulus. PPI is commonly used to measure sensorimotor gating. In rats, the PPI reduction induced by the dopamine agonist apomorphine can be reversed by systemic administration of nicotine. A high concentration of nicotinic receptors is found in the lateral habenula (LHb), an epithalamic structure with efferent projections to brain regions involved in the modulation of PPI, which has been shown to regulate the activity of midbrain dopamine neurons.

Objectives
The prospective role of nicotinic receptors in the LHb in the regulation of PPI was assessed in this study, using different pharmacological models of sensorimotor gating deficits.

Methods
Interactions between systemic amphetamine and haloperidol and intra-LHb infusions of mecamylamine (10 μg/side) or nicotine (30 μg/side) on PPI were analyzed in Experiments 1 and 2. Intra-LHb infusions of different nicotine doses (25 and 50 μg/side) and their interactions with systemic administration of amphetamine or dizocilpine on PPI were examined in Experiments 3 and 4.

Results
Infusions of nicotine into the LHb dose-dependently attenuated amphetamine-induced PPI deficits but had no effect on PPI disruptions caused by dizocilpine. Intra-LHb mecamylamine infusions neither affected PPI nor interacted with dopaminergic manipulations.

Conclusions
These results are congruent with previous reports of systemic nicotine effects on PPI, suggesting a role of the LHb in the attenuation of sensorimotor gating deficits caused by the hyperactivity of dopamine systems.
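The PPI measure referenced throughout these abstracts is conventionally reported as percent inhibition of the startle response, comparing prepulse trials against pulse-alone trials. A minimal sketch of that standard computation follows; the function name and the example startle magnitudes are invented for illustration and do not come from the studies above.

```python
def percent_ppi(startle_alone: float, startle_prepulse: float) -> float:
    """Percent prepulse inhibition: the percent reduction in startle
    magnitude on prepulse trials relative to pulse-alone trials,
    i.e. 100 * (1 - prepulse / alone)."""
    if startle_alone == 0:
        raise ValueError("pulse-alone startle magnitude must be nonzero")
    return 100.0 * (1.0 - startle_prepulse / startle_alone)

# Hypothetical example: a mean startle magnitude of 250 (arbitrary units)
# on pulse-alone trials and 150 on prepulse trials yields 40% PPI.
print(percent_ppi(250.0, 150.0))  # → 40.0
```

On this convention, a "PPI deficit" (as in the amphetamine and dizocilpine models above) appears as a lower percent value, and an attenuation of the deficit as a shift back toward the higher baseline percentage.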