Browsing by Subject "Auditory Pathways"
Now showing 1 - 11 of 11
Item Open Access: A framework for integrating the songbird brain. (J Comp Physiol A Neuroethol Sens Neural Behav Physiol, 2002-12)
Jarvis, ED; Smith, VA; Wada, K; Rivas, MV; McElroy, M; Smulders, TV; Carninci, P; Hayashizaki, Y; Dietrich, F; Wu, X; McConnell, P; Yu, J; Wang, PP; Hartemink, AJ; Lin, S
Biological systems by default involve complex components with complex relationships. To decipher how biological systems work, we assume that one needs to integrate information over multiple levels of complexity. The songbird vocal communication system is ideal for such integration due to many years of ethological investigation and a discrete, dedicated brain network. Here we announce the beginnings of a songbird brain integrative project that involves high-throughput molecular, anatomical, electrophysiological and behavioral levels of analysis. We first formed a rationale for inclusion of specific biological levels of analysis, then developed high-throughput molecular technologies on songbird brains, developed technologies for combined analysis of electrophysiological activity and gene regulation in awake behaving animals, and developed bioinformatic tools that predict causal interactions within and between biological levels of organization. This integrative brain project is fitting for the interdisciplinary approaches taken in the current songbird issue of the Journal of Comparative Physiology A and is expected to be conducive to deciphering how brains generate and perceive complex behaviors.

Item Open Access: A mutual information analysis of neural coding of speech by low-frequency MEG phase information. (J Neurophysiol, 2011-08)
Cogan, Gregory B; Poeppel, David
Recent work has implicated low-frequency (<20 Hz) neuronal phase information as important for both auditory (<10 Hz) and speech [theta (∼4-8 Hz)] perception. Activity on the timescale of theta corresponds linguistically to the average length of a syllable, suggesting that information within this range has consequences for segmentation of meaningful units of speech. Longer timescales that correspond to lower frequencies [delta (1-3 Hz)] also reflect important linguistic features (prosodic/suprasegmental), but it is unknown whether the patterns of activity in this range are similar to theta. We investigate low-frequency activity with magnetoencephalography (MEG) and mutual information (MI), an analysis that has not yet been applied to noninvasive electrophysiological recordings. We find that during speech perception each frequency subband examined [delta (1-3 Hz), theta(low) (3-5 Hz), theta(high) (5-7 Hz)] processes independent information from the speech stream. This contrasts with hypotheses that either delta and theta reflect their corresponding linguistic levels of analysis or each band is part of a single holistic onset response that tracks global acoustic transitions in the speech stream. Single-trial template-based classifier results further validate this finding: information from each subband can be used to classify individual sentences, and classifier results that utilize the combination of frequency bands provide better results than single bands alone. Our results suggest that during speech perception low-frequency phase of the MEG signal corresponds to neither abstract linguistic units nor holistic evoked potentials but rather tracks different aspects of the input signal. This study also validates a new method of analysis for noninvasive electrophysiological recordings that can be used to formally characterize information content of neural responses and interactions between these responses. Furthermore, it bridges results from different levels of neurophysiological study: small-scale multiunit recordings and local field potentials, and macroscopic magneto/electrophysiological noninvasive recordings.
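As a rough illustration of the band-limited phase analysis described in the abstract above, the Python sketch below band-passes a signal into the delta, theta(low) and theta(high) ranges named there, extracts instantaneous phase with a Hilbert transform, and computes a plug-in (histogram) estimate of mutual information between binned phase and a discrete stimulus label. It is a minimal sketch, not the authors' pipeline: the sampling rate, the synthetic MEG channel, the four-way sentence label and the eight phase bins are all assumptions made for the example.

# Minimal sketch (not the published pipeline): mutual information between
# low-frequency phase and a discrete stimulus label, per frequency sub-band.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 200.0                                   # assumed sampling rate (Hz)
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(0)
stimulus = rng.integers(0, 4, size=t.size)   # hypothetical 4-way sentence label
meg = rng.standard_normal(t.size)            # stand-in for one MEG sensor

bands = {"delta": (1, 3), "theta_low": (3, 5), "theta_high": (5, 7)}

def band_phase(x, lo, hi, fs):
    """Band-pass filter and return instantaneous phase (radians)."""
    b, a = butter(3, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return np.angle(hilbert(filtfilt(b, a, x)))

def mutual_information(x_bins, y_bins):
    """Plug-in MI estimate (bits) from two discrete label sequences."""
    joint = np.histogram2d(x_bins, y_bins,
                           bins=[np.unique(x_bins).size, np.unique(y_bins).size])[0]
    p = joint / joint.sum()
    px, py = p.sum(1, keepdims=True), p.sum(0, keepdims=True)
    nz = p > 0
    return float(np.sum(p[nz] * np.log2(p[nz] / (px @ py)[nz])))

for name, (lo, hi) in bands.items():
    phase_bins = np.digitize(band_phase(meg, lo, hi, fs),
                             np.linspace(-np.pi, np.pi, 9)) - 1  # 8 phase bins
    print(name, round(mutual_information(phase_bins, stimulus), 4))

With random data the printed MI values hover near zero; with real single-trial recordings the same estimator quantifies how much stimulus information each sub-band's phase carries.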
Item Open Access: Detection of single mRNAs in individual cells of the auditory system. (Hearing Research, 2018-09)
Salehi, Pezhman; Nelson, Charlie N; Chen, Yingying; Lei, Debin; Crish, Samuel D; Nelson, Jovitha; Zuo, Hongyan; Bao, Jianxin
Gene expression analysis is essential for understanding the rich repertoire of cellular functions. With the development of sensitive molecular tools such as single-cell RNA sequencing, extensive gene expression data can be obtained and analyzed from various tissues. Single-molecule fluorescence in situ hybridization (smFISH) has emerged as a powerful complementary tool for single-cell genomics studies because of its ability to map and quantify the spatial distributions of single mRNAs at the subcellular level in their native tissue. Here, we present a detailed method to study the copy numbers and spatial localizations of single mRNAs in the cochlea and inferior colliculus. First, we demonstrate that smFISH can be performed successfully in adult cochlear tissue after decalcification. Second, we show that the smFISH signals can be detected with high specificity. Third, we adapt an automated transcript analysis pipeline to quantify and identify single mRNAs in a cell-specific manner. Lastly, we show that our method can be used to study possible correlations between transcriptional and translational activities of single genes. Thus, we have developed a detailed smFISH protocol that can be used to study the expression of single mRNAs in specific cell types of the peripheral and central auditory systems.
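The automated transcript-counting step mentioned in the abstract above can be approximated with a generic single-molecule spot detector. The sketch below is an illustration under assumed inputs (a background-subtracted 2-D smFISH image and an integer cell-label mask), not the authors' pipeline; it uses scikit-image's Laplacian-of-Gaussian blob detector and tallies detected spots per labeled cell.

# Minimal sketch (generic, not the published pipeline): count smFISH spots
# per cell from a 2-D fluorescence image and an integer cell-label mask.
import numpy as np
from skimage.feature import blob_log

def count_spots_per_cell(image, cell_labels, threshold=0.1):
    """Detect diffraction-limited spots and assign each to a labeled cell.

    image       : 2-D float array, background-subtracted smFISH channel
    cell_labels : 2-D int array, 0 = background, 1..N = cell IDs
    """
    # Laplacian-of-Gaussian blob detection; sigma range set to the expected spot size in pixels
    blobs = blob_log(image, min_sigma=1, max_sigma=3, threshold=threshold)
    counts = {}
    for y, x, _sigma in blobs:
        cell = int(cell_labels[int(round(y)), int(round(x))])
        if cell > 0:                      # ignore spots outside segmented cells
            counts[cell] = counts.get(cell, 0) + 1
    return counts

# Toy example: two bright point sources placed inside a single "cell"
img = np.zeros((64, 64))
img[20, 20] = img[40, 35] = 1.0
labels = np.zeros((64, 64), dtype=int)
labels[10:50, 10:50] = 1
print(count_spots_per_cell(img, labels))   # e.g. {1: 2}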
Item Open Access: Distribution of eye position information in the monkey inferior colliculus. (Journal of Neurophysiology, 2012-02)
Bulkin, David A; Groh, Jennifer M
The inferior colliculus (IC) is thought to have two main subdivisions, a central region that forms an important stop on the ascending auditory pathway and a surrounding shell region that may play a more modulatory role. In this study, we investigated whether eye position affects activity in both the central and shell regions. Accordingly, we mapped the location of eye position-sensitive neurons in six monkeys making spontaneous eye movements by sampling multiunit activity at regularly spaced intervals throughout the IC. We used a functional map based on auditory response patterns to estimate the anatomical location of recordings, in conjunction with structural MRI and histology. We found eye position-sensitive sites throughout the IC, including at 27% of sites in tonotopically organized recording penetrations (putatively the central nucleus). Recordings from surrounding tissue showed a larger proportion of sites indicating an influence of eye position (33-43%). When present, the magnitude of the change in activity due to eye position was often comparable to that seen for sound frequency. Our results indicate that the primary ascending auditory pathway is influenced by the position of the eyes. Because eye position is essential for visual-auditory integration, our findings suggest that computations underlying visual-auditory integration begin early in the ascending auditory pathway.
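One simple way to screen a recording site for the kind of eye-position sensitivity reported above is to regress multiunit firing rate on horizontal and vertical eye position and test the regression against an intercept-only model. The sketch below does this on hypothetical per-trial data with statsmodels; the effect size, trial count and eye-position range are assumptions for the example, not values from the study.

# Minimal sketch (illustration, not the published analysis): test whether
# multiunit firing rate at one IC site depends on horizontal/vertical eye position.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n_trials = 200
eye_h = rng.uniform(-20, 20, n_trials)        # horizontal eye position (deg)
eye_v = rng.uniform(-20, 20, n_trials)        # vertical eye position (deg)
# Hypothetical site whose rate rises ~0.4 spikes/s per degree of horizontal position
rate = 30 + 0.4 * eye_h + rng.normal(0, 5, n_trials)

X = sm.add_constant(np.column_stack([eye_h, eye_v]))
fit = sm.OLS(rate, X).fit()

# The overall F-test compares against an intercept-only model; a significant
# result flags this site as eye-position sensitive.
print(fit.f_pvalue < 0.05, fit.params)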
Item Open Access: Early onset of deafening-induced song deterioration and differential requirements of the pallial-basal ganglia vocal pathway. (Eur J Neurosci, 2008-12)
Horita, Haruhito; Wada, Kazuhiro; Jarvis, Erich D
Similar to humans, songbirds rely on auditory feedback to maintain the acoustic and sequence structure of adult learned vocalizations. When songbirds are deafened, the learned features of song, such as syllable structure and sequencing, eventually deteriorate. However, the time-course and initial phases of song deterioration have not been well studied, particularly in the most commonly studied songbird, the zebra finch. Here, we observed previously uncharacterized subtle but significant changes to learned song within a few days following deafening. Syllable structure became detectably noisier and silent intervals between song motifs increased. Although song motif sequences remained stable at 2 weeks, as previously reported, pronounced changes occurred in longer stretches of song bout sequences. These included deletions of syllables between song motifs, changes in the frequency at which specific chunks of song were produced and stuttering for birds that had some repetitions of syllables before deafening. Changes in syllable structure and song bout sequence occurred at different rates, indicating different mechanisms for their deterioration. The changes in syllable structure required an intact lateral part but not the medial part of the pallial-basal ganglia vocal pathway, whereas changes in the song bout sequence did not require lateral or medial portions of the pathway. These findings indicate that deafening-induced song changes in zebra finches can be detected rapidly after deafening, that acoustic and sequence changes can occur independently, and that, within this time period, the pallial-basal ganglia vocal pathway controls the acoustic structure changes but not the song bout sequence changes.

Item Open Access: Effects of Electrical Stimulation in the Inferior Colliculus on Frequency Discrimination by Rhesus Monkeys and Implications for the Auditory Midbrain Implant. (The Journal of Neuroscience, 2016-05)
Pages, Daniel S; Ross, Deborah A; Puñal, Vanessa M; Agashe, Shruti; Dweck, Isaac; Mueller, Jerel; Grill, Warren M; Wilson, Blake S; Groh, Jennifer M
Understanding the relationship between the auditory selectivity of neurons and their contribution to perception is critical to the design of effective auditory brain prosthetics. These prosthetics seek to mimic natural activity patterns to achieve desired perceptual outcomes. We measured the contribution of inferior colliculus (IC) sites to perception using combined recording and electrical stimulation. Monkeys performed a frequency-based discrimination task, reporting whether a probe sound was higher or lower in frequency than a reference sound. Stimulation pulses were paired with the probe sound on 50% of trials (0.5-80 μA, 100-300 Hz, n = 172 IC locations in 3 rhesus monkeys). Electrical stimulation tended to bias the animals' judgments in a fashion that was coarsely but significantly correlated with the best frequency of the stimulation site compared with the reference frequency used in the task. Although there was considerable variability in the effects of stimulation (including impairments in performance and shifts in performance away from the direction predicted based on the site's response properties), the results indicate that stimulation of the IC can evoke percepts correlated with the frequency-tuning properties of the IC. Consistent with the implications of recent human studies, the main avenue for improvement for the auditory midbrain implant suggested by our findings is to increase the number and spatial extent of electrodes, to increase the size of the region that can be electrically activated, and to provide a greater range of evoked percepts.

Significance statement: Patients with hearing loss stemming from causes that interrupt the auditory pathway after the cochlea need a brain prosthetic to restore hearing. Recently, prosthetic stimulation in the human inferior colliculus (IC) was evaluated in a clinical trial. Thus far, speech understanding was limited for the subjects and this limitation is thought to be partly due to challenges in harnessing the sound frequency representation in the IC. Here, we tested the effects of IC stimulation in monkeys trained to report the sound frequencies they heard. Our results indicate that the IC can be used to introduce a range of frequency percepts and suggest that placement of a greater number of electrode contacts may improve the effectiveness of such implants.
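The stimulation-induced bias described above can be quantified with a psychometric model: fit the probability of a "probe higher" judgment as a logistic function of the log frequency difference plus a stimulation term, and read off the shift of the point of subjective equality. The sketch below does this on simulated trials; the coefficients, trial counts and the simulated observer are assumptions for the example and do not reproduce the study's data.

# Minimal sketch (illustration, not the published analysis): estimate how pairing
# electrical stimulation with the probe sound shifts a frequency-discrimination
# psychometric function, using a logistic model with a stimulation term.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 600
dlogf = rng.uniform(-1, 1, n)                  # log2(probe / reference frequency)
stim = rng.integers(0, 2, n)                   # 1 = stimulation paired with probe
# Hypothetical observer: stimulation biases judgments toward "higher"
p_higher = 1 / (1 + np.exp(-(4 * dlogf + 0.8 * stim)))
judged_higher = rng.random(n) < p_higher

X = sm.add_constant(np.column_stack([dlogf, stim]))
fit = sm.Logit(judged_higher.astype(float), X).fit(disp=0)
b0, b_f, b_stim = fit.params

# Shift of the point of subjective equality (in octaves) attributable to stimulation
print("PSE shift (octaves):", -b_stim / b_f)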
Item Open Access: Lateral symmetry of auditory attention in hemispherectomized patients. (Neuropsychologia, 1981-01)
Nebes, RD; Madden, DJ; Berg, WD
Single digits were monaurally presented in a random order to the right and left ears of hemispherectomized patients. Vocal identification time was found to be equivalent for the two ears. This result does not support the existence of a massive lateral shift of attention in these patients. It is thus unlikely that the large ear difference typically found with dichotic presentation in hemispherectomized patients is due to an asymmetrical distribution of auditory attention.

Item Open Access: Molecular mapping of brain areas involved in parrot vocal communication. (J Comp Neurol, 2000-03-27)
Jarvis, ED; Mello, CV
Auditory and vocal regulation of gene expression occurs in separate discrete regions of the songbird brain. Here we demonstrate that regulated gene expression also occurs during vocal communication in a parrot, belonging to an order whose ability to learn vocalizations is thought to have evolved independently of songbirds. Adult male budgerigars (Melopsittacus undulatus) were stimulated to vocalize with playbacks of conspecific vocalizations (warbles), and their brains were analyzed for expression of the transcriptional regulator ZENK. The results showed that there was distinct separation of brain areas that had hearing- or vocalizing-induced ZENK expression. Hearing warbles resulted in ZENK induction in large parts of the caudal medial forebrain and in one midbrain region, with a pattern highly reminiscent of that observed in songbirds. Vocalizing resulted in ZENK induction in nine brain structures, seven restricted to the lateral and anterior telencephalon, one in the thalamus, and one in the midbrain, with a pattern partially reminiscent of that observed in songbirds. Five of the telencephalic structures had been previously described as part of the budgerigar vocal control pathway. However, functional boundaries defined by the gene expression patterns for some of these structures were much larger and different in shape than previously reported anatomical boundaries. Our results provide the first functional demonstration of brain areas involved in vocalizing and auditory processing of conspecific sounds in budgerigars. They also indicate that, whether or not vocal learning evolved independently, some of the gene regulatory mechanisms that accompany learned vocal communication are similar in songbirds and parrots.

Item Open Access: Profiling of experience-regulated proteins in the songbird auditory forebrain using quantitative proteomics. (Eur J Neurosci, 2008-03)
Pinaud, Raphael; Osorio, Cristina; Alzate, Oscar; Jarvis, Erich D
Auditory and perceptual processing of songs are required for a number of behaviors in songbirds, such as vocal learning, territorial defense, mate selection and individual recognition. These neural processes are accompanied by increased expression of a few transcription factors, particularly in the caudomedial nidopallium (NCM), an auditory forebrain area believed to play a key role in auditory learning and song discrimination. However, these molecular changes are presumably part of a larger, yet uncharacterized, protein regulatory network. In order to gain further insight into this network, we performed two-dimensional differential in-gel expression (2D-DIGE) experiments, extensive protein quantification analyses, and tandem mass spectrometry in the NCM of adult songbirds hearing novel songs. A subset of proteins was selected for immunocytochemistry in NCM sections to confirm the 2D-DIGE findings and to provide additional quantitative and anatomical information. Using these methodologies, we found that stimulation of freely behaving birds with conspecific songs did not significantly impact the NCM proteome 5 min after stimulus onset. However, following 1 and 3 h of stimulation, a significant number of proteins were consistently regulated in NCM. These proteins spanned a range of functional categories that included metabolic enzymes, cytoskeletal molecules, and proteins involved in neurotransmitter secretion and calcium binding. Our findings suggest that auditory processing of vocal communication signals in freely behaving songbirds triggers a cascade of protein regulatory events that are dynamically regulated through activity-dependent changes in calcium levels.
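Downstream of spot matching, a 2D-DIGE comparison like the one above reduces to testing per-protein abundance differences between stimulated and unstimulated groups. The sketch below illustrates that step on simulated log spot volumes with t-tests and Benjamini-Hochberg correction; the group sizes, effect sizes and the fraction of regulated proteins are assumptions for the example, not the study's results.

# Minimal sketch (illustration, not the published 2D-DIGE workflow): flag proteins
# whose abundance differs between song-stimulated and silence groups, using
# per-spot log ratios, t-tests, and Benjamini-Hochberg correction.
import numpy as np
from scipy.stats import ttest_ind
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(3)
n_proteins, n_per_group = 500, 6
silence = rng.normal(0, 0.3, size=(n_proteins, n_per_group))      # log2 spot volumes
stimulated = rng.normal(0, 0.3, size=(n_proteins, n_per_group))
stimulated[:25] += 1.0            # hypothetical set of regulated proteins

log2_fc = stimulated.mean(axis=1) - silence.mean(axis=1)
pvals = ttest_ind(stimulated, silence, axis=1).pvalue
reject, qvals, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")

print("proteins called regulated:", int(reject.sum()))
if reject.any():
    print("largest |log2 fold change| among hits:",
          round(float(np.abs(log2_fc[reject]).max()), 2))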
Item Open Access: Songbirds and the revised avian brain nomenclature. (Ann N Y Acad Sci, 2004-06)
Reiner, Anton; Perkel, David J; Mello, Claudio V; Jarvis, Erich D
It has become increasingly clear that the standard nomenclature for many telencephalic and related brainstem structures of the avian brain is based on flawed once-held assumptions of homology to mammalian brain structures, greatly hindering functional comparisons between avian and mammalian brains. This has become especially problematic for those researchers studying the neurobiology of birdsong, the largest single group within the avian neuroscience community. To deal with the many communication problems this has caused among researchers specializing in different vertebrate classes, the Avian Brain Nomenclature Forum, held at Duke University from July 18-20, 2002, set out to develop a new terminology for the avian telencephalon and some allied brainstem cell groups. In one major step, the erroneous conception that the avian telencephalon consists mainly of a hypertrophied basal ganglia has been purged from the telencephalic terminology, and the actual parts of the basal ganglia and its brainstem afferent cell groups have been given new names to reflect their now-evident homologies. The telencephalic regions that were incorrectly named to reflect presumed homology to mammalian basal ganglia have been renamed as parts of the pallium. The prefixes used for the new names for the pallial subdivisions have retained most established abbreviations, in an effort to maintain continuity with the pre-existing nomenclature. Here we present a brief synopsis of the inaccuracies in the old nomenclature, a summary of the nomenclature changes, and details of changes for specific songbird vocal and auditory nuclei. We believe this new terminology will promote more accurate understanding of the broader neurobiological implications of song control mechanisms and facilitate the productive exchange of information between researchers studying avian and mammalian systems.

Item Open Access: Systematic mapping of the monkey inferior colliculus reveals enhanced low frequency sound representation. (Journal of Neurophysiology, 2011-04)
Bulkin, David A; Groh, Jennifer M
We investigated the functional architecture of the inferior colliculus (IC) in rhesus monkeys. We systematically mapped multiunit responses to tonal stimuli and noise in the IC and surrounding tissue of six rhesus macaques, collecting data at evenly placed locations and recording nonresponsive locations to define boundaries. The results show a modest tonotopically organized region (17 of 100 recording penetration locations in 4 of 6 monkeys) surrounded by a large mass of tissue that, although vigorously responsive, showed no clear topographic arrangement (68 of 100 penetration locations). Rather, most cells in these recordings responded best to frequencies at the low end of the macaque auditory range. The remaining 15 (of 100) locations exhibited auditory responses that were not sensitive to sound frequency. Potential anatomical correlates of functionally defined regions and implications for midbrain auditory prosthetic devices are discussed.
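The core of a mapping analysis like the one above is assigning a best frequency to each recording site from its tone-evoked rates and then summarizing how those best frequencies are distributed. The sketch below does this on simulated multiunit tuning curves; the tone set, tuning widths and the proportion of low-preferring sites are assumptions chosen to mimic the qualitative result, not data from the study.

# Minimal sketch (illustration, not the published mapping analysis): assign a best
# frequency to each recording site from tone-evoked multiunit rates, then summarize
# how many sites prefer the low end of the tested range.
import numpy as np

rng = np.random.default_rng(4)
freqs_hz = np.logspace(np.log10(220), np.log10(28000), 15)   # hypothetical tone set
log_f = np.log2(freqs_hz)
n_sites = 100

# Hypothetical multiunit rates: Gaussian tuning in log frequency, with most sites
# centered toward the low end of the range (as the mapping above reports).
centers = np.where(rng.random(n_sites) < 0.7, log_f[2],
                   rng.uniform(log_f[0], log_f[-1], n_sites))
rates = 20 * np.exp(-0.5 * ((log_f[None, :] - centers[:, None]) / 1.0) ** 2)
rates += rng.normal(0, 1, rates.shape)

best_freq = freqs_hz[np.argmax(rates, axis=1)]        # best frequency per site
low_fraction = np.mean(best_freq <= 1000)             # share of sites preferring <= 1 kHz
print("median best frequency (Hz):", round(float(np.median(best_freq))))
print("fraction of sites with best frequency <= 1 kHz:", round(float(low_fraction), 2))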