A mutual information analysis of neural coding of speech by low-frequency MEG phase information.


Recent work has implicated low-frequency (<20 Hz) neuronal phase information as important for both auditory (<10 Hz) and speech [theta (∼4-8 Hz)] perception. Activity on the timescale of theta corresponds linguistically to the average length of a syllable, suggesting that information within this range has consequences for segmentation of meaningful units of speech. Longer timescales that correspond to lower frequencies [delta (1-3 Hz)] also reflect important linguistic features (prosodic/suprasegmental), but it is unknown whether the patterns of activity in this range are similar to theta. We investigate low-frequency activity with magnetoencephalography (MEG) and mutual information (MI), an analysis that has not yet been applied to noninvasive electrophysiological recordings. We find that during speech perception each frequency subband examined [delta (1-3 Hz), theta-low (3-5 Hz), theta-high (5-7 Hz)] processes independent information from the speech stream. This contrasts with hypotheses that either delta and theta reflect their corresponding linguistic levels of analysis or each band is part of a single holistic onset response that tracks global acoustic transitions in the speech stream. Single-trial template-based classifier results further validate this finding: information from each subband can be used to classify individual sentences, and classifier results that utilize the combination of frequency bands provide better results than single bands alone. Our results suggest that during speech perception the low-frequency phase of the MEG signal corresponds to neither abstract linguistic units nor holistic evoked potentials but rather tracks different aspects of the input signal. This study also validates a new method of analysis for noninvasive electrophysiological recordings that can be used to formally characterize the information content of neural responses and interactions between these responses.
Furthermore, it bridges results from different levels of neurophysiological study: small-scale multiunit recordings and local field potentials and macroscopic magneto/electrophysiological noninvasive recordings.
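The core quantity in the abstract, the mutual information between a speech stimulus and low-frequency phase, can be illustrated with a minimal plug-in estimator. This is a hypothetical sketch, not the authors' actual pipeline (the paper details their method): it discretizes single-trial phase angles into equally spaced circular bins and computes MI, in bits, between the bin index and a discrete stimulus label.

```python
import numpy as np

def phase_stimulus_mi(phases, labels, n_bins=8):
    """Plug-in estimate of mutual information (bits) between a circular
    phase variable and a discrete stimulus identity.

    phases : 1-D array of phase angles in radians, one per trial
    labels : 1-D array of stimulus identities, one per trial
    n_bins : number of equally spaced phase bins on the circle
    """
    phases = np.mod(np.asarray(phases, dtype=float), 2 * np.pi)
    # Discretize phase into n_bins equal bins over [0, 2*pi).
    bins = np.clip(np.floor(phases / (2 * np.pi) * n_bins).astype(int),
                   0, n_bins - 1)
    stim_ids, stim_idx = np.unique(labels, return_inverse=True)

    # Joint histogram -> joint probability P(phase_bin, stimulus).
    joint = np.zeros((n_bins, len(stim_ids)))
    for b, s in zip(bins, stim_idx):
        joint[b, s] += 1
    joint /= joint.sum()

    # Marginals, then MI = sum P(b,s) log2 [ P(b,s) / (P(b) P(s)) ].
    p_phase = joint.sum(axis=1, keepdims=True)
    p_stim = joint.sum(axis=0, keepdims=True)
    nz = joint > 0
    return float(np.sum(joint[nz] * np.log2(joint[nz] / (p_phase @ p_stim)[nz])))
```

For two stimuli whose trial phases cluster at opposite points on the circle, the estimate approaches 1 bit; shuffling the labels drives it toward zero (up to the small positive bias of the plug-in estimator, which in practice is corrected or assessed against a permutation null).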







Publication Info

Cogan, Gregory B., and David Poeppel (2011). A mutual information analysis of neural coding of speech by low-frequency MEG phase information. J Neurophysiol, 106(2), pp. 554–563. doi:10.1152/jn.00075.2011. Retrieved from https://hdl.handle.net/10161/13995.

This citation is constructed from limited available data and may be imprecise. To cite this article, please review and use the official citation provided by the journal.



Gregory Cogan

Assistant Professor in Neurology

Dr. Cogan's research focuses on speech, language, and cognition. This research uses a variety of analytic techniques (e.g., neural power analysis, connectivity measures, decoding algorithms) and focuses mainly on invasive human recordings (electrocorticography, ECoG), but also uses noninvasive methods such as EEG, MEG, and fMRI. Dr. Cogan is also interested in studying cognitive systems in the context of disease models to help aid recovery and treatment programs.

Unless otherwise indicated, scholarly articles published by Duke faculty members are made available here with a CC-BY-NC (Creative Commons Attribution Non-Commercial) license, as enabled by the Duke Open Access Policy. If you wish to use the materials in ways not already permitted under CC-BY-NC, please consult the copyright owner. Other materials are made available here through the author’s grant of a non-exclusive license to make their work openly accessible.