Visual input enhances selective speech envelope tracking in auditory cortex at a "Cocktail Party"
Date
2013-01-23
Abstract
Our ability to selectively attend to one auditory signal amid competing input streams, epitomized by the "Cocktail Party" problem, continues to stimulate research from various approaches. How this demanding perceptual feat is achieved from a neural systems perspective remains unclear and controversial. It is well established that neural responses to attended stimuli are enhanced compared with responses to ignored ones, but responses to ignored stimuli are nonetheless highly significant, leading to interference in performance. We investigated whether congruent visual input of an attended speaker enhances cortical selectivity in auditory cortex, leading to diminished representation of ignored stimuli. We recorded magnetoencephalographic signals from human participants as they attended to segments of natural continuous speech. Using two complementary methods of quantifying the neural response to speech, we found that viewing a speaker's face enhances the capacity of auditory cortex to track the temporal speech envelope of that speaker. This mechanism was most effective in a Cocktail Party setting, promoting preferential tracking of the attended speaker, whereas without visual input no significant attentional modulation was observed. These neurophysiological results underscore the importance of visual input in resolving perceptual ambiguity in a noisy environment. Since visual cues in speech precede the associated auditory signals, they likely serve a predictive role in facilitating auditory processing of speech, perhaps by directing attentional resources to appropriate points in time when to-be-attended acoustic input is expected to arrive. © 2013 the authors.
Publication Info
Zion Golumbic, Elana, Gregory B. Cogan, Charles E. Schroeder and David Poeppel (2013). Visual input enhances selective speech envelope tracking in auditory cortex at a "Cocktail Party". Journal of Neuroscience, 33(4), pp. 1417–1426. doi:10.1523/JNEUROSCI.3675-12.2013. Retrieved from https://hdl.handle.net/10161/13994.
This citation is constructed from limited available data and may be imprecise. To cite this article, please review and use the official citation provided by the journal.
Scholars@Duke

Gregory Cogan
Dr. Cogan's research focuses on speech, language, and cognition. This research uses a variety of analytic techniques (e.g., neural power analysis, connectivity measures, decoding algorithms) and focuses mainly on invasive human recordings (electrocorticography, ECoG), but also uses non-invasive methods such as EEG, MEG, and fMRI. Dr. Cogan is also interested in studying cognitive systems in the context of disease models to aid recovery and treatment programs.
Unless otherwise indicated, scholarly articles published by Duke faculty members are made available here with a CC-BY-NC (Creative Commons Attribution Non-Commercial) license, as enabled by the Duke Open Access Policy. If you wish to use the materials in ways not already permitted under CC-BY-NC, please consult the copyright owner. Other materials are made available here through the author’s grant of a non-exclusive license to make their work openly accessible.