Neural Mechanisms of Auditory and Visual Search
We live in a world with incredibly rich sensory environments. Our visual system is bombarded with objects of varying colors, shapes, and levels of brightness. Our auditory system is inundated with sounds of varying pitch, timbre, and loudness. Yet we are generally not overwhelmed by our environments, because we can selectively choose which information to interact with. One way of accomplishing such selection is through search: we search our environments so that we can selectively process relevant information and ignore other, irrelevant stimuli. Search has been extensively studied in the visual domain, but there has been very little analogous research into search-type processes in audition. Moreover, even in vision, the research has been mostly limited to the processes of spatial orienting or focusing of attention toward relevant information. The process of search, however, involves additional steps beyond the engagement of spatial attention, including the initial detection and identification of relevant Target stimuli in our environment. Here I aimed to delineate the cascade of neural processes underlying search in both the auditory and visual domains, with particular emphasis on understanding the initial mechanisms underlying stimulus detection and identification.
I conducted four experiments using event-related potentials (i.e., time-locked averages of the electroencephalogram), taking advantage of the high temporal resolution of this methodology to delineate the time course of search-related processes in both audition and vision. Participants were presented with a novel, temporally distributed search paradigm in either the auditory (Studies 1-3) or visual (Study 4) modality, where the task was to find a designated Target and make a discrimination concerning a certain feature of that Target. The temporal distribution of the stimulus presentation enabled the selective extraction of the neural responses to the relevant Target and, separately, to the irrelevant Nontarget, a separation that would not be possible with simultaneous or static presentation. The results showed, in both the auditory and visual domains, a very rapid, nonlateralized differentiation of processing between the Target and sensory-equivalent Nontarget stimuli prior to the brain activity reflecting the spatial focusing of attention toward that Target. Based on results showing a failure of early differentiation when the Targets and Nontargets were presented in isolation, I inferred that this early differentiation results from a Relational Template preset for the Target stimulus relative to an ongoing environmental context. Additional results showed that larger Target/Nontarget differentiation corresponded to faster response times, and that the maintenance of multiple templates, to facilitate search for more than one Target item, resulted in significantly slower processing, potentially due to a serial comparison of the incoming stimuli to each of the templates. These experiments reveal analogous mechanisms underlying both auditory and visual feature-based search that include an initial detection process prior to the orienting of spatial attention.
Department: Psychology and Neuroscience
This work is licensed under a Creative Commons Attribution-Noncommercial-No Derivative Works 3.0 United States License.
Rights for Collection: Duke Dissertations