dc.contributor.advisor Nolte, Loren en_US
dc.contributor.author Qian, Ming en_US
dc.date.accessioned 2008-08-01T12:53:21Z
dc.date.available 2008-08-01T12:53:21Z
dc.date.issued 2008-04-16 en_US
dc.identifier.uri http://hdl.handle.net/10161/669
dc.description Dissertation en_US
dc.description.abstract A series of fusion techniques is developed and applied to EEG and pupillary recordings in a rapid serial visual presentation (RSVP) based image triage task, in order to improve the accuracy of capturing single-trial neural/pupillary signatures (patterns) associated with visual target detection.

The brain response to visual stimuli is not a localized pulse; instead, it reflects time-evolving neurophysiological activity distributed selectively across the brain. To capture this evolving spatio-temporal pattern, we divide an extended ("global") EEG data epoch, time-locked to each image stimulus onset, into multiple non-overlapping smaller ("local") temporal windows. Classifiers are applied to the EEG data in each local temporal window, and the outputs of these local classifiers are fused to enhance the overall detection performance.

According to the concept of induced/evoked brain rhythms, the EEG response can be decomposed into different oscillatory components, and the frequency characteristics of these components can be evaluated separately from the temporal characteristics. While the temporal-based analysis alone achieves fairly accurate detection performance, fusing the frequency-based and temporal-based results at the decision level further improves overall detection accuracy and robustness.

Pupillary response provides another modality for the single-trial image triage task. We developed a pupillary response feature construction and selection procedure to extract and select the features that yield the best classification performance. The classification results from both modalities (pupillary and EEG) are then fused at the decision level; the goal is to increase classification confidence through the inherent complementarity of the modalities. The fusion results show significant improvement over classification using either single modality.

For crucial image triage tasks, multiple image analysts may be asked to evaluate the same set of images to improve the probability of detection and reduce the probability of false positives. We observe a significant performance gain by fusing the decisions drawn by multiple analysts.

To develop a practical real-time EEG-based application system, we sometimes have to work with an EEG system that has a limited number of electrodes. We present methods for ranking the channels and identifying a reduced set of EEG channels that can deliver robust classification performance. en_US
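The window-level decision fusion described in the abstract can be illustrated with a minimal sketch: train one classifier per local temporal window of the EEG epoch and average their target probabilities into a single fused score. The array shapes, the logistic-regression base classifier, and the unweighted probability averaging below are illustrative assumptions chosen for demonstration, not the dissertation's actual pipeline.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Hypothetical data: one feature vector per local temporal window per trial,
    # with binary labels (1 = target image, 0 = non-target).
    rng = np.random.default_rng(0)
    n_trials, n_windows, n_features = 200, 5, 32
    X = rng.normal(size=(n_trials, n_windows, n_features))
    y = rng.integers(0, 2, size=n_trials)

    # Train one "local" classifier per temporal window.
    local_clfs = [
        LogisticRegression(max_iter=1000).fit(X[:, w, :], y)
        for w in range(n_windows)
    ]

    # Decision-level fusion: average the per-window target probabilities
    # and threshold the fused score.
    def fuse_predict(X_new, threshold=0.5):
        probs = np.column_stack(
            [clf.predict_proba(X_new[:, w, :])[:, 1] for w, clf in enumerate(local_clfs)]
        )
        fused = probs.mean(axis=1)  # simple unweighted average across windows
        return (fused >= threshold).astype(int), fused

    labels, scores = fuse_predict(X)

A weighted average or a small meta-classifier trained on the per-window scores would be a natural variant of the same decision-level scheme, and the same pattern extends to fusing temporal-based with frequency-based outputs or EEG with pupillary outputs.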
dc.format.extent 2731773 bytes
dc.format.mimetype application/pdf
dc.language.iso en_US
dc.subject Engineering, Biomedical en_US
dc.subject Engineering, Electronics and Electrical en_US
dc.subject EEG en_US
dc.subject BCI en_US
dc.subject decision fusion en_US
dc.subject cortically coupled computer vision en_US
dc.subject biomedical pattern recognition en_US
dc.subject signal detection en_US
dc.title Fusion Methods for Detecting Neural and Pupil Responses to Task-relevant Visual Stimuli Using Computer Pattern Analysis en_US
dc.type Dissertation en_US
dc.department Electrical and Computer Engineering en_US
duke.embargo.months 6 en_US
dc.date.accessible 2009-02-02T06:00:03Z
