Tarokh, Vahid; Wu, Suya. 2023-06-08. https://hdl.handle.net/10161/27726<p>We consider unnormalized models, in which the probability density function contains an unknown normalization constant. This constant normalizes the model so that its probability density function integrates to one; its computation can be NP-hard and intractable in high-dimensional settings. Classical statistical analyses, e.g., hypothesis testing and quickest change detection, require exact likelihood evaluation of the data-generating distribution. They may be infeasible for unnormalized models because of the complexity of computing the explicit probability density function. In this dissertation, we address this difficulty by replacing likelihood estimation with an alternative estimation procedure, score matching, and by developing new approaches to the analysis of unnormalized models and their applications.</p><p>This dissertation is organized into three parts. We first derive a new test statistic for hypothesis testing of unnormalized models. The test statistic is based on the difference between the Hyvärinen scores of the null and alternative distributions. Under some reasonable conditions, we provide the asymptotic distribution of this test statistic under the null hypothesis. When this distribution cannot be expressed in closed form, we outline a bootstrap approach to learn the critical values and provide consistency guarantees.</p><p>Next, we consider sequential analysis. We develop a new variant of the classical Cumulative Sum (CUSUM) algorithm for quickest change detection. This variant is again based on the Hyvärinen score and is called the Score-based CUSUM (SCUSUM) algorithm. The asymptotic optimality of the proposed algorithm is investigated by deriving expressions for the average detection delay and the average run length to false alarm.
We further extend SCUSUM to a robust scenario, in which we introduce a notion of the "least favorable" distribution in the sense of Fisher divergence. Accordingly, we derive an asymptotic analysis of the detection delay and false alarm rate for the robust SCUSUM.</p><p>Finally, we study the application of score-based generative methods to cross-subject mapping of neural activity. The objective is to obtain a task-specific representation of the source subject's neural signals in the feature space of the destination subject. In principle, the mapping function can be assumed to be purely deterministic. Alternatively, we propose a probabilistic approach, in which we learn a conditional probability distribution of destination features given source features. Specifically, we consider learning a Restricted Boltzmann Machine with Gaussian inputs and Bernoulli hidden units (Gauss-Bernoulli RBM). We derive the closed-form gradient for learning the Gauss-Bernoulli RBM by minimizing the Fisher divergence, and the trained RBM generates task-specific representations of the source subjects in the feature space of the destination subject.</p><p>Each contribution chapter is accompanied by thorough numerical results demonstrating the potential and limitations of the proposed approaches against benchmarks in various scenarios.</p>
Subjects: Electrical engineering; Statistics
Keywords: cross-subject learning; hypothesis testing; out-of-distribution detection; quickest change detection; score matching; unnormalized model
Title: Score-based Approach to Analysis of Unnormalized Models and Applications
Type: Dissertation
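<p>As an illustrative sketch of the score-based recursion described in the abstract (not code from the dissertation): SCUSUM replaces the log-likelihood-ratio increment of classical CUSUM with a difference of Hyvärinen scores, which for a Gaussian model is available in closed form. The means <code>mu0</code>, <code>mu1</code>, variance <code>sigma2</code>, and threshold below are hypothetical choices for illustration.</p>

```python
import numpy as np

def hyvarinen_score_gaussian(x, mu, sigma2):
    """Hyvarinen score of N(mu, sigma2 * I) at x:
    0.5 * ||grad_x log p(x)||^2 + laplacian_x log p(x)."""
    d = x.size
    grad = -(x - mu) / sigma2   # gradient of log-density
    lap = -d / sigma2           # Laplacian of log-density
    return 0.5 * np.sum(grad ** 2) + lap

def scusum(samples, mu0, mu1, sigma2, threshold):
    """Score-based CUSUM: Lambda_n = max(Lambda_{n-1} + z_n, 0),
    with increment z_n = S_H(x_n; pre-change) - S_H(x_n; post-change),
    which has positive expectation after the change (Fisher divergence)
    and negative expectation before it. Returns the first index n at
    which the statistic crosses the threshold, or None."""
    stat = 0.0
    for n, x in enumerate(samples, start=1):
        z = (hyvarinen_score_gaussian(x, mu0, sigma2)
             - hyvarinen_score_gaussian(x, mu1, sigma2))
        stat = max(stat + z, 0.0)
        if stat >= threshold:
            return n
    return None
```

<p>Note that, unlike classical CUSUM, no normalization constant of either density is needed: the Hyvärinen score depends only on derivatives of the log-density, which is what makes the statistic computable for unnormalized models.</p>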