Nonparametric Mixture Models for Covariance Matrix Estimation and Hypothesis Testing with Applications in Neuroscience

Date

2024

Abstract

This dissertation investigates the application of nonparametric mixture models in hypothesis testing and high-dimensional covariance matrix estimation, where uncovering complex and latent structure is essential. Nonparametric mixture models provide a flexible, data-driven solution with minimal assumptions, making them suitable for complex and high-dimensional applications.

The motivation for this research arises from neuroscience, specifically the study of how individual neurons and neural populations encode multiple simultaneous stimuli. In Chapter 2, we develop a comprehensive statistical testing framework for detecting "code juggling" in single neurons, where a neuron's spiking activity dynamically switches between the patterns associated with each constituent single stimulus. This framework refines previous approaches by reducing false detections of code juggling and identifying faster fluctuations. A Bayesian inference framework, incorporating predictive recursion for marginal likelihood estimation, is applied to reanalyze earlier findings. In Chapter 3, we extend the focus beyond individual neuron behavior to investigate neural population coordination using block-structured covariance matrices. The block structure introduces sparsity and natural shrinkage within blocks, effectively reducing dimensionality and adapting to a variety of covariance structures. We propose a hierarchical Bayesian method that combines a shrinkage prior with a mixture of finite mixtures model to estimate both the latent block structure and the overall covariance matrix. Numerical experiments demonstrate the flexibility and adaptability of this approach, showing that it outperforms alternative methods across diverse applications. Chapter 4 presents two shrinkage estimators within Stein's family, integrating predictive recursion for adaptive density estimation: a technical improvement to the Ledoit-Wolf shrinkage estimator and a Bayesian estimator that mimics a "local attraction force". These estimators show advantages both when the number of variables $p$ is small and when it exceeds the sample size ($p > n$).

Subjects

Statistics

Citation

Chen, Yunran (2024). Nonparametric Mixture Models for Covariance Matrix Estimation and Hypothesis Testing with Applications in Neuroscience. Dissertation, Duke University. Retrieved from https://hdl.handle.net/10161/32629.

Collections


Except where otherwise noted, student scholarship that was shared on DukeSpace after 2009 is made available to the public under a Creative Commons Attribution / Non-commercial / No derivatives (CC-BY-NC-ND) license. All rights in student work shared on DukeSpace before 2009 remain with the author and/or their designee, whose permission may be required for reuse.