Geometry-Informed Sampling and Estimation for Constrained Statistical Inference
Date
2025
Abstract
The geometric structure of a parameter space is a foundational consideration in statistical inference. From restricting the feasible domain for parameter estimates to informing inferential procedures, the geometry of a parameter space is inextricably linked to the underlying statistical task. This dissertation addresses how the geometry of a parameter space informs sampling and estimation from two specific viewpoints: bounded domains and the presence of externally imposed constraints.
Chapter 2 lays the groundwork for sampling from a distribution whose parameter space is restricted by externally imposed constraints. Distance-to-set priors are introduced as a means of constraint relaxation and are integrated seamlessly into state-of-the-art Hamiltonian Monte Carlo methods. Theoretical convergence guarantees are given, and the computational benefits of sampling near the boundary of a constraint using distance-to-set priors are demonstrated through multiple applications.
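The constraint-relaxation idea can be illustrated with a minimal sketch: a distance-to-set prior places log-density proportional to the negative squared Euclidean distance from the parameter to the constraint set, computed via a projection operator. The projection, the penalty scale `rho`, and the nonnegativity example below are illustrative assumptions, not the dissertation's specific construction.

```python
import numpy as np

def dist_to_set_sq(theta, project):
    """Squared Euclidean distance from theta to the constraint set C,
    computed via a projection operator onto C."""
    return np.sum((theta - project(theta)) ** 2)

def log_dts_prior(theta, project, rho=100.0):
    """Log of a distance-to-set prior, -rho * d(theta, C)^2.
    Larger rho concentrates the relaxation more sharply near C;
    the log-prior is exactly zero for theta already in C."""
    return -rho * dist_to_set_sq(theta, project)

# Example constraint: the nonnegative orthant, whose projection is clipping.
project_nonneg = lambda t: np.maximum(t, 0.0)

theta = np.array([0.5, -0.2, 1.0])
print(log_dts_prior(theta, project_nonneg, rho=100.0))  # ≈ -4.0
```

Because the penalty is differentiable almost everywhere whenever the projection is available, it composes naturally with gradient-based samplers such as HMC.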
Chapter 3 turns to estimation, focusing on a theoretical analysis of the stochastic projected mirror descent algorithm. This algorithm is of particular interest because it is domain-aware, regularizing with Bregman divergences while also accommodating additional external constraints. After establishing asymptotic consistency via the ODE method, this chapter strengthens a well-known result on asymptotic efficiency by showing that stochastic projected mirror descent attains the constrained Cramér-Rao bound under regularity conditions. Applications such as structured covariance matrix estimation and low-rank matrix factorization are considered.
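A minimal sketch of the domain-aware flavor of mirror descent: under the negative-entropy mirror map, the Bregman-regularized update reduces to exponentiated gradient steps that keep iterates on the probability simplex automatically. The toy objective, step-size schedule, and noise model below are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def entropic_mirror_step(x, grad, step):
    """One stochastic mirror descent step with the negative-entropy
    mirror map: multiplicative (exponentiated-gradient) update followed
    by renormalization, so x stays on the probability simplex."""
    w = x * np.exp(-step * grad)
    return w / w.sum()

# Toy stochastic objective: minimize E[(a^T x - b)^2] over the simplex,
# observing a noisy coefficient vector at each iteration.
a_true = np.array([0.2, 0.5, 0.3])
b = 0.4
x = np.ones(3) / 3
for t in range(1, 2001):
    a = a_true + 0.01 * rng.standard_normal(3)   # noisy observation
    grad = 2.0 * (a @ x - b) * a                 # stochastic gradient
    x = entropic_mirror_step(x, grad, step=0.5 / np.sqrt(t))

print(x, abs(a_true @ x - b))
```

The decaying step size mirrors the Robbins-Monro conditions typically assumed in ODE-method consistency arguments; the external projection step of the full algorithm is omitted here since the mirror map alone already enforces the simplex domain.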
Leveraging the properties of Bregman divergences introduced in the previous chapter, Chapter 4 returns to the problem of sampling, adapting the ideas of Chapter 2 to parameter spaces characterized by constrained domains. A framework for auxiliary-variable gradient-based samplers is extended to be domain-aware. Variations of this framework allow for non-Euclidean distance-to-set priors as well as fully Bayesian determination of sampler step sizes.
In the final chapter, this dissertation explores a case study of constrained sampling by constructing an order-restricted Bayesian ordinal regression model to quantify the relationship between environmental toxicity and neuron degeneration. Enforcing an isotonic dose-response curve in the presence of non-Gaussian, ordinal data presents a challenging sampling task. A bespoke prior is introduced to enforce this structure on a latent scale, and a Gibbs sampler is proposed as an alternative to gradient-based samplers. Results and analysis for this model are presented.
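The order restriction at the heart of the final chapter can be illustrated with a standard building block: the pool-adjacent-violators algorithm (PAVA), which is the least-squares projection of a sequence onto the set of nondecreasing sequences. This generic sketch shows only the isotonic projection, not the dissertation's bespoke prior or its Gibbs sampler.

```python
import numpy as np

def pava(y):
    """Pool-adjacent-violators: least-squares projection of y onto the
    cone of nondecreasing sequences (isotonic regression). Maintains a
    stack of blocks (mean, count), merging adjacent blocks whenever the
    monotonicity constraint is violated."""
    blocks = []  # stack of (block mean, block size)
    for v in np.asarray(y, dtype=float):
        m, c = v, 1.0
        while blocks and blocks[-1][0] > m:
            pm, pc = blocks.pop()
            m = (pm * pc + m * c) / (pc + c)  # pooled mean of merged block
            c += pc
        blocks.append((m, c))
    out = []
    for m, c in blocks:
        out.extend([m] * int(c))
    return np.array(out)

# Violating pair (3, 2) is pooled to its mean: [1, 2.5, 2.5, 4]
print(pava([1.0, 3.0, 2.0, 4.0]))
```

In an order-restricted dose-response setting, a projection of this kind is one natural way to enforce an isotonic curve on a latent scale at each update of a sampler.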
Citation
Presman, Rick (2025). Geometry-Informed Sampling and Estimation for Constrained Statistical Inference. Dissertation, Duke University. Retrieved from https://hdl.handle.net/10161/34095.
Except where otherwise noted, student scholarship that was shared on DukeSpace after 2009 is made available to the public under a Creative Commons Attribution / Non-commercial / No derivatives (CC-BY-NC-ND) license. All rights in student work shared on DukeSpace before 2009 remain with the author and/or their designee, whose permission may be required for reuse.
