Browsing by Subject "math.ST"
Now showing 1 - 5 of 5
Item (Open Access): A Large Deviation Approach to Posterior Consistency in Dynamical Systems
Su, Langxuan; Mukherjee, Sayan

In this paper, we provide asymptotic results concerning (generalized) Bayesian inference for certain dynamical systems, based on a large deviation approach. Given a sequence of observations $y$, a class of model processes parameterized by $\theta \in \Theta$, each of which can be characterized as a stochastic process $X^\theta$ or a measure $\mu_\theta$, and a loss function $L$ which measures the error between $y$ and a realization of $X^\theta$, we specify the generalized posterior distribution $\pi_t(\theta \mid y)$. The goal of this paper is to study the asymptotic behavior of $\pi_t(\theta \mid y)$ as $t \to \infty$. In particular, we state conditions on the model family $\{\mu_\theta\}_{\theta \in \Theta}$ and the loss function $L$ under which the posterior distribution converges. The two conditions we require are: (1) a conditional large deviation behavior for a single $X^\theta$, and (2) an exponential continuity condition over the model family for the map from the parameter $\theta$ to the loss incurred between $X^\theta$ and the observation sequence $y$. The proposed framework is quite general; we apply it to two very different classes of dynamical systems: continuous-time hypermixing processes and Gibbs processes on shifts of finite type. We also show that the generalized posterior distribution concentrates asymptotically on those parameters that minimize the expected loss and a divergence term, hence proving posterior consistency.

Item (Open Access): A Priori Generalization Analysis of the Deep Ritz Method for Solving High Dimensional Elliptic Equations
Lu, Jianfeng; Lu, Yulong; Wang, Min

This paper concerns the a priori generalization analysis of the Deep Ritz Method (DRM) [W. E and B. Yu, 2017], a popular neural-network-based method for solving high-dimensional partial differential equations. We derive generalization error bounds for two-layer neural networks in the framework of the DRM for solving two prototype elliptic PDEs: the Poisson equation and the static Schrödinger equation on the $d$-dimensional unit hypercube. Specifically, we prove that the convergence rates of the generalization errors are independent of the dimension $d$, under the a priori assumption that the exact solutions of the PDEs lie in a suitable low-complexity space called the spectral Barron space. Moreover, we give sufficient conditions on the forcing term and the potential function which guarantee that the solutions are spectral Barron functions. We achieve this by developing a new solution theory for the PDEs on the spectral Barron space, which can be viewed as an analog of the classical Sobolev regularity theory for PDEs.

Item (Open Access): Finite-Time Frequentist Regret Bounds of Multi-Agent Thompson Sampling on Sparse Hypergraphs (2023-12-24)
Jin, Tianyuan; Hsu, Hao-Lun; Chang, William; Xu, Pan

Item (Open Access): On the spectral property of kernel-based sensor fusion algorithms of high dimensional data
Ding, X; Wu, HT

In this paper, we apply local laws of random matrices and free probability theory to study the spectral properties of two kernel-based sensor fusion algorithms, nonparametric canonical correlation analysis (NCCA) and alternating diffusion (AD), for two sequences of random vectors $\mathcal{X}:=\{\mathbf{x}_i\}_{i=1}^n$ and $\mathcal{Y}:=\{\mathbf{y}_i\}_{i=1}^n$ under the null hypothesis. The matrix of interest is a product of the kernel matrices associated with $\mathcal{X}$ and $\mathcal{Y}$, which may not be diagonalizable in general. We prove that, in the regime where the dimensions of both random vectors are comparable to the sample size, if NCCA and AD are conducted with a smooth kernel function, then the first few nontrivial eigenvalues converge to real deterministic values, provided $\mathcal{X}$ and $\mathcal{Y}$ are independent Gaussian random vectors. We propose an eigenvalue-ratio test based on the real parts of the eigenvalues of the product matrix to test whether $\mathcal{X}$ and $\mathcal{Y}$ are independent and do not share common information. A simulation study verifies the usefulness of this statistic.

Item (Open Access): Practical tests for significance in Markov Chains
Chikina, M; Frieze, A; Mattingly, JC; Pegden, W

We give improvements to theorems which enable significance testing in Markov chains.
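The first abstract above (Su and Mukherjee) specifies a generalized posterior $\pi_t(\theta \mid y)$ built from a loss function $L$. A common concrete form for such objects is the Gibbs posterior; the following is an illustrative sketch under assumed notation ($\pi_0$ a prior on $\Theta$, $L_t(\theta, y)$ the loss accumulated up to time $t$), not necessarily the paper's exact definition:

```latex
% Illustrative Gibbs-posterior form (assumed notation: prior \pi_0,
% time-t loss L_t; the paper's exact definition may differ).
\[
  \pi_t(\theta \mid y)
  = \frac{\exp\!\bigl(-t\,L_t(\theta, y)\bigr)\,\pi_0(\mathrm{d}\theta)}
         {\displaystyle\int_{\Theta} \exp\!\bigl(-t\,L_t(\theta', y)\bigr)\,\pi_0(\mathrm{d}\theta')}
\]
```

Under this form, as $t \to \infty$ the exponential weighting concentrates posterior mass on parameters with the smallest asymptotic loss, which is the shape of the consistency statement in the abstract.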
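The Deep Ritz abstract (Lu, Lu, and Wang) concerns two-layer networks trained on a variational (Ritz) energy. As a hedged sketch of the basic idea, the following numpy code evaluates a Monte Carlo estimate of the Ritz energy $\mathbb{E}\big[\tfrac{1}{2}|\nabla u(X)|^2 - f(X)\,u(X)\big]$ for the Poisson problem on the unit hypercube, with a two-layer tanh network; all function names, the tanh architecture, and the omission of boundary/normalization terms are illustrative assumptions of ours, not the paper's exact objective.

```python
import numpy as np

def init_two_layer(d, m, seed=0):
    """Random parameters of a two-layer tanh network u(x) = a . tanh(W x + b)."""
    rng = np.random.default_rng(seed)
    return {
        "W": rng.standard_normal((m, d)) / np.sqrt(d),
        "b": rng.standard_normal(m),
        "a": rng.standard_normal(m) / np.sqrt(m),
    }

def forward_with_grad(params, xs):
    """Evaluate u and its spatial gradient at each row of xs (shape (n, d))."""
    z = xs @ params["W"].T + params["b"]                 # (n, m) pre-activations
    t = np.tanh(z)
    u = t @ params["a"]                                  # (n,) network outputs
    # d/dx tanh(w_k . x + b_k) = (1 - tanh^2) w_k, weighted by a_k and summed
    grad_u = ((1.0 - t**2) * params["a"]) @ params["W"]  # (n, d)
    return u, grad_u

def ritz_energy(params, f, xs):
    """Monte Carlo estimate of E[0.5 * |grad u(X)|^2 - f(X) u(X)],
    with X uniform on the domain (rows of xs are the samples)."""
    u, grad_u = forward_with_grad(params, xs)
    return np.mean(0.5 * np.sum(grad_u**2, axis=1) - f(xs) * u)

# Usage sketch: samples drawn uniformly from the d-dimensional unit hypercube.
rng = np.random.default_rng(1)
d, m, n = 10, 16, 256
params = init_two_layer(d, m)
xs = rng.random((n, d))
energy = ritz_energy(params, lambda x: np.ones(x.shape[0]), xs)
```

In the DRM this energy would be minimized over the network parameters (e.g. by stochastic gradient descent), since the minimizer of the Ritz functional is the weak solution of the Poisson equation.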
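The sensor-fusion abstract (Ding and Wu) studies the eigenvalues of a product of two kernel matrices and proposes an eigenvalue-ratio test. The numpy sketch below shows only the basic computation: form the two Gaussian kernel matrices, take the eigenvalues of their (generally non-symmetric) product, and compare real parts. The Gaussian kernel, the fixed bandwidth, and this particular ratio are illustrative choices of ours, not the paper's exact statistic.

```python
import numpy as np

def gaussian_kernel(data, bandwidth=1.0):
    """Gaussian kernel matrix K[i, j] = exp(-||x_i - x_j||^2 / bandwidth^2)."""
    sq = np.sum(data**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * data @ data.T
    return np.exp(-np.maximum(d2, 0.0) / bandwidth**2)

def product_eigenvalues(X, Y, bandwidth=1.0):
    """Real parts of the eigenvalues of K_x @ K_y, sorted in decreasing order.

    The product of the two kernel matrices is not symmetric in general, so
    its eigenvalues may a priori be complex; the abstract works with their
    real parts, which motivates taking .real here.
    """
    Kx = gaussian_kernel(X, bandwidth)
    Ky = gaussian_kernel(Y, bandwidth)
    vals = np.linalg.eigvals(Kx @ Ky)
    return np.sort(vals.real)[::-1]

def eigenvalue_ratio(X, Y, bandwidth=1.0):
    """Illustrative ratio of two leading nontrivial real eigenvalues."""
    ev = product_eigenvalues(X, Y, bandwidth)
    # Skip ev[0] (typically dominated by the kernel's near-constant
    # component) and compare the next two eigenvalues.
    return ev[1] / ev[2]
```

Under the null hypothesis of the abstract (independent high-dimensional Gaussian samples), the leading nontrivial eigenvalues converge to deterministic limits, so a ratio of this kind has a pinned-down null behavior that a test can calibrate against.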