Duke University Libraries
DukeSpace Scholarship by Duke Authors

Approximate Inference for High-Dimensional Latent Variable Models

Date
2018
Author
Tan, Zilong
Advisor
Mukherjee, Sayan
Abstract

Latent variable models are widely used in applications ranging from natural language processing to recommender systems. Exact inference using maximum likelihood for these models is generally NP-hard, and computationally prohibitive on big and/or high-dimensional data. This has motivated the development of approximate inference methods that balance computational complexity against statistical efficiency. Understanding this computational-statistical tradeoff is important both for analyzing existing approximate inference approaches and for designing new ones. Toward this goal, this dissertation presents a study of new approximate inference algorithms with provable guarantees for three classes of inference tasks.

The first class is based on the method of moments. Inference in this setting typically reduces to a tensor decomposition problem, which requires decomposing a $p$-by-$p$-by-$p$ estimator tensor for $p$ variables. We apply a divide-and-conquer strategy to the tensor method, decomposing $O\left(p/k\right)$ sub-tensors, each of size $O\left(k^3\right)$, instead of the full tensor; this achieves a significant reduction in computational complexity when the number of latent variables $k$ is small. Our approach can also enforce nonnegativity of the estimates for inferring nonnegative model parameters. Theoretical analysis gives sufficient conditions ensuring robustness of the divide-and-conquer method, as well as a proof of linear convergence for the nonnegative factorization.
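
To make the divide-and-conquer idea concrete, here is a minimal sketch, not the dissertation's exact algorithm: a synthetic third-order moment tensor with orthonormal factors is built, and standard tensor power iteration (a common method-of-moments workhorse) is run on small index-block sub-tensors of size $O(k^3)$ rather than the full $p \times p \times p$ tensor. The block size and all variable names are illustrative, and the step that merges per-block estimates into global factors is omitted.

```python
import numpy as np

def power_iteration(T, n_iter=200, seed=0):
    # Tensor power iteration: v <- T(I, v, v), normalized each step.
    # For an orthogonally decomposable symmetric tensor this converges
    # to one of the components.
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(T.shape[0])
    v /= np.linalg.norm(v)
    for _ in range(n_iter):
        v = np.einsum('ijk,j,k->i', T, v, v)
        v /= np.linalg.norm(v)
    lam = np.einsum('ijk,i,j,k->', T, v, v, v)  # component weight estimate
    return lam, v

# Synthetic third-order moment tensor for p variables, k latent components.
p, k = 30, 3
rng = np.random.default_rng(1)
A, _ = np.linalg.qr(rng.standard_normal((p, k)))   # orthonormal factor columns
w = np.array([3.0, 2.0, 1.0])                      # component weights
T = np.einsum('r,ir,jr,kr->ijk', w, A, A, A)       # p x p x p estimator tensor

# Divide and conquer: decompose small sub-tensors over index blocks
# instead of the full p x p x p tensor.  (Merging the per-block
# estimates back into global factors is omitted in this sketch.)
block_size = 10                                    # illustrative, O(k) in spirit
blocks = [np.arange(s, s + block_size) for s in range(0, p, block_size)]
for b in blocks:
    sub = T[np.ix_(b, b, b)]                       # block_size^3 sub-tensor
    lam_b, v_b = power_iteration(sub)              # per-block component estimate
```

Each sub-tensor costs $O(k^3)$ per power-iteration step rather than $O(p^3)$, which is where the computational savings come from when $k \ll p$.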

In the second class, we further consider mixed-effect models in which the variance of the latent variables must also be inferred. We present approximate estimators that have closed-form analytical expressions. We also develop fast computational techniques based on the subsampled randomized Hadamard transform, achieving complexity sublinear in the dimension; this makes our approach useful for high-dimensional applications such as genome-wide association studies. Moreover, we provide theoretical analysis that establishes provable error guarantees for the approximation.
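
For reference, the subsampled randomized Hadamard transform itself has a standard three-step construction: a random sign flip $D$, an orthogonal Hadamard mixing $H$ applied via the $O(n \log n)$ fast Walsh-Hadamard transform, and uniform subsampling $S$ of $m$ coordinates. The sketch below shows that construction only; it is not the dissertation's estimator, and the function names are illustrative.

```python
import numpy as np

def fwht(x):
    # Iterative fast Walsh-Hadamard transform (unnormalized),
    # O(n log n) for n a power of 2.
    x = x.astype(float).copy()
    n = len(x)
    h = 1
    while h < n:
        for i in range(0, n, 2 * h):
            a = x[i:i + h].copy()
            b = x[i + h:i + 2 * h].copy()
            x[i:i + h] = a + b           # butterfly: sums
            x[i + h:i + 2 * h] = a - b   # butterfly: differences
        h *= 2
    return x

def srht(x, m, rng):
    # Subsampled randomized Hadamard transform of a length-n vector
    # (n a power of 2): sqrt(n/m) * S H D x.
    n = len(x)
    signs = rng.choice([-1.0, 1.0], size=n)     # D: random diagonal signs
    mixed = fwht(signs * x) / np.sqrt(n)        # H D x, orthonormal scaling
    idx = rng.choice(n, size=m, replace=False)  # S: uniform coordinate sample
    return np.sqrt(n / m) * mixed[idx]
```

Because $H/\sqrt{n}$ is orthogonal, the mixing step preserves Euclidean norms exactly, and the scaled subsampling preserves them in expectation, which is what makes the transform useful as a fast dimension-reducing sketch.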

The last class concerns more general inference in an infinite-dimensional function space specified by a Gaussian process (GP) prior. We provide a dual formulation of GPs using random functions in a reproducing kernel Hilbert space (RKHS), where the function representation is specified by latent variables. We show that the dual GP can realize an expanded class of functions, and can also be well approximated by a low-dimensional sufficient dimension reduction subspace of the RKHS. We develop a fast learning algorithm for the dual GP that improves upon the state-of-the-art computational complexity of GPs.
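
As a loose illustration of why a weight-space (dual) view of a GP can be cheaper, and not the dissertation's construction, the standard random Fourier feature approximation replaces an RBF-kernel GP with explicit finite-dimensional features, so posterior-mean regression costs $O(nm^2)$ for $m$ features instead of the usual $O(n^3)$ for $n$ points. All parameter values here are illustrative.

```python
import numpy as np

def rff_features(X, n_features, lengthscale, rng):
    # Random Fourier features approximating an RBF kernel:
    # k(x, x') ~= phi(x) @ phi(x') with phi in R^{n_features}.
    d = X.shape[1]
    W = rng.standard_normal((d, n_features)) / lengthscale
    b = rng.uniform(0.0, 2.0 * np.pi, n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

# GP-style regression in the explicit weight space.
rng = np.random.default_rng(0)
n, m = 200, 100
X = rng.uniform(-3, 3, (n, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(n)

Phi = rff_features(X, m, lengthscale=1.0, rng=rng)
noise = 0.1 ** 2
# Posterior-mean weights: (Phi^T Phi + noise I)^{-1} Phi^T y,
# an m x m solve rather than an n x n one.
w = np.linalg.solve(Phi.T @ Phi + noise * np.eye(m), Phi.T @ y)
pred = Phi @ w
```

The latent weights `w` play the role of the function representation: the GP over functions becomes a finite-dimensional distribution over weights, which is the spirit (though not the specifics) of the dual formulation described above.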

Description
Dissertation
Type
Dissertation
Department
Computer Science
Subject
Computer science
approximate inference
latent variable models
machine learning
Permalink
https://hdl.handle.net/10161/18280
Citation
Tan, Zilong (2018). Approximate Inference for High-Dimensional Latent Variable Models. Dissertation, Duke University. Retrieved from https://hdl.handle.net/10161/18280.
Collections
  • Duke Dissertations
Creative Commons License
This work is licensed under a Creative Commons Attribution-Noncommercial-No Derivative Works 3.0 United States License.

Rights for Collection: Duke Dissertations


Works are deposited here by their authors, and represent their research and opinions, not those of Duke University. Some materials and descriptions may include offensive content.
