Nonparametric Bayesian Dictionary Learning and Count and Mixture Modeling

dc.contributor.advisor

Carin, Lawrence

dc.contributor.author

Zhou, Mingyuan

dc.date.accessioned

2013-05-13T15:34:20Z

dc.date.available

2013-05-13T15:34:20Z

dc.date.issued

2013

dc.department

Electrical and Computer Engineering

dc.description.abstract

Analyzing ever-increasing data of unprecedented scale, dimensionality, diversity, and complexity poses considerable challenges to conventional approaches to statistical modeling. Bayesian nonparametrics constitute a promising research direction, in that such techniques fit the data with a model whose complexity can grow to match the data. In this dissertation we consider nonparametric Bayesian modeling with completely random measures, a family of pure-jump stochastic processes with nonnegative increments. In particular, we study dictionary learning for sparse image representation using the beta process and the dependent hierarchical beta process, and we present the negative binomial process, a novel nonparametric Bayesian prior that unites the seemingly disjoint problems of count and mixture modeling. We show a wide variety of successful applications of our nonparametric Bayesian latent variable models to real problems in science and engineering, including count modeling, text analysis, image processing, compressive sensing, and computer vision.
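
The link between count and mixture modeling mentioned in the abstract rests on the classical gamma-Poisson construction of the negative binomial distribution. As a minimal sketch (not taken from the record itself, and assuming the standard shape-scale parametrization of the gamma distribution; the dissertation's own notation may differ), marginalizing a gamma-distributed rate out of a Poisson count yields a negative binomial count, and normalizing such counts across atoms yields mixture proportions:

\[
n \mid \lambda \sim \mathrm{Poisson}(\lambda), \qquad
\lambda \sim \mathrm{Gamma}\!\left(r,\ \tfrac{p}{1-p}\right)
\;\Longrightarrow\;
P(n) = \int_0^\infty \mathrm{Poisson}(n;\lambda)\,\mathrm{Gamma}\!\left(\lambda;\, r,\ \tfrac{p}{1-p}\right)\, d\lambda
= \frac{\Gamma(n+r)}{n!\,\Gamma(r)}\, p^{\,n} (1-p)^{r},
\]

i.e., \(n \sim \mathrm{NB}(r, p)\).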

dc.identifier.uri

https://hdl.handle.net/10161/7204

dc.subject

Electrical engineering

dc.subject

Statistics

dc.subject

Computer science

dc.subject

Bayesian nonparametrics

dc.subject

Count Modeling

dc.subject

Dictionary learning

dc.subject

Mixture modeling

dc.subject

Negative Binomial Process

dc.subject

Topic modeling

dc.title

Nonparametric Bayesian Dictionary Learning and Count and Mixture Modeling

dc.type

Dissertation

Files

Original bundle

Name: Zhou_duke_0066D_11883.pdf
Size: 11.34 MB
Format: Adobe Portable Document Format
