Augment-and-conquer negative binomial processes

dc.contributor.author: Zhou, M
dc.contributor.author: Carin, L
dc.date.accessioned: 2014-07-22T16:16:15Z
dc.date.issued: 2012-12-01

dc.description.abstract:

By developing data augmentation methods unique to the negative binomial (NB) distribution, we unite seemingly disjoint count and mixture models under the NB process framework. We develop fundamental properties of the models and derive efficient Gibbs sampling inference. We show that the gamma-NB process can be reduced to the hierarchical Dirichlet process with normalization, highlighting its unique theoretical, structural and computational advantages. A variety of NB processes with distinct sharing mechanisms are constructed and applied to topic modeling, with connections to existing algorithms, showing the importance of inferring both the NB dispersion and probability parameters.
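The data augmentation the abstract refers to builds on the standard gamma-Poisson mixture representation of the negative binomial distribution: drawing a rate from a gamma distribution and then a count from a Poisson with that rate yields an NB-distributed count marginally. A minimal sketch of this representation (using NumPy; the function name and parameterization here are illustrative, not taken from the paper's code):

```python
import numpy as np

def sample_nb_gamma_poisson(r, p, size, rng):
    """Draw NB(r, p) counts via the gamma-Poisson mixture.

    lambda ~ Gamma(shape=r, scale=p/(1-p)), then n | lambda ~ Poisson(lambda)
    gives n ~ NB(dispersion r, probability p) marginally, with
    mean r*p/(1-p) and variance r*p/(1-p)^2.
    """
    lam = rng.gamma(shape=r, scale=p / (1.0 - p), size=size)
    return rng.poisson(lam)

rng = np.random.default_rng(0)
counts = sample_nb_gamma_poisson(r=2.0, p=0.5, size=200_000, rng=rng)
# Sample mean should be close to r*p/(1-p) = 2.0.
```

Conditioning on the latent gamma rates is what makes Gibbs sampling tractable in such models: given the rates, the counts are conditionally Poisson, and given the counts, the rates have a conjugate gamma update.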

dc.identifier.issn: 1049-5258
dc.identifier.uri: https://hdl.handle.net/10161/8950
dc.relation.ispartof: Advances in Neural Information Processing Systems
dc.title: Augment-and-conquer negative binomial processes
dc.type: Journal article
pubs.begin-page: 2546
pubs.end-page: 2554
pubs.organisational-group: Duke
pubs.organisational-group: Electrical and Computer Engineering
pubs.organisational-group: Pratt School of Engineering
pubs.publication-status: Published
pubs.volume: 4

Files

Original bundle

Name: 1209.1119v2.pdf
Size: 785.34 KB
Format: Adobe Portable Document Format
Description: Published version