Show simple item record

Augment-and-conquer negative binomial processes

dc.contributor.author Carin, Lawrence
dc.contributor.author Zhou, M
dc.date.accessioned 2014-07-22T16:16:15Z
dc.date.issued 2012-12-01
dc.identifier.issn 1049-5258
dc.identifier.uri https://hdl.handle.net/10161/8950
dc.description.abstract By developing data augmentation methods unique to the negative binomial (NB) distribution, we unite seemingly disjoint count and mixture models under the NB process framework. We develop fundamental properties of the models and derive efficient Gibbs sampling inference. We show that the gamma-NB process can be reduced to the hierarchical Dirichlet process with normalization, highlighting its unique theoretical, structural and computational advantages. A variety of NB processes with distinct sharing mechanisms are constructed and applied to topic modeling, with connections to existing algorithms, showing the importance of inferring both the NB dispersion and probability parameters.
dc.relation.ispartof Advances in Neural Information Processing Systems
dc.title Augment-and-conquer negative binomial processes
dc.type Journal article
pubs.begin-page 2546
pubs.end-page 2554
pubs.organisational-group Duke
pubs.organisational-group Electrical and Computer Engineering
pubs.organisational-group Pratt School of Engineering
pubs.publication-status Published
pubs.volume 4
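The abstract's NB process framework rests on a standard fact that can be checked directly: a negative binomial count NB(r, p) arises as a gamma-Poisson mixture, drawing a rate λ ~ Gamma(shape r, scale p/(1−p)) and then a count k ~ Poisson(λ). The sketch below is illustrative only (function names and the simple Knuth Poisson sampler are my own, not from the paper) and demonstrates that the mixture's empirical mean matches the NB mean r·p/(1−p).

```python
import math
import random

def sample_poisson(lam):
    # Knuth's multiplicative method; adequate for moderate lam
    # (exp(-lam) underflows for very large lam).
    L = math.exp(-lam)
    k, prod = 0, 1.0
    while True:
        k += 1
        prod *= random.random()
        if prod <= L:
            return k - 1

def sample_nb(r, p):
    # NB(r, p) as a gamma-Poisson mixture:
    # lam ~ Gamma(shape=r, scale=p/(1-p)), then k ~ Poisson(lam).
    lam = random.gammavariate(r, p / (1.0 - p))
    return sample_poisson(lam)

if __name__ == "__main__":
    random.seed(1)
    r, p = 2.0, 0.5
    draws = [sample_nb(r, p) for _ in range(20000)]
    mean = sum(draws) / len(draws)
    print(mean, r * p / (1.0 - p))  # empirical vs. theoretical mean r*p/(1-p)
```

The paper's augmentation schemes build on this representation (and further augmentations of the dispersion parameter r) to make all conditionals conjugate for Gibbs sampling.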

