Analysis of Score-based Generative Models


Date

2024


Abstract

In this thesis, we study the convergence of diffusion models and related flow-based methods, which are highly successful approaches for learning a probability distribution from data and generating further samples. For diffusion models, we establish the first convergence result that applies to data distributions satisfying the log-Sobolev inequality without suffering from the curse of dimensionality. Our analysis gives theoretical grounding to the observation that an annealed procedure is required in practice to generate good samples, as our proof depends essentially on using annealing to obtain a warm start at each step. Moreover, we show that a predictor-corrector algorithm gives better convergence than either component alone. We then generalize these results to any distribution with bounded second moment, relying only on an $L^2$-accurate score estimate, with polynomial dependence on all parameters and no reliance on smoothness or functional inequalities. We also provide a theoretical guarantee for generating the data distribution with a progressive flow model, the so-called JKO flow model, which implements the Jordan-Kinderlehrer-Otto (JKO) scheme in a normalizing flow network. Leveraging the exponential convergence of proximal gradient descent (GD) in Wasserstein space, we prove a Kullback-Leibler (KL) guarantee of data generation by a JKO flow model under the assumption that the data density merely has a finite second moment.
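To make the annealing and predictor-corrector ideas from the abstract concrete, here is a minimal sketch of annealed predictor-corrector sampling on a toy 1-D Gaussian target, where the perturbed score is known in closed form. All choices here (the geometric noise schedule, the Tweedie-style denoising predictor, the Langevin corrector, and every step size) are illustrative assumptions for this sketch, not the thesis's exact algorithm or constants.

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma = 2.0, 0.5  # toy data distribution: N(mu, sigma^2)

def score(x, noise_sigma):
    # Score of the data convolved with N(0, noise_sigma^2) noise:
    # the perturbed density is N(mu, sigma^2 + noise_sigma^2).
    return -(x - mu) / (sigma**2 + noise_sigma**2)

def pc_sample(n=5000, n_levels=20, n_corrector=5, eps=0.02):
    # Annealed noise levels, large to small: each level starts
    # from the previous level's samples (the "warm start" idea).
    sigmas = np.geomspace(3.0, 0.01, n_levels)
    x = rng.normal(0.0, np.sqrt(sigma**2 + sigmas[0]**2), size=n)
    for i, s in enumerate(sigmas):
        # Predictor: a denoising (Tweedie-style) step using the score
        # at the current noise level, then re-noise to the next level.
        x = x + s**2 * score(x, s)
        nxt = sigmas[i + 1] if i + 1 < n_levels else 0.0
        if nxt > 0.0:
            x = x + nxt * rng.normal(size=n)
        # Corrector: a few Langevin MCMC steps targeting the
        # distribution at the next (smaller) noise level.
        for _ in range(n_corrector):
            x = x + eps * score(x, nxt) + np.sqrt(2 * eps) * rng.normal(size=n)
    return x

samples = pc_sample()
print(samples.mean(), samples.std())  # close to mu = 2.0 and sigma = 0.5
```

Running the corrector alone would need many more Langevin steps at the smallest noise level to mix, and the predictor alone accumulates discretization bias; interleaving them, as the abstract notes, does better than either portion on its own.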

Citation

Tan, Yixin (2024). Analysis of Score-based Generative Models. Dissertation, Duke University. Retrieved from https://hdl.handle.net/10161/30871.

Duke's student scholarship is made available to the public using a Creative Commons Attribution-NonCommercial-NoDerivatives (CC BY-NC-ND) license.