Towards Better Representations with Deep/Bayesian Learning

Date

2018

Repository Usage Stats

860 views
562 downloads

Abstract

Deep learning and Bayesian learning are two popular research topics in machine learning. They provide flexible representations in a complementary manner, so it is desirable to take the best from both fields. This thesis focuses on the intersection of the two topics, enriching each with the other, and pursues two research directions: Bayesian deep learning and deep Bayesian learning.

In Bayesian deep learning, scalable Bayesian methods are proposed to learn the weight uncertainty of deep neural networks (DNNs). On this topic, I propose preconditioned stochastic gradient MCMC methods, show their connection to Dropout, and demonstrate their applications to modern network architectures in computer vision and natural language processing.
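
To make the idea concrete, the following is a minimal NumPy sketch of one preconditioned stochastic gradient Langevin step with an RMSprop-style diagonal preconditioner, in the spirit of the methods described above; the function name, hyperparameter values, and the omitted correction term are illustrative assumptions, not the thesis implementation.

    import numpy as np

    def psgld_step(theta, grad_log_post, v, eps=1e-3, alpha=0.99, lam=1e-5):
        # Illustrative sketch (not the thesis code): one preconditioned SGLD update.
        # grad_log_post is a stochastic gradient of the log posterior at theta.
        v = alpha * v + (1 - alpha) * grad_log_post**2            # moving average of squared gradients
        G = 1.0 / (lam + np.sqrt(v))                               # RMSprop-style diagonal preconditioner
        noise = np.sqrt(eps * G) * np.random.randn(*theta.shape)   # preconditioned Gaussian injection
        theta = theta + 0.5 * eps * G * grad_log_post + noise      # Langevin step toward higher posterior
        return theta, v                                            # new sample and preconditioner state

Collecting the parameter values visited over many such steps yields approximate posterior samples of the network weights, which is how the weight uncertainty mentioned above is quantified.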

In deep Bayesian learning, DNNs are employed as powerful representations of conditional distributions in traditional Bayesian models. I focus on understanding recent adversarial learning methods for joint distribution matching, through which several recent bivariate adversarial models are unified. This analysis reveals non-identifiability issues in bidirectional adversarial learning, and I propose ALICE, a conditional entropy framework that remedies them. The derived algorithms show significant improvement on image generation and translation tasks by resolving these non-identifiability issues.
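
As a rough illustration of the conditional entropy idea, the sketch below pairs the adversarial joint-matching loss of bidirectional models with a reconstruction term that acts as a cycle-consistency surrogate for the conditional entropy, which is what restores identifiability. The generator G, encoder E, joint discriminator D, and weight lam are assumed placeholder modules and values, not the thesis implementation.

    import torch
    import torch.nn.functional as F

    def alice_style_loss(x, z, G, E, D, lam=1.0):
        # Illustrative sketch (not the thesis code) of an ALICE-style objective:
        # adversarial matching of the two joint distributions plus a reconstruction
        # term that bounds the conditional entropy H(x | z).
        z_hat = E(x)                                   # inferred latent code for real data
        x_gen = G(z)                                   # sample generated from a prior latent
        d_real = D(x, z_hat)                           # discriminator score on the encoder joint
        d_fake = D(x_gen, z)                           # discriminator score on the generator joint
        adv = F.binary_cross_entropy_with_logits(d_real, torch.ones_like(d_real)) \
            + F.binary_cross_entropy_with_logits(d_fake, torch.zeros_like(d_fake))
        recon = F.l1_loss(G(z_hat), x)                 # cycle-consistency surrogate for H(x | z)
        # D is trained on the adversarial term; G and E are trained against it
        # and to minimize the reconstruction term.
        return adv + lam * recon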

Citation

Li, Chunyuan (2018). Towards Better Representations with Deep/Bayesian Learning. Dissertation, Duke University. Retrieved from https://hdl.handle.net/10161/18207.

Except where otherwise noted, student scholarship that was shared on DukeSpace after 2009 is made available to the public under a Creative Commons Attribution / Non-commercial / No derivatives (CC-BY-NC-ND) license. All rights in student work shared on DukeSpace before 2009 remain with the author and/or their designee, whose permission may be required for reuse.