Towards Better Representations with Deep/Bayesian Learning

Date

2018

Repository Usage Stats

856 views, 556 downloads

Abstract

Deep learning and Bayesian learning are two popular research topics in machine learning, and they provide flexible representations in complementary ways. It is therefore desirable to take the best from both fields. This thesis focuses on the intersection of the two topics, enriching each with the other, which inspires two research directions: Bayesian deep learning and deep Bayesian learning.

In Bayesian deep learning, scalable Bayesian methods are proposed to learn the weight uncertainty of deep neural networks (DNNs). On this topic, I propose preconditioned stochastic gradient MCMC methods, show their connection to Dropout, and demonstrate their applications to modern network architectures in computer vision and natural language processing.

In deep Bayesian learning, DNNs are employed as powerful representations of conditional distributions in traditional Bayesian models. I focus on understanding recent adversarial learning methods for joint distribution matching, through which several recent bivariate adversarial models are unified. This analysis further reveals the non-identifiability issues in bidirectional adversarial learning, and I propose ALICE, a conditional entropy framework, to remedy them. By resolving the non-identifiability issues, the derived algorithms show significant improvement in image generation and translation tasks.
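The role of the conditional entropy term can be sketched as a cycle-consistency surrogate: the round-trip reconstruction error of a sample after encoding and decoding serves as a tractable penalty on mappings that lose information about the input. The `encode` and `decode` callables below are hypothetical stand-ins for the learned networks:

```python
import numpy as np

def cycle_reconstruction_loss(x, encode, decode):
    """Mean squared round-trip error ||x - decode(encode(x))||^2.

    In ALICE-style training (as sketched here) this term is added to the
    adversarial objective as a surrogate for the conditional entropy,
    penalizing encoder/decoder pairs that are not mutually consistent.
    """
    x_rec = decode(encode(x))
    return float(np.mean((x - x_rec) ** 2))
```

A perfectly invertible pair drives the penalty to zero, e.g. `cycle_reconstruction_loss(x, lambda v: 2 * v, lambda z: z / 2)` returns `0.0`, which is exactly the kind of deterministic correspondence that removes the non-identifiability.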

Description

Provenance

Citation

Li, Chunyuan (2018). Towards Better Representations with Deep/Bayesian Learning. Dissertation, Duke University. Retrieved from https://hdl.handle.net/10161/18207.

Collections


Duke's student scholarship is made available to the public using a Creative Commons Attribution-NonCommercial-NoDerivatives (CC BY-NC-ND) license.