Bayesian Learning with Dependency Structures via Latent Factors, Mixtures, and Copulas
Bayesian methods offer a flexible and convenient probabilistic learning framework to extract interpretable knowledge from complex and structured data. Such methods can characterize dependencies among multiple levels of hidden ...
PAC-optimal, Non-parametric Algorithms and Bounds for Exploration in Concurrent MDPs with Delayed Updates
As the reinforcement learning community has shifted its focus from heuristic methods to methods that have performance guarantees, PAC-optimal exploration algorithms have received significant attention. Unfortunately, the ...
Online Learning of Non-Stationary Networks, with Application to Financial Data
In this paper, we propose a new learning algorithm for non-stationary Dynamic Bayesian Networks. Although a number of effective learning algorithms for non-stationary DBNs have previously been proposed and applied ...
Deep Generative Models for Vision and Language Intelligence
Deep generative models have achieved tremendous success in recent years, with applications in various tasks involving vision and language intelligence. In this dissertation, I will mainly discuss the contributions that I ...
Scalable Bayesian Matrix and Tensor Factorization for Discrete Data
Matrix and tensor factorization methods decompose the observed matrix and tensor data into a set of factor matrices. They provide a useful way to extract latent factors or features from complex data, and also to predict ...
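The factorization idea summarized above can be illustrated with a minimal sketch. This uses classical non-negative matrix factorization with Lee-Seung multiplicative updates on synthetic count data, as a generic stand-in, not the scalable Bayesian method of the thesis itself; all names and sizes here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic count matrix generated from a rank-2 ground truth,
# mimicking the "discrete data" setting.
W_true = rng.poisson(2.0, size=(20, 2)) + 1.0
H_true = rng.poisson(2.0, size=(2, 15)) + 1.0
X = W_true @ H_true

# Random non-negative initialization of the two factor matrices.
W = rng.random((20, 2)) + 0.1
H = rng.random((2, 15)) + 0.1

for _ in range(200):
    # Lee-Seung multiplicative updates: decrease ||X - W H||_F
    # while keeping both factors non-negative.
    H *= (W.T @ X) / (W.T @ W @ H + 1e-9)
    W *= (X @ H.T) / (W @ H @ H.T + 1e-9)

# Relative reconstruction error of the learned low-rank model.
err = np.linalg.norm(X - W @ H) / np.linalg.norm(X)
```

The recovered rows of `W` play the role of the latent factors extracted from the data; a Bayesian treatment would additionally place priors on the factors and infer a posterior rather than a point estimate.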
Towards Better Representations with Deep/Bayesian Learning
Deep learning and Bayesian learning are two popular research topics in machine learning. They provide flexible representations in complementary ways, so it is desirable to take the best from both fields. ...