Exploring Knowledge Transfer with Deep Learning




Deep learning methods have achieved significant success when trained on large amounts of data. In many real-world applications, however, data are too expensive or even impossible to collect. It is therefore essential to transfer knowledge acquired in one context, where adequate data are available, to a different but related task. This dissertation discusses my contributions to exploring knowledge transfer with deep learning methodologies across several applications.

The first part focuses on disentangled representation learning and its application to voice style transfer. I present a zero-shot voice style transfer model that learns from non-parallel data and converts voices from and to previously unseen speakers under information-theoretic guidance.

The second part focuses on transferring knowledge from wholes to segments. I present a novel approach that identifies and optimizes fine-grained semantic similarities between image and text entities in sentence-image matching using Optimal Transport.

The third part focuses on transferring knowledge between domains through the application of domain generalization. In this problem setup, we want to learn from multiple source domains so as to successfully classify data sampled from unseen target domains. I present a methodology that allows each source domain to have both shared and unique properties while explicitly training the model to encourage robust classification.
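As context for the Optimal Transport component mentioned above, the following is a minimal, hypothetical sketch of entropy-regularized Optimal Transport solved with Sinkhorn iterations, the standard way a transport plan between two sets of entities (e.g., image regions and words) can be computed from a pairwise cost matrix. The cost values and marginals here are illustrative toy inputs, not taken from the dissertation.

```python
# Sketch: entropic Optimal Transport via Sinkhorn iterations.
# Assumes a toy cost matrix between 3 "image regions" and 2 "words".
import numpy as np

def sinkhorn(cost, a, b, eps=0.1, n_iters=200):
    """Return a transport plan T with row marginals a and column
    marginals b, minimizing <T, cost> plus an entropy penalty (eps)."""
    K = np.exp(-cost / eps)          # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(n_iters):
        v = b / (K.T @ u)            # rescale columns to match b
        u = a / (K @ v)              # rescale rows to match a
    return u[:, None] * K * v[None, :]

# Toy example: uniform marginals over regions and words.
cost = np.array([[0.1, 0.9],
                 [0.8, 0.2],
                 [0.5, 0.5]])
a = np.ones(3) / 3
b = np.ones(2) / 2
T = sinkhorn(cost, a, b)
# T concentrates mass on low-cost pairs; its row sums approximate a
# and its column sums approximate b.
```

In a matching setting, the resulting plan `T` can be read as soft alignments between entities, and the transport cost `(T * cost).sum()` as a dissimilarity score between the sentence and the image.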





Yuan, Siyang (2022). Exploring Knowledge Transfer with Deep Learning. Dissertation, Duke University. Retrieved from https://hdl.handle.net/10161/26787.


Duke's student scholarship is made available to the public using a Creative Commons Attribution / Non-commercial / No derivative (CC-BY-NC-ND) license.