Improving Natural Language Understanding via Contrastive Learning Methods
Natural language understanding (NLU) is an essential but challenging task in Natural Language Processing (NLP), which aims to automatically extract and understand semantic information from raw text or speech data. Among existing NLU solutions, representation learning methods, which map textual data into low-dimensional vector spaces for downstream tasks, have recently become the mainstream. With the development of deep neural networks, text representation learning has achieved state-of-the-art performance on a wide range of NLP tasks.
Although text representation learning methods built on large-scale network encoders have shown significant empirical gains, many essential properties of these encoders remain unexplored, which hinders their further application to real-world scenarios: (1) the high computational complexity of large-scale deep networks prevents text encoders from being deployed on a broader range of devices, especially on resource-constrained hardware; (2) the inner mechanism of the networks is opaque, which limits control over the latent representations used for downstream tasks; (3) representation learning methods are data-driven, which leads to inherent social bias problems when the training data are unbalanced.
To address the problems above in deep text encoders, I propose a series of effective contrastive learning methods, which supervise the encoders by enlarging the difference between positive and negative sample pairs. In this thesis, I first present a theoretical contrastive learning tool that bridges contrastive learning methods and mutual information in information theory. Then, I apply contrastive learning to several NLU scenarios to improve the text encoders' effectiveness, interpretability, and fairness.
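To illustrate the general idea of supervising an encoder with positive and negative pairs, the following is a minimal sketch of an InfoNCE-style contrastive loss, a common objective that also serves as a lower bound on the mutual information between two views of the data. This is an illustrative example only, not the specific objective developed in the thesis; the function name, temperature value, and use of in-batch negatives are assumptions for the sketch.

```python
# Minimal InfoNCE-style contrastive loss (illustrative sketch only).
# Given a batch of paired "anchor" and "positive" text embeddings, every
# non-matching positive in the batch serves as a negative sample, and the
# objective pushes matching pairs together while pushing mismatched pairs apart.
import torch
import torch.nn.functional as F

def info_nce_loss(anchors: torch.Tensor,
                  positives: torch.Tensor,
                  temperature: float = 0.07) -> torch.Tensor:
    """anchors, positives: (batch_size, dim) sentence embeddings."""
    # Cosine similarity between every anchor and every candidate positive.
    anchors = F.normalize(anchors, dim=-1)
    positives = F.normalize(positives, dim=-1)
    logits = anchors @ positives.t() / temperature  # shape (B, B)
    # The matching pair sits on the diagonal; off-diagonal entries act as
    # negatives, so the objective reduces to a cross-entropy classification.
    targets = torch.arange(anchors.size(0), device=anchors.device)
    return F.cross_entropy(logits, targets)

# Example usage: 8 sentence pairs embedded into 128-dimensional vectors.
loss = info_nce_loss(torch.randn(8, 128), torch.randn(8, 128))
```

Maximizing agreement on the diagonal while treating the rest of the batch as negatives is what links this family of losses to mutual information estimation, the connection the thesis formalizes.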
Contrastive Learning
Information Theory
Machine Learning
Natural Language Processing
Neural Network

This work is licensed under a Creative Commons Attribution-Noncommercial-No Derivative Works 3.0 United States License.