Time-Aware Language Models as Temporal Knowledge Bases

Abstract

Many facts come with an expiration date, from the name of the President to the basketball team LeBron James plays for. However, most language models (LMs) are trained on snapshots of data collected at a specific moment in time. This can limit their utility, especially in the closed-book setting where the pretraining corpus must contain the facts the model should memorize. We introduce a diagnostic dataset aimed at probing LMs for factual knowledge that changes over time and highlight problems with LMs at either end of the spectrum: those trained on specific slices of temporal data, as well as those trained on a wide range of temporal data. To mitigate these problems, we propose a simple technique for jointly modeling text with its timestamp. This improves memorization of seen facts from the training time period, as well as calibration on predictions about unseen facts from future time periods. We also show that models trained with temporal context can be efficiently “refreshed” as new data arrives, without the need for retraining from scratch.
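The abstract's central idea of "jointly modeling text with its timestamp" can be made concrete with a small sketch. The snippet below is an illustrative assumption rather than the authors' released code: it pairs a T5-style salient-span mask with a plain-text time prefix ("year: YYYY text: ..."), so the same fact can be conditioned on different points in time. The helper names (mask_salient_span, add_time_prefix), the sentinel token, and the exact prefix format are hypothetical.

```python
# Minimal sketch of timestamp-conditioned training examples, as one plausible
# reading of "jointly modeling text with its timestamp" from the abstract.
# The prefix format and helpers below are illustrative assumptions, not the
# authors' preprocessing code.

from dataclasses import dataclass


@dataclass
class Example:
    """A single (input, target) pair for a seq2seq language model."""
    source: str
    target: str


def mask_salient_span(text: str, span: str, sentinel: str = "<extra_id_0>") -> Example:
    """Replace one salient span (e.g., a named entity) with a sentinel token,
    producing a fill-in-the-blank target in the T5 style."""
    source = text.replace(span, sentinel, 1)
    target = f"{sentinel} {span}"
    return Example(source=source, target=target)


def add_time_prefix(example: Example, year: int) -> Example:
    """Prepend the document's timestamp so the model can condition on time."""
    return Example(source=f"year: {year} text: {example.source}", target=example.target)


if __name__ == "__main__":
    doc = "LeBron James plays for the Cleveland Cavaliers."
    ex = add_time_prefix(mask_salient_span(doc, "Cleveland Cavaliers"), year=2017)
    print(ex.source)  # year: 2017 text: LeBron James plays for the <extra_id_0>.
    print(ex.target)  # <extra_id_0> Cleveland Cavaliers
```

Under this framing, refreshing the model as new data arrives would amount to continuing training on newly timestamped examples rather than retraining from scratch, which is the behavior the abstract reports.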

Citation

Published Version (Please cite this version)

DOI: 10.1162/tacl_a_00459

Publication Info

Dhingra, Bhuwan, Jeremy R. Cole, Julian Martin Eisenschlos, Daniel Gillick, Jacob Eisenstein, and William W. Cohen (2022). Time-Aware Language Models as Temporal Knowledge Bases. Transactions of the Association for Computational Linguistics, 10, pp. 257–273. doi:10.1162/tacl_a_00459. Retrieved from https://hdl.handle.net/10161/27370.

This citation is constructed from limited available data and may be imprecise. To cite this article, please review and use the official citation provided by the journal.

Scholars@Duke

Bhuwan Dhingra

Assistant Professor of Computer Science

My research focuses on natural language processing (NLP), machine learning and knowledge representation.


Unless otherwise indicated, scholarly articles published by Duke faculty members are made available here with a CC-BY-NC (Creative Commons Attribution Non-Commercial) license, as enabled by the Duke Open Access Policy. If you wish to use the materials in ways not already permitted under CC-BY-NC, please consult the copyright owner. Other materials are made available here through the author’s grant of a non-exclusive license to make their work openly accessible.