This repository provides a reference implementation of the paper:
Temporal Attention for Language Models
Guy D. Rosin and Kira Radinsky
Findings of NAACL 2022, to appear
Preprint: https://arxiv.org/abs/2202.02093
Abstract:
Pretrained language models based on the transformer architecture have shown great success in NLP. Textual training data often comes from the web and is thus tagged with time-specific information, but most language models ignore this information. They are trained on the textual data alone, limiting their ability to generalize temporally.
In this work, we extend the key component of the transformer architecture, i.e., the self-attention mechanism, and propose temporal attention, a time-aware self-attention mechanism. Temporal attention can be applied to any transformer model and requires the input texts to be accompanied by their relevant time points.
This mechanism allows the transformer to capture this temporal information and create time-specific contextualized word representations. We leverage these representations for the task of semantic change detection; we apply our proposed mechanism to BERT and experiment on three datasets in different languages (English, German, and Latin) that also vary in time, size, and genre. Our proposed model achieves state-of-the-art results on all the datasets.
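
To give a concrete feel for the mechanism described above, the sketch below shows one plausible way to make a self-attention layer time-aware: each token's time point is embedded, and the time embeddings contribute a score that modulates the usual content-based attention scores. This is a minimal PyTorch illustration written for this README; the class and parameter names (`TimeAwareSelfAttention`, `num_time_points`, `time_ids`) are assumptions, and the mechanism actually used in this repository is the `TemporalSelfAttention` class in `modeling_tempobert.py`, which may differ in its exact formulation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class TimeAwareSelfAttention(nn.Module):
    """Illustrative sketch of time-aware self-attention (not the repository's implementation).

    Attention scores are the product of the usual content scores and a score
    computed from learned embeddings of each token's time point.
    """

    def __init__(self, hidden_size=768, num_heads=12, num_time_points=10):
        super().__init__()
        self.num_heads = num_heads
        self.head_dim = hidden_size // num_heads
        self.q_proj = nn.Linear(hidden_size, hidden_size)
        self.k_proj = nn.Linear(hidden_size, hidden_size)
        self.v_proj = nn.Linear(hidden_size, hidden_size)
        # One learned embedding per discrete time point (e.g., a year or corpus slice).
        self.time_emb = nn.Embedding(num_time_points, hidden_size)
        self.time_q = nn.Linear(hidden_size, hidden_size)
        self.time_k = nn.Linear(hidden_size, hidden_size)

    def _split_heads(self, x):
        batch, seq_len, _ = x.shape
        return x.view(batch, seq_len, self.num_heads, self.head_dim).transpose(1, 2)

    def forward(self, hidden_states, time_ids):
        # hidden_states: (batch, seq_len, hidden); time_ids: (batch, seq_len) time-point indices.
        q = self._split_heads(self.q_proj(hidden_states))
        k = self._split_heads(self.k_proj(hidden_states))
        v = self._split_heads(self.v_proj(hidden_states))

        # Time embeddings get their own query/key projections; their scores
        # modulate the content-based attention scores element-wise.
        t = self.time_emb(time_ids)
        tq = self._split_heads(self.time_q(t))
        tk = self._split_heads(self.time_k(t))

        content_scores = torch.matmul(q, k.transpose(-1, -2))
        time_scores = torch.matmul(tq, tk.transpose(-1, -2))
        scores = content_scores * time_scores / (self.head_dim ** 0.5)

        probs = F.softmax(scores, dim=-1)
        context = torch.matmul(probs, v)
        batch, _, seq_len, _ = context.shape
        return context.transpose(1, 2).reshape(batch, seq_len, -1)
```

In practice, `time_ids` would hold the index of the text's time period (e.g., its corpus slice), repeated for every token in the sequence, so that all tokens of a text share the same time embedding.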
- Create an Anaconda environment with Python 3.8 and install the requirements:

  ```bash
  conda create -n tempo_att python=3.8
  conda activate tempo_att
  conda install pytorch torchvision torchaudio cudatoolkit=11.1 -c pytorch -c nvidia
  pip install -r requirements.txt
  ```
- Obtain datasets for training and evaluation on semantic change detection: the SemEval-2020 Task 1 datasets.
- Train BERT with temporal attention using `train_tempobert.py`. This script is similar to Hugging Face's language modeling training script and introduces a `time_embedding_type` argument, which is set to `temporal_attention` by default.
- Evaluate the trained model on semantic change detection using `semantic_change_detection.py` (a generic sketch of this evaluation recipe is shown after this list).
- The temporal attention mechanism is implemented in `modeling_tempobert.py`, in the `TemporalSelfAttention` class.
- Note: this repository also supports the TempoBERT model (paper), which prepends time tokens to text sequences: https://github.com/guyrosin/tempobert
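
To make the evaluation step concrete, here is a generic sketch of the standard semantic change detection recipe: embed a target word in sentences from two time periods, average its contextual representations per period, and use cosine distance between the two averages as the change score. For simplicity, the sketch loads a plain `bert-base-uncased` model from Hugging Face; the repository's `semantic_change_detection.py` instead evaluates the trained temporal model, which also receives each sentence's time point. The helper names (`word_vector`, `change_score`) and the example sentences are invented for this illustration.

```python
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()


def word_vector(sentence: str, target: str) -> torch.Tensor:
    """Average the contextual embeddings of the subword tokens belonging to `target`."""
    enc = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**enc).last_hidden_state[0]  # (seq_len, hidden)
    target_ids = tokenizer(target, add_special_tokens=False)["input_ids"]
    ids = enc["input_ids"][0].tolist()
    # Find the first occurrence of the target's subword ids in the sentence.
    for i in range(len(ids) - len(target_ids) + 1):
        if ids[i:i + len(target_ids)] == target_ids:
            return hidden[i:i + len(target_ids)].mean(dim=0)
    raise ValueError(f"'{target}' not found in sentence")


def change_score(sentences_t1, sentences_t2, target):
    """Cosine distance between the word's mean representation in the two periods."""
    v1 = torch.stack([word_vector(s, target) for s in sentences_t1]).mean(dim=0)
    v2 = torch.stack([word_vector(s, target) for s in sentences_t2]).mean(dim=0)
    return 1 - torch.cosine_similarity(v1, v2, dim=0).item()


# Example: a higher score suggests a stronger semantic shift between the two periods.
print(change_score(["The plane flew over the city."],
                   ["He used a plane to smooth the wood."],
                   "plane"))
```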