TacoLM
TemporAl COmmon Sense Language Model
A variation of BERT that is aware of temporal common sense.
Introduction
This is the code repository for our ACL 2020 paper Temporal Common Sense Acquisition with Minimal Supervision. This package is built on the April 2019 version of huggingface/transformers.
Installation
pip install -r requirements.txt
pip install --editable .
Out of the box
Here are some things you can do with this package out of the box.
Train the main model
- Download `data/tmp_seq_data` from Google Drive (4.6 GB)
- Run `sh train_taco_lm.sh`
The script uses default parameters and exports the model to `models/`. You can configure it differently by editing the script.
Training generates one directory for the loss logs, plus NUM_EPOCH directories, one for each epoch's model.
You will need to add BERT's vocab.txt to each epoch directory for evaluation. See the next section on pre-trained models for more detail.
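The vocab-copying step above can be scripted. A minimal sketch, assuming the epoch directories follow a `models/taco_lm_epoch_*` layout (adjust the paths to wherever your checkpoints actually live):

```python
import shutil
from pathlib import Path

def copy_vocab(vocab_path, model_root="models", pattern="taco_lm_epoch_*"):
    """Copy BERT's vocab.txt into every epoch checkpoint directory.

    The directory layout here is illustrative; point `model_root` and
    `pattern` at your actual checkpoint directories.
    """
    copied = []
    for d in sorted(Path(model_root).glob(pattern)):
        if d.is_dir():
            shutil.copy(vocab_path, d / "vocab.txt")
            copied.append(d)
    return copied
```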
The training data is pre-generated and formatted. More details here.
Experiments
You can download pre-trained models into `models/` from Google Drive (0.4 GB each),
or follow the training procedure in the previous section.
General Usage
You can do many things with the model by treating it simply as a set of transformer weights that fit exactly into a BERT-base model. Have an ongoing project with BERT? Give it a try!
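Since a checkpoint directory holds an ordinary BERT-base checkpoint plus its vocab.txt, any BERT-base loader should work. A hedged sketch using the current Hugging Face `transformers` API (the repo itself targets the April 2019 version, whose import paths differ; the directory name below is illustrative):

```python
def load_taco_lm(checkpoint_dir):
    """Load a TacoLM checkpoint as a plain BERT-base model.

    Assumes the `transformers` package is installed and `checkpoint_dir`
    (e.g. "models/taco_lm_epoch_2") contains pytorch_model.bin, config.json,
    and the vocab.txt added after training.
    """
    from transformers import BertModel, BertTokenizer  # deferred import
    tokenizer = BertTokenizer.from_pretrained(checkpoint_dir)
    model = BertModel.from_pretrained(checkpoint_dir)
    return tokenizer, model
```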
Intrinsic Experiments
The intrinsic evaluation relies on pre-formatted data.
- Run `sh eval_intrinsic.sh`
- See `eval_results/intrinsic.txt` for results
TimeBank Experiment
- By default this requires the epoch-2 model.
- Run `sh eval_timebank.sh` to produce evaluation results with 3 different seeds. By default they are stored under `eval_results`.
- Run `python scripts/eval_timebank.py` to see result interpretations.
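If you want to aggregate the three seeds' scores yourself, a hypothetical helper (not part of the repo; `scripts/eval_timebank.py` does its own interpretation) could look like:

```python
import statistics

def summarize_seeds(scores):
    """Return the mean and sample standard deviation of per-seed scores
    (e.g. F1 values read from the eval_results files)."""
    return statistics.mean(scores), statistics.stdev(scores)
```

For example, `summarize_seeds([0.60, 0.62, 0.64])` reports the mean score with its spread across seeds.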
HiEVE Experiment
- By default this requires the epoch-2 model.
- Run `sh eval_hieve.sh` to produce evaluation results under `eval_results`.
- Run `python scripts/eval_hieve.py` to see interpretations.
MC-TACO Experiment
See MC-TACO.
- Use the augmented data under `data/mctaco-tcs`
- Use the transformer weights of `taco_lm_epoch_2`
Citation
See the following paper:
@inproceedings{ZNKR20,
author = {Ben Zhou and Qiang Ning and Daniel Khashabi and Dan Roth},
title = {Temporal Common Sense Acquisition with Minimal Supervision},
booktitle = {ACL},
year = {2020},
}