ToD-BERT
KeyError: 'slots'
Hi, I'm getting an error I cannot seem to get past. I've included it below:
Traceback (most recent call last):
  File "main.py", line 109, in <module>
    trn_loader = get_loader(args, "train", tokenizer, datasets, unified_meta)
  File "/content/ToD-BERT/utils/utils_general.py", line 58, in get_loader
    dataset = globals()["Dataset_"+task](data_info, tokenizer, args, unified_meta, mode, args["max_seq_length"])
  File "/content/ToD-BERT/utils/dataloader_dst.py", line 20, in __init__
    self.slots = list(unified_meta["slots"].keys())
KeyError: 'slots'
Is this to do with a specific dataset I need to include in the list of datasets? I do not want to use them all, just the MultiWOZ ones.
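For reference, the failure can be reproduced in isolation: if the `unified_meta` dict never receives a `"slots"` entry, the lookup in `dataloader_dst.py` has nothing to find. A minimal sketch (the dict literal mirrors the `unified_meta` printed later in this thread; it stands in for the real one built by the repo):

```python
# Minimal reproduction of the failure mode: unified_meta has no "slots" key,
# so the lookup done in Dataset_dst.__init__ raises KeyError.
unified_meta = {"others": None, "num_labels": 0, "resp_cand_trn": {}}

try:
    slots = list(unified_meta["slots"].keys())  # the failing line from dataloader_dst.py
except KeyError as err:
    print("KeyError:", err)  # prints: KeyError: 'slots'
```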
Hi,
what was the command you ran?
Please check these few lines and print what you get for your unified_meta.
Hi, running those lines gave this as output for unified_meta:
{'others': None, 'num_labels': 0, 'resp_cand_trn': {}}
In the definition of the function:
def get_unified_meta(datasets):
    unified_meta = {"others": None}
    for ds in datasets:
        for key, value in datasets[ds]["meta"].items():
What counts as "meta" within the datasets? Does it exist in some of them but not in others? I'm only using 5 of them, with this command:
!python main.py -task dst --do_train --dataset "['metalwoz','multiwoz','oos_intent','schema','taskmaster']" --model_name_or_path bert-base-uncased --my_model 'BertConfig'
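For what it's worth, my reading of the loop quoted above is that unified_meta ends up as the union of each dataset's "meta" dict, so "slots" only appears if at least one selected dataset provides it. A hedged sketch (the merge body and the sample meta dicts are my assumptions, not the repo's exact code):

```python
def get_unified_meta(datasets):
    # Hypothetical completion of the loop quoted above: merge every dataset's
    # "meta" dict into one. The real repo code may combine values differently.
    unified_meta = {"others": None}
    for ds in datasets:
        for key, value in datasets[ds]["meta"].items():
            unified_meta[key] = value
    return unified_meta

# Illustrative (made-up) meta dicts: only the first one provides DST "slots".
dst_style = {"multiwoz": {"meta": {"slots": {"hotel-area": 0}}}}
intent_style = {"oos_intent": {"meta": {"num_labels": 151}}}

print("slots" in get_unified_meta(dst_style))     # True
print("slots" in get_unified_meta(intent_style))  # False
```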
The problem here is that some of the datasets you used do not have DST labels defined the way MultiWOZ does, so they cannot be trained for DST.
You can first run with MultiWOZ only and check what its unified_meta contains, or you can check the MultiWOZ DST data loader as well.
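One way to do that check is to report, per loaded dataset, whether its meta dict carries "slots" before the merge happens. A debug sketch (the datasets dict shape follows the `datasets[ds]["meta"]` access in utils_general.py; the sample entries are illustrative):

```python
# Debug sketch: report which loaded datasets contribute DST "slots".
# The sample entries below are illustrative stand-ins for the real loaders.
datasets = {
    "multiwoz": {"meta": {}},  # if this meta is empty, DST labels never loaded
    "oos_intent": {"meta": {"num_labels": 151}},
}

for name, ds in datasets.items():
    has_slots = "slots" in ds.get("meta", {})
    print(f"{name}: slots present = {has_slots}")
```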
Hi Jason,
I have encountered the same issue. I ran only the MultiWOZ dataset (--dataset='["multiwoz"]'), printed unified_meta, and here is the output: {'others': None, 'num_labels': 0}.
Could you kindly advise on this? I have tried MultiWOZ 2.0 and MultiWOZ 2.1, and both hit the same issue.
Here are the error messages:
Traceback (most recent call last):
  File "my_tod_pretraining.py", line 1010, in
Thanks a lot,
Rich
@Bosheng2020
Can you provide the full command you were running?
gpu=$1
model_type=$2
bert_dir=$3
output_dir=$4
add1=$5
add2=$6
add3=$7
add4=$8
add5=$9

CUDA_VISIBLE_DEVICES=$gpu python my_tod_pretraining.py \
    --task='dst' \
    --data_path='/home/rich/ToD-BERT-master/dialog_datasets' \
    --dataset='["multiwoz"]' \
    --model_type=${model_type} \
    --model_name_or_path=${bert_dir} \
    --output_dir=${output_dir} \
    --do_train \
    --do_eval \
    --mlm \
    --do_lower_case \
    --evaluate_during_training \
    --save_steps=2500 --logging_steps=1000 \
    --per_gpu_train_batch_size=8 --per_gpu_eval_batch_size=8 \
    ${add1} ${add2} ${add3} ${add4} ${add5}
./run_tod_lm_pretraining.sh 2 bert bert-base-uncased save/pretrain/ToD-BERT-JNT --only_last_turn --add_rs_loss
Hi Jason,
I have pasted the full command as above.
Thanks a lot for your prompt reply.
Rich
@Bosheng2020
Are you trying to run the pretraining task or the DST task?
If it's the pretraining task, you need to use task=usdl with run_tod_lm_pretraining.sh.
If it's the DST task, you need to run the command here.
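To summarize the two paths, here is a side-by-side sketch of the commands, adapted from the ones quoted earlier in this thread (paths and extra flags are those examples' values, so adjust them to your setup):

```shell
# 1) Pretraining: run_tod_lm_pretraining.sh drives my_tod_pretraining.py,
#    which should run with task=usdl, not task=dst.
./run_tod_lm_pretraining.sh 2 bert bert-base-uncased save/pretrain/ToD-BERT-JNT --only_last_turn --add_rs_loss

# 2) Downstream DST: main.py with -task dst.
python main.py -task dst --do_train --dataset "['multiwoz']" \
    --model_name_or_path bert-base-uncased --my_model 'BertConfig'
```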
Oh I see... I am running a DST task. Thanks a lot for your help!
Hi Jason, I tried using only MultiWOZ, but unified_meta remained the same. Is there anything else I could check? Thank you
@aliyabannaeva
Can you check if you can run this command?
If not, please copy and paste the error message here, thanks.