
KeyError: 'slots'

bannayeva opened this issue 4 years ago · 12 comments

Hi, I'm getting an error I cannot seem to get past. I've included it below:

Traceback (most recent call last):
  File "main.py", line 109, in <module>
    trn_loader = get_loader(args, "train", tokenizer, datasets, unified_meta)
  File "/content/ToD-BERT/utils/utils_general.py", line 58, in get_loader
    dataset = globals()["Dataset_"+task](data_info, tokenizer, args, unified_meta, mode, args["max_seq_length"])
  File "/content/ToD-BERT/utils/dataloader_dst.py", line 20, in __init__
    self.slots = list(unified_meta["slots"].keys())
KeyError: 'slots'

Is this to do with a specific dataset I need to include in the list of datasets? I do not want to use them all, just the MultiWOZ ones.
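For context, the failing line in dataloader_dst.py assumes unified_meta contains a "slots" dictionary. A minimal sketch of that access (the helper name get_slots and the example slot names are mine, for illustration only):

```python
def get_slots(unified_meta):
    """Return the slot names, failing loudly if DST metadata is missing."""
    slot_meta = unified_meta.get("slots")
    if slot_meta is None:
        raise ValueError("unified_meta has no 'slots' entry; the selected "
                         "datasets provided no DST slot metadata")
    return list(slot_meta.keys())

# The unified_meta reported in this thread lacks "slots" entirely,
# so the original list(unified_meta["slots"].keys()) raises KeyError:
broken_meta = {"others": None, "num_labels": 0, "resp_cand_trn": {}}
try:
    get_slots(broken_meta)
except ValueError as err:
    print(err)

# With DST-style metadata present, the lookup succeeds:
ok_meta = {"others": None, "slots": {"hotel-area": 0, "hotel-stars": 1}}
print(get_slots(ok_meta))  # → ['hotel-area', 'hotel-stars']
```

The .get() variant only makes the failure mode explicit; it does not fix the underlying problem, which is that no selected dataset contributed slot metadata.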

bannayeva avatar May 23 '21 20:05 bannayeva

Hi,

what was the command you ran?

Please check these few lines to make sure and print what you have for your unified_meta.

jasonwu0731 avatar May 25 '21 18:05 jasonwu0731

Hi, running those lines gave this as output for unified_meta:

{'others': None, 'num_labels': 0, 'resp_cand_trn': {}}

In the definition of the function:

def get_unified_meta(datasets):
    unified_meta = {"others": None}
    for ds in datasets:
        for key, value in datasets[ds]["meta"].items():

What is considered "meta" within the datasets? Does it exist in some of them and not the others? I'm only using 5 of them, via this command: !python main.py -task dst --do_train --dataset "['metalwoz','multiwoz','oos_intent','schema','taskmaster']" --model_name_or_path bert-base-uncased --my_model 'BertConfig'
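The loop body is cut off in the snippet above, but a plausible sketch of how such a merge could behave shows why unified_meta can end up without a "slots" key. The update rule and the example metas below are my assumptions, not the repository's exact code:

```python
def get_unified_meta_sketch(datasets):
    # Hypothetical re-implementation for illustration: merge each
    # dataset's "meta" dict into a single unified_meta. If no selected
    # dataset carries a "slots" entry, unified_meta never gains one,
    # which is the situation behind the KeyError in this thread.
    unified_meta = {"others": None}
    for ds in datasets:
        for key, value in datasets[ds]["meta"].items():
            unified_meta[key] = value
    return unified_meta

# A dataset whose meta has no DST labels contributes no "slots" key,
# yielding a unified_meta like the one reported above:
datasets = {"oos_intent": {"meta": {"num_labels": 0, "resp_cand_trn": {}}}}
print(get_unified_meta_sketch(datasets))
# → {'others': None, 'num_labels': 0, 'resp_cand_trn': {}}

# A MultiWOZ-style dataset with slot metadata would contribute one:
datasets["multiwoz"] = {"meta": {"slots": {"hotel-area": {}}}}
print("slots" in get_unified_meta_sketch(datasets))  # → True
```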

bannayeva avatar May 30 '21 10:05 bannayeva

The problem here is that some of the datasets you used do not have DST labels defined in the same way as MultiWOZ.

You can first run using only MultiWOZ and check what ends up in unified_meta, or you can check the MultiWOZ DST data loader as well.
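One quick way to do that check is to see which of the selected datasets actually carry slot metadata before the loaders are built. This helper is illustrative only, assuming the datasets[ds]["meta"] shape from the get_unified_meta snippet above; the dataset metas are hypothetical:

```python
def datasets_with_dst_meta(datasets):
    # Diagnostic helper: list which selected datasets define a "slots"
    # entry in their meta, i.e. which ones can drive DST training.
    return [ds for ds in datasets if "slots" in datasets[ds]["meta"]]

datasets = {
    "multiwoz": {"meta": {"slots": {"hotel-area": {}}}},  # hypothetical meta
    "oos_intent": {"meta": {"num_labels": 151}},          # hypothetical meta
}
print(datasets_with_dst_meta(datasets))  # → ['multiwoz']
```

If the list comes back empty for your dataset selection, the KeyError on unified_meta["slots"] follows directly.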

jasonwu0731 avatar Jun 01 '21 16:06 jasonwu0731

Hi Jason,

I have encountered the same issue. I only ran the MultiWOZ dataset (--dataset='["multiwoz"]') and printed out unified_meta; here is the output: {'others': None, 'num_labels': 0}.

Could you kindly advise on this? I have tried Multiwoz 2.0 and Multiwoz 2.1 and both encountered the same issue.

Here are the error messages:

Traceback (most recent call last):
  File "my_tod_pretraining.py", line 1010, in <module>
    main()
  File "my_tod_pretraining.py", line 973, in main
    trn_loader = get_loader(args_dict, "train", tokenizer, datasets, unified_meta, "train")
  File "/home/rich/ToD-BERT-master/utils/utils_general.py", line 58, in get_loader
    dataset = globals()["Dataset_"+task](data_info, tokenizer, args, unified_meta, mode, args["max_seq_length"])
  File "/home/rich/ToD-BERT-master/utils/dataloader_dst.py", line 21, in __init__
    self.slots = list(unified_meta["slots"].keys())
KeyError: 'slots'

Thanks a lot,

Rich

Bosheng2020 avatar Jun 08 '21 09:06 Bosheng2020

@Bosheng2020

Can you provide the full command you were running?

jasonwu0731 avatar Jun 08 '21 16:06 jasonwu0731

gpu=$1
model_type=$2
bert_dir=$3
output_dir=$4
add1=$5
add2=$6
add3=$7
add4=$8
add5=$9

CUDA_VISIBLE_DEVICES=$gpu python my_tod_pretraining.py \
    --task='dst' \
    --data_path='/home/rich/ToD-BERT-master/dialog_datasets' \
    --dataset='["multiwoz"]' \
    --model_type=${model_type} \
    --model_name_or_path=${bert_dir} \
    --output_dir=${output_dir} \
    --do_train \
    --do_eval \
    --mlm \
    --do_lower_case \
    --evaluate_during_training \
    --save_steps=2500 --logging_steps=1000 \
    --per_gpu_train_batch_size=8 --per_gpu_eval_batch_size=8 \
    ${add1} ${add2} ${add3} ${add4} ${add5}

Bosheng2020 avatar Jun 08 '21 16:06 Bosheng2020

./run_tod_lm_pretraining.sh 2 bert bert-base-uncased save/pretrain/ToD-BERT-JNT --only_last_turn --add_rs_loss

Bosheng2020 avatar Jun 08 '21 16:06 Bosheng2020

Hi Jason,

I have pasted the full command as above.

Thanks a lot for your prompt reply.

Rich

Bosheng2020 avatar Jun 08 '21 16:06 Bosheng2020

@Bosheng2020

Are you trying to run a pretraining task or DST task?

If it is the pretraining task, you need to use task=usdl with run_tod_lm_pretraining.sh.

If it is the DST task, you need to run the command here.

jasonwu0731 avatar Jun 08 '21 16:06 jasonwu0731

Oh I see... I am running a DST task. Thanks a lot for your help!

Bosheng2020 avatar Jun 08 '21 16:06 Bosheng2020

Hi Jason, I tried using only MultiWOZ, but unified_meta remained the same. Is there anything else I could check? Thank you.

bannayeva avatar Jun 12 '21 07:06 bannayeva

@aliyabannaeva

Can you check if you can run this command?

If not, please copy and paste the error message here, thanks.

jasonwu0731 avatar Jun 14 '21 04:06 jasonwu0731