ZeroDivisionError when reduction is set to 'wmean' while training iPET
I ran the code with:
python cli.py \
--method ipet \
--data_dir ../dataset/data \
--model_type bert \
--model_name_or_path bert-base-cased \
--task_name my-task \
--output_dir ./bert_ipet_10_test \
--pattern_ids 0 1 2 \
--do_train \
--do_eval \
--eval_set test \
--train_examples 10 \
--unlabeled_examples 50 \
--pet_max_seq_length 512 \
--pet_repetitions 1 \
--lm_training \
--pet_per_gpu_train_batch_size 1 \
--pet_per_gpu_eval_batch_size 1 \
--pet_per_gpu_unlabeled_batch_size 1 \
--pet_gradient_accumulation_steps 1 \
--pet_num_train_epochs 3 \
--pet_max_steps 25 \
--sc_per_gpu_train_batch_size 1 \
--sc_per_gpu_eval_batch_size 1 \
--sc_per_gpu_unlabeled_batch_size 1 \
--sc_gradient_accumulation_steps 1 \
--sc_max_steps 100 \
--sc_max_seq_length 512 \
--ipet_generations 3 \
--ipet_logits_percentage 0.25 \
--ipet_scale_factor 4
The error occurred before preparing next-gen-train-data in the first generation g0, after conducting p0-i0, p1-i0, and p2-i0.
File "cli.py", line 282, in <module>
main()
File "cli.py", line 266, in main
pet.train_ipet(pet_model_cfg, pet_train_cfg, pet_eval_cfg, ipet_cfg, sc_model_cfg, sc_train_cfg, sc_eval_cfg,
File "C:\Users\uber\Desktop\2023\pet\pet\modeling.py", line 191, in train_ipet
generate_ipet_train_sets(train_data=train_data, unlabeled_data=unlabeled_data,
File "C:\Users\uber\Desktop\2023\pet\pet\modeling.py", line 683, in generate_ipet_train_sets
subdir_train_set = generate_ipet_train_set(
File "C:\Users\uber\Desktop\2023\pet\pet\modeling.py", line 727, in generate_ipet_train_set
logits = np.average(logits, axis=0, weights=weights)
File "<__array_function__ internals>", line 5, in average
File "C:\Users\uber\.conda\envs\pet\lib\site-packages\numpy\lib\function_base.py", line 409, in average
raise ZeroDivisionError(
ZeroDivisionError: Weights sum to zero, can't be normalized
It seems that my weights array from this line: https://github.com/timoschick/pet/blob/21d32de975a911bfa0261827c9bd23dc4f0e4aa2/pet/modeling.py#L717 is empty.
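For reference, here is a minimal standalone reproduction of the numpy behavior; the shapes are made up just to illustrate the failure mode:

```python
import numpy as np

# Hypothetical shapes for illustration: no logits lists were selected,
# so both arrays are empty along axis 0.
logits = np.empty((0, 3))
weights = np.array([])

# np.average normalizes by weights.sum(); an empty weights array sums to 0,
# which raises: ZeroDivisionError: Weights sum to zero, can't be normalized
np.average(logits, axis=0, weights=weights)
```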
Any thoughts?
Regards, Fan
I found out why. My len(logits_lists) was 2 and my logits_percentage was 0.25. In this line: https://github.com/timoschick/pet/blob/21d32de975a911bfa0261827c9bd23dc4f0e4aa2/pet/modeling.py#L714 my num_logits_lists became 0 after applying round(). As a result, my logits_lists became an empty list after this line: https://github.com/timoschick/pet/blob/21d32de975a911bfa0261827c9bd23dc4f0e4aa2/pet/modeling.py#L715 and, consequently, my weights ended up as an empty array.
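To make the arithmetic concrete, here is a small standalone sketch (assuming the selection works roughly like round() followed by random.sample, which is how I read those two lines; the values are from my run):

```python
import random

logits_lists = [[0.1, 0.9], [0.8, 0.2]]  # two logits lists in my run
logits_percentage = 0.25

# round(2 * 0.25) == round(0.5) == 0 in Python 3 ("round half to even"),
# so no logits lists are selected at all.
num_logits_lists = round(len(logits_lists) * logits_percentage)
print(num_logits_lists)                               # 0
print(random.sample(logits_lists, num_logits_lists))  # [] -> empty weights later
```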
Is the round() meant to round up or round down here? Should there be a constraint that keeps the result of round() at a minimum of 1? Or does Python 3's default rounding behavior, "round half to even", need to be changed?
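For example, either of these would keep at least one logits list in my setting; these are just illustrative one-liners, not the actual code in modeling.py:

```python
import math

logits_lists_len = 2
logits_percentage = 0.25

# Option 1: always round up
num_logits_lists = math.ceil(logits_lists_len * logits_percentage)      # 1

# Option 2: keep round(), but enforce a lower bound of 1
num_logits_lists = max(1, round(logits_lists_len * logits_percentage))  # 1
```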
If I got anything wrong, please let me know. I am looking forward to your reply.
Fan