
questions about the padding value

Open iridescentee opened this issue 1 year ago • 1 comments

https://github.com/IDEA-CCNL/Fengshenbang-LM/blob/c8fb7b8437843ea13fa9d147ce86c4592fa21237/fengshen/examples/qa_t5/qa_dataset.py#L97-L98

https://github.com/IDEA-CCNL/Fengshenbang-LM/blob/c8fb7b8437843ea13fa9d147ce86c4592fa21237/fengshen/examples/qa_t5/qa_dataset.py#L115-L121

First of all, thank you for your code; it has helped me a lot.

I have a small question about how you pad the input sequences. In lines 97-98, you set the pad token id to -100. Usually, setting a token's label to -100 means its loss should be ignored, so I do not see why the padding value of input_ids and attention_mask (lines 115-121) is also -100. Are these lines wrong, and should I change the pad value to 0?

iridescentee avatar Aug 25 '23 08:08 iridescentee

Yes, pad_token_id should be 0
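For readers hitting the same issue, a minimal sketch of the intended padding convention (the `pad_to_length` helper and the sample batch are illustrative, not from `qa_dataset.py`): labels keep -100, which PyTorch's `CrossEntropyLoss` treats as `ignore_index`, while `input_ids` pad with the tokenizer's pad token id (0 for T5) and `attention_mask` pads with 0 so padded positions are masked out.

```python
# Hypothetical helper illustrating the padding fix discussed above.
def pad_to_length(seq, length, pad_value):
    """Right-pad a list of ids to the given length with pad_value."""
    return seq + [pad_value] * (length - len(seq))

IGNORE_INDEX = -100  # loss is ignored at these label positions
PAD_TOKEN_ID = 0     # T5's pad token id; use 0, not -100, for inputs

# Toy batch with made-up token ids, just to show the shapes.
batch = [
    {"input_ids": [101, 7, 42], "attention_mask": [1, 1, 1], "labels": [7, 42]},
    {"input_ids": [101, 9],     "attention_mask": [1, 1],    "labels": [9]},
]
max_len = max(len(ex["input_ids"]) for ex in batch)

padded = {
    # Inputs and mask pad with 0 so the model ignores padded positions.
    "input_ids": [pad_to_length(ex["input_ids"], max_len, PAD_TOKEN_ID)
                  for ex in batch],
    "attention_mask": [pad_to_length(ex["attention_mask"], max_len, 0)
                       for ex in batch],
    # Only labels pad with -100, so padded positions contribute no loss.
    "labels": [pad_to_length(ex["labels"], max_len, IGNORE_INDEX)
               for ex in batch],
}
```

With this scheme, `padded["input_ids"][1]` becomes `[101, 9, 0]` while `padded["labels"][1]` becomes `[9, -100, -100]`: the pad token 0 is a real vocabulary id the model can embed, whereas -100 exists only as a loss mask and would crash an embedding lookup if used in `input_ids`.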

hejunqing avatar Jan 15 '24 11:01 hejunqing