
Set pad_token_id to tokenizer.eos_token_id if not set on command line

patrickhwood opened this issue 2 years ago · 0 comments

The hf_chat.py program emits this warning before each chat response:

The attention mask and the pad token id were not set. As a consequence, you may observe unexpected behavior. Please pass your input's attention_mask to obtain reliable results. Setting pad_token_id to eos_token_id:0 for open-end generation.

Fixed by setting pad_token_id to tokenizer.eos_token_id if not set on the command line.
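A minimal sketch of the fallback described above (a hypothetical helper, not the actual hf_chat.py code): if the user did not supply pad_token_id on the command line, fall back to the tokenizer's eos_token_id so generate() stops emitting the warning.

```python
# Hypothetical helper illustrating the fix in this issue; the function
# name and argument names are illustrative, not from llm-foundry.

def resolve_pad_token_id(cli_pad_token_id, tokenizer_eos_token_id):
    """Return the pad token id to pass to generate().

    Prefer the value given on the command line; otherwise fall back
    to the tokenizer's eos_token_id, as the fix describes.
    """
    if cli_pad_token_id is not None:
        return cli_pad_token_id
    return tokenizer_eos_token_id
```

In a real script this would feed into the generation call, e.g. `model.generate(..., pad_token_id=resolve_pad_token_id(args.pad_token_id, tokenizer.eos_token_id))`.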

patrickhwood · May 12, 2023