[Bug] Setting `pad_token_id` to `eos_token_id`:151645 for open-end generation.
Checklist
- [x] 1. I have searched related issues but cannot get the expected help.
- [x] 2. The bug has not been fixed in the latest version.
- [x] 3. Please note that if the bug-related issue you submitted lacks corresponding environment info and a minimal reproducible demo, it will be challenging for us to reproduce and resolve the issue, reducing the likelihood of receiving feedback.
Describe the bug
Thanks for your nice work on InternVL3. I found that transformers emits the following warning. How can I resolve it?
Reproduction
I directly use lmm-evals with its model file `Internvl2` to run your latest InternVL3-8B checkpoint.
Environment
torch 2.6.0
torchvision 0.21.0
tqdm 4.67.1
tqdm-multiprocess 0.0.11
traitlets 5.14.3
transformers 4.52.0.dev0
Error traceback
Hi, this shouldn't affect normal generation. If the model repeats itself or outputs `<im_end>`, try adding:

```json
"eos_token_id": [
  151645,
  151643
]
```

to your generation_config.json
Thanks a lot
You can also use this one:

```python
generation_config = dict(
    max_new_tokens=1024,
    do_sample=True,
    eos_token_id=151645,
    pad_token_id=151645,
)
```