llava_injection.py fails to run
Hello, I find that the script llava_injection.py adds new tokens to the tokenizer, but when I run the script the addition does not take effect on the model, which leads to a runtime error. For example, DEFAULT_IMAGE_PATCH_TOKEN is assigned id 32000 while the model's vocabulary size is still 32000 (so the maximum valid id is 31999), which results in an indexSelectLargeIndex error. How do you solve this problem?
Hi, did you manage to solve this issue?
I manually added the tokens before the running step, but the final result may be a little different.
Could you possibly share a quick and simple solution/script? Thanks for your help!
I am also running into this problem; could you share a solution/script? Thank you very much!
Would it be possible to share a solution? Really appreciate it.