Simon Mo
I don't think the file is included in the test yaml.
This is erroring in CI https://buildkite.com/vllm/ci/builds/6751#018f529b-4a51-431a-b1d0-0a4daeb8d49f/319-360

```
35%|███▌ | 7/20 [00:04
```
```
Processed prompts: 100%|██████████| 1000/1000 [01:02
```
For a string prompt, we do not hash it incrementally. For example (for illustrative purposes), the string ABCD is hashed as `hash("A"), hash("AB"), hash("ABC"), hash("ABCD")` to produce 4 hash values, which are then used...
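A minimal sketch of the idea, assuming a generic hash function and a hypothetical `prefix_hashes` helper (the names are illustrative, not vLLM's actual API):

```python
import hashlib

def prefix_hashes(prompt: str) -> list[str]:
    """Hash every prefix of the prompt from scratch (no rolling/incremental hash).

    For "ABCD" this yields hashes of "A", "AB", "ABC", "ABCD" -- 4 values total.
    """
    hashes = []
    for end in range(1, len(prompt) + 1):
        prefix = prompt[:end]
        # Each prefix is hashed independently; the previous hash is not reused.
        hashes.append(hashlib.sha256(prefix.encode("utf-8")).hexdigest())
    return hashes

print(prefix_hashes("ABCD"))  # 4 hash values, one per prefix
```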
Would this PR fix it? https://github.com/vllm-project/vllm/pull/4468
@rkooo567 plz push
Thank you for your contribution. I will close this in favor of #2531
@tdoublep can you help review this?
nice! i didn't even know this schema existed in the first place
In these cases:

> models/tokenizer can be private, or lm-evaluation-harness and vllm can be in the same network but without internet access.

wouldn't the user of lm-evaluation-harness be able to...