
[Bugfix] add truncate_prompt_tokens to work offline, directly from LLM class.

Open yecohn opened this issue 1 year ago • 2 comments

Fixes #4507.

  • Added a `_validate_prompt` function to check that the prompt is in the right format and to run some validations.
  • `truncate_prompt_tokens` should truncate the prompt from the left, as done in vllm/entrypoints/openai/serving_engine.py line 182.
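The left-truncation semantics described above can be sketched as follows. This is an illustrative example of the intended behavior, not the actual vLLM implementation; the helper name is hypothetical.

```python
from typing import List, Optional

def truncate_left(token_ids: List[int],
                  truncate_prompt_tokens: Optional[int]) -> List[int]:
    """Keep only the last `truncate_prompt_tokens` tokens (left truncation).

    Hypothetical helper mirroring the behavior of the online server:
    when the prompt is longer than the limit, the earliest tokens are
    dropped so the most recent context is preserved.
    """
    if truncate_prompt_tokens is None:
        return token_ids
    return token_ids[-truncate_prompt_tokens:]

# Example: a 5-token prompt truncated to its last 3 tokens.
print(truncate_left([10, 11, 12, 13, 14], 3))  # [12, 13, 14]
```

Truncating from the left (rather than the right) keeps the tokens closest to the generation point, which is usually what chat-style prompts need.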

yecohn avatar May 04 '24 15:05 yecohn

@tdoublep can you help review this?

simon-mo avatar May 04 '24 20:05 simon-mo

@simon-mo sure, will try to get to that later today

tdoublep avatar May 06 '24 07:05 tdoublep