
Metrics model name when using multiple loras

Open · mces89 opened this issue 3 months ago · 1 comment

Your current environment

The output of `python collect_env.py`
Your output of `python collect_env.py` here

Model Input Dumps

No response

🐛 Describe the bug

Hi, I currently serve multiple LoRA adapters with different names such as "lora1" and "lora2", all sharing the same base model "base". When I scrape the metrics endpoint, every metric's model name still shows up as "base". Would it be possible to update the metrics to report the actual LoRA adapter names instead of the base model name? Thanks.
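For context, here is a minimal sketch of how the behavior can be observed. It assumes a vLLM OpenAI-compatible server launched with LoRA adapters (e.g. via `--enable-lora --lora-modules lora1=<path> lora2=<path>`) exposing Prometheus metrics at `/metrics`; the exact label name (`model_name`) may vary by vLLM version:

```python
# Sketch (assumptions): a vLLM OpenAI-compatible server started roughly as
#   vllm serve <base-model> --enable-lora \
#       --lora-modules lora1=<path-to-lora1> lora2=<path-to-lora2>
# with Prometheus metrics exposed at http://localhost:8000/metrics.
# The "model_name" label is an assumption and may differ across versions.
import re
import requests

METRICS_URL = "http://localhost:8000/metrics"  # adjust host/port as needed

text = requests.get(METRICS_URL, timeout=10).text

# Collect every model_name label value that appears in the scraped metrics.
model_names = set(re.findall(r'model_name="([^"]*)"', text))
print(model_names)
# Reported behavior: only the base model name appears here, even when
# requests were routed to the "lora1" / "lora2" adapters.
```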

Before submitting a new issue...

  • [X] Make sure you already searched for relevant issues, and asked the chatbot living at the bottom right corner of the documentation page, which can answer lots of frequently asked questions.

mces89 · Nov 20 '24, 23:11