vllm
Metrics model name when using multiple loras
Your current environment
The output of `python collect_env.py`
Your output of `python collect_env.py` here
Model Input Dumps
No response
🐛 Describe the bug
Hi, currently when I serve multiple LoRA adapters with different names such as "lora1" and "lora2" that share the same base model "base", all metrics scraped from the metrics endpoint still report the model name as "base". Would it be possible to update the metrics to reflect the actual LoRA adapter names instead of the base model name? Thanks.
Before submitting a new issue...
- [X] Make sure you already searched for relevant issues, and asked the chatbot living at the bottom right corner of the documentation page, which can answer lots of frequently asked questions.