Add multi-LoRA support
In addition to the last pull request, you can now also use multiple LoRA adapters.
You can pass a single adapter as input (as before):
```json
{"name": "xxx", "path": "xxx/xxxxx", "base_model_name": "xxx/xxxx"}
```
Or a list of adapters:
```json
[
  {"name": "xxx", "path": "xxx/xxxxx", "base_model_name": "xxx/xxxx"},
  {"name": "xxx", "path": "xxx/xxxxx", "base_model_name": "xxx/xxxx"},
  ...
]
```
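For reference, here is a minimal sketch of how an input like this could be normalized so that downstream code always sees a list. The function name `parse_lora_adapters` and the validation logic are illustrative assumptions, not the worker's actual implementation:

```python
import json


def parse_lora_adapters(raw):
    """Accept either a single adapter object or a list of adapter
    objects (as a JSON string or already-parsed value) and always
    return a list of adapter dicts."""
    parsed = json.loads(raw) if isinstance(raw, str) else raw
    adapters = parsed if isinstance(parsed, list) else [parsed]
    for adapter in adapters:
        # Each adapter needs these keys; fail early if one is missing.
        for key in ("name", "path", "base_model_name"):
            if key not in adapter:
                raise ValueError(f"adapter missing required key: {key}")
    return adapters


# Both input shapes normalize to a list of adapters.
single = '{"name": "a", "path": "org/a", "base_model_name": "org/base"}'
multi = ('[{"name": "a", "path": "org/a", "base_model_name": "org/base"},'
         ' {"name": "b", "path": "org/b", "base_model_name": "org/base"}]')
print(len(parse_lora_adapters(single)))  # 1 adapter
print(len(parse_lora_adapters(multi)))   # 2 adapters
```

Normalizing to a list up front keeps the single-adapter input backward compatible while letting the rest of the code iterate over adapters uniformly.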
Thanks @pandyamarut for merging the last LoRA adapter pull request. I just added a small update to also include multi-LoRA support.
Hey, mind writing some simple examples for the readme.md too? It would be nice if there were an example usage.
Waiting for the merge, please. It seems the LoRA support has been broken by some conflicts since v1.9: https://github.com/runpod-workers/worker-vllm/blob/6fc770415def3b65ddf8c3a80a8b36fb1454f8e7/src/engine.py#L145-L151
Should be working now with the new version.