
[Bug] For VLMs, are multiple LoRA adapters supported?

Open Amber-Believe opened this issue 8 months ago • 5 comments

Checklist

  • [x] 1. I have searched related issues but cannot get the expected help.
  • [ ] 2. The bug has not been fixed in the latest version.
  • [ ] 3. Please note that if the bug-related issue you submitted lacks corresponding environment info and a minimal reproducible demo, it will be challenging for us to reproduce and resolve the issue, reducing the likelihood of receiving feedback.

Describe the bug

For VLMs, are multiple LoRA adapters supported? I see that multiple LoRA adapters are supported for LLMs: when calling the API, selecting the model name loads the corresponding adapter. I would like to know whether the same is supported for VLMs.
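
For reference, the LLM-side workflow described above roughly looks like the sketch below, based on lmdeploy's multi-LoRA serving with the PyTorch backend; the adapter name, paths, and port here are placeholders, not values from this issue.

```python
# Server side (shell), registering a LoRA adapter under a name:
#   lmdeploy serve api_server <base_model_path> \
#       --backend pytorch \
#       --adapters finance_lora=/path/to/finance_lora
#
# Client side: the adapter is selected by passing its registered name as the model.
from openai import OpenAI

client = OpenAI(base_url="http://0.0.0.0:23333/v1", api_key="none")
resp = client.chat.completions.create(
    model="finance_lora",  # adapter name registered via --adapters
    messages=[{"role": "user", "content": "Hello"}],
)
print(resp.choices[0].message.content)
```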

Reproduction

See the question above.

Environment

See the question above.

Error traceback


Amber-Believe avatar Apr 17 '25 01:04 Amber-Believe

Do you mean that LoRA was added to the vision encoder? That is not supported yet.

lvhan028 avatar Apr 17 '25 07:04 lvhan028

Do you mean that LoRA was added to the vision encoder? That is not supported yet.

I used LoRA to train capabilities for several domains, so there is one adapter per domain. I want to load a different adapter depending on the scenario and use it through API calls. Is this supported (for multimodal models)?
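
For context, here is a minimal sketch of how this per-scenario adapter selection works for LLMs with lmdeploy's PyTorch engine Python API, assuming the `PytorchEngineConfig.adapters` / `GenerationConfig.adapter_name` interface documented for LLMs; whether a VLM pipeline accepts the same configuration is exactly the open question here. Adapter names, paths, and the prompt are placeholders.

```python
from lmdeploy import pipeline, GenerationConfig, PytorchEngineConfig

# Register one adapter per domain/scenario (paths are placeholders).
backend_config = PytorchEngineConfig(adapters={
    "medical": "/path/to/medical_lora",
    "finance": "/path/to/finance_lora",
})
pipe = pipeline("<base_model_path>", backend_config=backend_config)

# Select the adapter per request by its registered name.
resp = pipe(["What drives a stock index?"],
            gen_config=GenerationConfig(adapter_name="finance"))
print(resp)
```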

Amber-Believe avatar Apr 17 '25 08:04 Amber-Believe

It is somewhat like dynamic LoRA: LoRA adapters are configured at runtime through a dedicated API endpoint, so the served model can be switched on the fly.

Amber-Believe avatar Apr 17 '25 09:04 Amber-Believe

cc @grimoire

lvhan028 avatar Apr 17 '25 09:04 lvhan028

If the adapter modifies the vision part, that is not supported yet.

grimoire avatar Apr 18 '25 08:04 grimoire