vllm
[Feature]: add DoRA support
🚀 The feature, motivation and pitch
The recent ICML'24 Oral paper, DoRA, has shown consistent improvements over LoRA across various tasks (LLM, MLLM, etc.) and backbones (LLaMA, LLaVA, etc.). [Code] https://github.com/NVlabs/DoRA
DoRA has also been integrated into various open-source libraries and frameworks:
- Hugging Face: peft, diffusers
- Apple: MLX
- Meta: torchtune
- NVIDIA: NeMo
- Unsloth AI: unsloth
- Answer.AI: QDoRA + FSDP
- LLaMA-Factory
- LyCORIS
- ..., etc.
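For context on what "DoRA support" entails: DoRA decomposes a pretrained weight into a per-column magnitude and a direction, applies the LoRA-style low-rank update to the direction, and renormalizes. A minimal NumPy sketch of the merged weight, following the formula from the DoRA paper (the variable names and toy shapes here are illustrative, not vLLM's implementation):

```python
import numpy as np

def dora_merge(W0, B, A, m):
    """Merged DoRA weight: W' = m * (W0 + B @ A) / ||W0 + B @ A||_c,
    where ||.||_c is the column-wise norm and m is a learned
    per-column magnitude vector."""
    V = W0 + B @ A                                    # low-rank update, as in LoRA
    norm = np.linalg.norm(V, axis=0, keepdims=True)   # column-wise norms
    return m * (V / norm)                             # rescale direction by magnitude

# toy shapes: d_out x d_in weight, rank-r adapters (hypothetical sizes)
rng = np.random.default_rng(0)
d_out, d_in, r = 8, 4, 2
W0 = rng.standard_normal((d_out, d_in))
B = np.zeros((d_out, r))                              # B starts at zero, like LoRA
A = rng.standard_normal((r, d_in))
m = np.linalg.norm(W0, axis=0, keepdims=True)         # m initialized to ||W0||_c

W_merged = dora_merge(W0, B, A, m)
# with B = 0 and m = ||W0||_c, the merged weight equals W0
assert np.allclose(W_merged, W0)
```

With `B = 0` at initialization, the merged weight reduces to the original `W0`, which is why DoRA (like LoRA) starts training from the pretrained model's behavior.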
Alternatives
No response
Additional context
No response
Before submitting a new issue...
- [X] Make sure you already searched for relevant issues, and asked the chatbot living at the bottom right corner of the documentation page, which can answer lots of frequently asked questions.
Thank you very much for bringing up this feature. We will consider supporting it.
I also need vLLM to support DoRA!
We need this feature too, this will help a lot!
@jeejeelee any updates on whether this is on the roadmap?
@jeejeelee https://x.com/winglian/status/1888951180606202028 GRPO + DoRA converges faster than GRPO + FFT or GRPO + LoRA (thanks @winglian for the great finding!)
Thanks, will start trying to support DoRA soon
Hi, this is my first time contributing to vLLM, and to open source in general, but I've created this PR for adding DoRA support: https://github.com/vllm-project/vllm/pull/14389
Thanks for your patience with my open-source development practices, and I look forward to learning with you!
This issue has been automatically marked as stale because it has not had any activity within 90 days. It will be automatically closed if no further activity occurs within 30 days. Leave a comment if you feel this issue should remain open. Thank you!
This issue has been automatically closed due to inactivity. Please feel free to reopen if you feel it is still relevant. Thank you!
@jeejeelee just curious: any plan or timeline for the support?
I have considered this issue before. A rather tricky problem is that the TP > 1 (tensor parallel) case is not easy to handle.
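A sketch of why tensor parallelism complicates DoRA (my reading of the difficulty, not vLLM's actual implementation): the column-wise norm ||W0 + B @ A||_c needs every row of a column, but a row-parallel layer shards rows across GPUs, so each rank only holds a partial sum of squares and a cross-rank reduction is required. Simulated here with NumPy array splits standing in for GPU shards:

```python
import numpy as np

rng = np.random.default_rng(0)
V = rng.standard_normal((8, 4))          # full (unsharded) W0 + B @ A

# 2-way row sharding: each "rank" sees only half the rows of every column
shards = np.split(V, 2, axis=0)
partial_sq = [np.sum(s ** 2, axis=0) for s in shards]  # per-rank partial sums
norm_tp = np.sqrt(sum(partial_sq))       # stand-in for an all-reduce across ranks

norm_full = np.linalg.norm(V, axis=0)    # single-GPU reference
assert np.allclose(norm_tp, norm_full)   # matches only after combining partials
```

No single shard can compute the norm on its own, so a DoRA layer under TP needs an extra collective (or a cached norm) that plain LoRA does not.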
@jeejeelee should I mark this as keep-open so the bot leaves it alone?
Let's keep it open, thank you
Hello, do you know if DoRA is supported at this moment? I still get `ValueError: vLLM does not yet support DoRA.` on vllm==0.11.0