
[Bugfix] Fix JambaForCausalLM LoRA

Open jeejeelee opened this pull request 9 months ago • 4 comments

Reported by @varun-sundar-rabindranath

The current CI tests for Jamba LoRA are being skipped, so its issues went undetected until now. Thanks to @varun-sundar-rabindranath for testing. When testing locally with 4 GPUs, I found two problems:

    1. Some layers may access the `weight` property, so we need to add a `weight` property to the LoRA layers (see the sketch after this list).
    2. The LoRA weights in the Jamba LoRA test include MoE layers, which we do not currently support, so we are removing the Jamba LoRA test script.
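For context on point 1, here is a minimal sketch of the pattern: a LoRA wrapper that delegates `weight` to the layer it wraps. The class name, fields, and rank default are illustrative, not vLLM's actual LoRA layer API:

```python
import torch
import torch.nn as nn


class LinearWithLoRA(nn.Module):
    """Hypothetical LoRA wrapper; names are illustrative, not vLLM's API."""

    def __init__(self, base_layer: nn.Linear, rank: int = 8):
        super().__init__()
        self.base_layer = base_layer
        self.lora_a = nn.Parameter(torch.zeros(base_layer.in_features, rank))
        self.lora_b = nn.Parameter(torch.zeros(rank, base_layer.out_features))

    @property
    def weight(self) -> torch.Tensor:
        # Delegate to the wrapped layer so callers that expect a plain
        # nn.Linear (and access `.weight` directly) keep working after
        # the layer has been swapped for its LoRA-enabled counterpart.
        return self.base_layer.weight

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.base_layer(x) + (x @ self.lora_a) @ self.lora_b
```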

jeejeelee avatar Mar 06 '25 16:03 jeejeelee

👋 Hi! Thank you for contributing to the vLLM project.

💬 Join our developer Slack at https://slack.vllm.ai to discuss your PR in #pr-reviews, coordinate on features in #feat- channels, or join special interest groups in #sig- channels.

Just a reminder: PRs will not trigger a full CI run by default. Instead, they only run the fastcheck CI, which runs a small and essential subset of CI tests to quickly catch errors. You can run other CI tests on top of those by going to your fastcheck build on the Buildkite UI (linked in the PR checks section) and unblocking them. If you do not have permission to unblock, ping simon-mo or khluu to add you to our Buildkite org.

Once the PR is approved and ready to go, your PR reviewer(s) can run CI to test the changes comprehensively before merging.

To run full CI, PR reviewers can either add the ready label to the PR or enable auto-merge.

🚀

github-actions[bot] avatar Mar 06 '25 16:03 github-actions[bot]

cc @tlrmchlsmth

DarkLight1337 avatar Mar 06 '25 16:03 DarkLight1337


> Thanks @jeejeelee. Can you also update the "supported models" page for LoRA please.
>
> https://github.com/vllm-project/vllm/blob/9f1710f1ace3535920c0bb6d4cc329c36289080e/docs/source/models/supported_models.md?plain=1#L339

This model supports LoRA; it's just that the MoE layers don't support LoRA yet, so we'll keep the docs entry as it is for now.

jeejeelee avatar Mar 07 '25 01:03 jeejeelee
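Since MoE LoRA support is still pending, a hedged sketch of how a test could detect and skip adapters that target MoE weights; the `MOE_MODULE_HINTS` substrings and the helper name are assumptions, not vLLM's actual test logic:

```python
# Hypothetical helper: decide whether a LoRA adapter touches MoE expert
# weights, which (per this PR) vLLM's LoRA support does not cover yet.
MOE_MODULE_HINTS = ("experts", "router", "gate")  # illustrative substrings

def adapter_targets_moe(lora_module_names: list[str]) -> bool:
    return any(hint in name
               for name in lora_module_names
               for hint in MOE_MODULE_HINTS)

# Usage: skip such adapters in tests until MoE LoRA lands.
assert adapter_targets_moe(["model.layers.0.moe.experts.0.w1"])
assert not adapter_targets_moe(["model.layers.0.self_attn.qkv_proj"])
```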