When will the vllm PR be merged to the main branch?
Thank you for your impressive work on this project. I'm eager to try this model, but I've noticed that the vllm deployment pull request has conflicts with the main branch, and building vllm from scratch is challenging for my development environment.
Is there an active effort to resolve these conflicts and merge the PR into the main branch? If possible, could you provide an estimated timeline for this merge? I greatly appreciate your work and look forward to using this implementation. Thank you for your time.
I'm as eager as you are to try this model! But this PR has been pending for a while now. @zwd003 could you do us a favor?
+1
same here.
It's merged, but it doesn't work.
vllm v0.5.1 supports deepseek v2
> vllm v0.5.1 supports deepseek v2

Are you using the 236b model or the lite one?
> > vllm v0.5.1 supports deepseek v2
>
> Are you using the 236b model or the lite one?

Not tested yet.
> > vllm v0.5.1 supports deepseek v2
>
> Are you using the 236b model or the lite one?

Used 236b, success.
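For anyone else landing here, a minimal sketch of serving the model with vLLM ≥ 0.5.1 via its OpenAI-compatible server. The model name, parallelism, and context length below are assumptions for illustration; adjust them for your hardware and checkpoint.

```shell
# Sketch: launch vLLM's OpenAI-compatible server for DeepSeek-V2 (vLLM >= 0.5.1).
# --trust-remote-code is needed because the model ships custom modeling code.
# Tensor-parallel size and max model length are placeholders; tune for your GPUs.
python -m vllm.entrypoints.openai.api_server \
    --model deepseek-ai/DeepSeek-V2-Lite \
    --trust-remote-code \
    --tensor-parallel-size 1 \
    --max-model-len 4096
```

The 236b variant generally needs multiple GPUs (e.g. `--tensor-parallel-size 8`), while the Lite variant can fit on a single large GPU.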