
[New Model]: support Ovis VLM series

Open devops724 opened this issue 10 months ago • 5 comments

🚀 The feature, motivation and pitch

Hi there, I can't load the AIDC-AI/Ovis2-34B model (https://huggingface.co/AIDC-AI/Ovis2-34B). This model outperforms even Qwen 2.5 VL 72B in benchmarks. Currently I get this error:

ERROR 02-17 19:45:45 engine.py:389] ValueError: Ovis has no vLLM implementation and the Transformers implementation is not compatible with vLLM.
DEBUG 02-17 19:45:45 client.py:256] Shutting down MQLLMEngineClient output handler.

Can we add support for this model, please? Best regards.
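The ValueError above is raised because the `Ovis` architecture declared in the model's `config.json` is not registered in vLLM's model registry. As a minimal, stdlib-only sketch of the kind of pre-flight check one could run before loading (the supported-architectures set below is a small hypothetical subset for illustration, not vLLM's actual registry):

```python
import json

# Hypothetical subset of architectures for illustration only;
# the real list lives in vLLM's internal model registry.
SUPPORTED_ARCHITECTURES = {
    "Qwen2_5_VLForConditionalGeneration",  # qwen2.5-vl, supported per this thread
    "LlamaForCausalLM",
}

def check_vllm_support(config_json: str) -> bool:
    """Return True if any architecture in the HF config is in the supported set."""
    config = json.loads(config_json)
    return any(arch in SUPPORTED_ARCHITECTURES
               for arch in config.get("architectures", []))

# The Ovis2 config declares the "Ovis" architecture, which is not in the
# registry at the time of this issue, hence the ValueError when loading.
ovis_config = '{"architectures": ["Ovis"]}'
print(check_vllm_support(ovis_config))  # False
```

This only mirrors the check vLLM performs at engine startup; actual support requires a vLLM implementation of the architecture, which is what this issue requests.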

Alternatives

No response

Additional context

No response

Before submitting a new issue...

  • [x] Make sure you already searched for relevant issues, and asked the chatbot living at the bottom right corner of the documentation page, which can answer lots of frequently asked questions.

devops724 avatar Feb 18 '25 00:02 devops724

The same question, but I am using qwen2.5-vl.

myg133 avatar Feb 18 '25 04:02 myg133

> The same question, but I am using qwen2.5-vl.

The qwen2.5-vl model is already supported in the latest version.

devops724 avatar Feb 18 '25 17:02 devops724

> The same question, but I am using qwen2.5-vl.

> The qwen2.5-vl model is already supported in the latest version.

Thanks, it was caused by some other configuration on my side; my mistake.

myg133 avatar Feb 20 '25 01:02 myg133

Is there an estimate of when this will be made compatible?

sukkritsharmaofficial avatar Feb 24 '25 21:02 sukkritsharmaofficial

I have the same problem, but with Ovis2-16B and Ovis2-34B.

https://github.com/vllm-project/vllm/issues/14115

jieguolove avatar Mar 03 '25 04:03 jieguolove

I ran into the same problem when loading "ranchlai/chatglm3-6B-gptq-4bit"; any suggestions?

foin6 avatar Apr 21 '25 08:04 foin6

Closing as completed by #15826

DarkLight1337 avatar May 10 '25 16:05 DarkLight1337