[New Model]: support Ovis VLM series
🚀 The feature, motivation and pitch
Hi there, I can't load the AIDC-AI/Ovis2-34B model (https://huggingface.co/AIDC-AI/Ovis2-34B). This model outperforms even Qwen 2.5 VL 72B in benchmarks. Currently I get this error:

```
ERROR 02-17 19:45:45 engine.py:389] ValueError: Ovis has no vLLM implementation and the Transformers implementation is not compatible with vLLM.
DEBUG 02-17 19:45:45 client.py:256] Shutting down MQLLMEngineClient output handler.
```

Can we add support for this model, please? Best regards.
Alternatives
No response
Additional context
No response
Before submitting a new issue...
- [x] Make sure you already searched for relevant issues, and asked the chatbot living at the bottom right corner of the documentation page, which can answer lots of frequently asked questions.
The same question, but I am using qwen2.5-vl.
The qwen2.5-vl model is already supported in the latest version.
Thanks, it was caused by some other configuration on my end. My mistake.
Is there an estimate for when this will be made compatible?
The same problem, but with Ovis2-16B and Ovis2-34B.
https://github.com/vllm-project/vllm/issues/14115
I met the same problem when loading "ranchlai/chatglm3-6B-gptq-4bit"; any suggestions?
Closing as completed by #15826