[New Model]: allenai/Molmo-7B-O-0924 VisionLM
The model to consider.
https://huggingface.co/allenai/Molmo-7B-O-0924 https://huggingface.co/collections/allenai/molmo-66f379e6fe3b8ef090a8ca19
The closest model vllm already supports.
Existing OLMo models by Allen AI: OLMoForCausalLM and OLMoEForCausalLM are supported.
What's your difficulty of supporting the model you want?
Molmo is a vision-language model, so unlike the previous OLMo models from Allen AI, it includes a vision component.
Before submitting a new issue...
- [X] Make sure you already searched for relevant issues, and asked the chatbot living at the bottom right corner of the documentation page, which can answer lots of frequently asked questions.
+1
Of note, Molmo-72B-0924 seems to be benchmarking as the SOTA open-source model, beating many closed models. It also performs much better than the Llama 3.2 models. It would be great to have this model supported.
More information here:
https://huggingface.co/allenai/Molmo-72B-0924 https://molmo.allenai.org/blog https://molmo.allenai.org/paper.pdf
+1 If it's not already done by this weekend, I can try to handle it then.
How do we get the ONNX version of the model?
How can we do it ourselves?
We're a bit overwhelmed by things to work on, so any help/contribution is definitely welcome! Supporting this model should be straightforward since it's also LLaVA-style, like many other VLMs we support today.
If anyone decides to make a PR to support this model, please ping me directly for review once it's ready!
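For reference, here is a minimal sketch of what offline inference could look like once the architecture is registered in vLLM, modeled on the existing LLaVA-style multimodal examples. The prompt/image-placeholder format below is an assumption; the actual template will depend on Molmo's processor once the PR lands.

```python
# Sketch only: assumes Molmo has been registered in vLLM as a LLaVA-style VLM.
# The prompt template with "<image>" is a placeholder borrowed from other VLMs
# and may differ for Molmo's processor/chat template.
from PIL import Image
from vllm import LLM, SamplingParams

llm = LLM(
    model="allenai/Molmo-7B-O-0924",  # HF repo id from this issue
    trust_remote_code=True,           # Molmo ships custom code on the Hub
)

image = Image.open("example.jpg")
prompt = "USER: <image>\nDescribe this image. ASSISTANT:"  # assumed template

outputs = llm.generate(
    {
        "prompt": prompt,
        "multi_modal_data": {"image": image},
    },
    SamplingParams(max_tokens=128, temperature=0.2),
)
print(outputs[0].outputs[0].text)
```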
Hello, I'm with the Molmo team at Ai2. We'll soon be adding our models to vLLM, so stay tuned!
Nice, will that include molmoe?
yes
@mrsalehi Thank you! Do you know when approximately?
And when will you release the dataset?
Most likely today or tomorrow.
Is support for this model included in release 0.6.2?
@premg16 0.6.2 has already been released, so no, but we will make a new release when this model is supported by vLLM!
Very excited for the Molmo integration! Let us know if there's anything we can do to help.
https://github.com/vllm-project/vllm/pull/9016#issue
@mrsalehi Thank you for the vLLM implementation :)
When will you release the datasets?
@mrsalehi will there be a release for molmoe?