
new model support

Open simplew2011 opened this issue 3 months ago • 4 comments

  • [ ] DeepseekV3
  • [x] Qwen3-30B-A3B
  • [ ] Qwen3-Next-80B-A3B-Instruct
  • [x] Qwen3_VL

simplew2011 avatar Sep 25 '25 11:09 simplew2011

Thanks for reaching out and suggesting these new models. Here is the current status:

  • DeepseekV3: This model is too large for on-device deployment, so we will not be supporting it for the time being.
  • Qwen3-30B-A3B: This model is already supported; you can use it directly (a minimal export sketch follows below).
  • Qwen3-Next-80B-A3B-Instruct & Qwen3_VL: Support for these models is planned. We will prioritize their integration as soon as smaller, official parameter versions are released.

We will keep this issue updated with our progress.
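
For reference, here is a minimal sketch of exporting the already-supported Qwen3-30B-A3B to MNN, assuming a local Hugging Face-format checkout of the model and the usual `llmexport --path ... --export mnn` invocation; the directory name and flag set are illustrative, so check `llmexport --help` for the version you have installed.

```python
# Minimal sketch: drive the llmexport CLI from Python to convert a local
# Qwen3-30B-A3B checkout to MNN. The model directory name is an assumption,
# and the flags shown are the common ones; verify against `llmexport --help`.
import subprocess
from pathlib import Path

model_dir = Path("Qwen3-30B-A3B")  # assumed: local HF-format model directory

# `--path` points at the model directory, `--export mnn` selects the MNN backend.
subprocess.run(
    ["llmexport", "--path", str(model_dir), "--export", "mnn"],
    check=True,
)
```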

wangzhaode avatar Sep 25 '25 12:09 wangzhaode


That would be great, especially for the smaller VL models. This week Qwen released some very cool small VL models. I tried converting them myself, but I couldn't manage it: they have some custom components I'm not familiar with, to be honest. DeepStack, Interleaved-MRoPE, etc. I don't know if they have ONNX equivalents... :(

altunenes avatar Oct 22 '25 22:10 altunenes

Qwen3-VL has been supported since v0.0.4.

MNN Models:

https://huggingface.co/collections/taobao-mnn/qwen3-vl-mnn
https://modelscope.cn/collections/Qwen3-VL-MNN-f4da0cedb82847
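
For convenience, here is a minimal sketch of pulling one of the pre-converted MNN models from the Hugging Face collection above with `huggingface_hub`; the `repo_id` shown is a hypothetical example, so substitute an actual repository listed on the collection page.

```python
# Minimal sketch: download a pre-converted Qwen3-VL MNN model from the
# taobao-mnn Hugging Face collection linked above.
from huggingface_hub import snapshot_download

local_path = snapshot_download(
    repo_id="taobao-mnn/Qwen3-VL-4B-Instruct-MNN",  # hypothetical name; pick a real repo from the collection
    local_dir="qwen3-vl-mnn",                        # where the model files will be placed
)
print("Model files downloaded to:", local_path)
```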

wangzhaode avatar Oct 23 '25 01:10 wangzhaode

amazing! thank you :-)

altunenes avatar Oct 23 '25 08:10 altunenes