
Fail to export Qwen3-Omni as onnx model

Open · L1Zhichao opened this issue 2 months ago · 2 comments

Exporting raises a ValueError:

```
ValueError: Unrecognized configuration class <class 'transformers.models.qwen3_omni_moe.configuration_qwen3_omni_moe.Qwen3OmniMoeConfig'> for this kind of AutoModel: AutoModelForCausalLM. Model type should be one of ApertusConfig, ArceeConfig, AriaTextConfig, BambaConfig, ...
...
raise RuntimeError("Load model failed for ", model_path)
RuntimeError: ('Load model failed for ', 'Qwen/Qwen3-Omni-30B-A3B-Instruct')
```

Note: I have already deployed the Qwen3-Omni model and successfully run the inference demo. @wangzhaode @cdliang11 @inisis
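For context, the error comes from how transformers' Auto classes dispatch: `AutoModelForCausalLM` keeps a mapping from config classes to model classes, and a config type that is not in that mapping raises exactly this `ValueError`. A minimal sketch of that dispatch pattern (all class and function names here are hypothetical stand-ins, not the real transformers internals):

```python
# Sketch of Auto-class dispatch: a registry maps config classes to model
# classes; looking up an unregistered config raises ValueError, which is
# the failure mode seen with Qwen3OmniMoeConfig above.

class Qwen2Config: ...
class Qwen3OmniMoeConfig: ...  # not registered for causal LM


class Qwen2ForCausalLM:
    def __init__(self, config):
        self.config = config


# Hypothetical registry standing in for the real Auto-class mapping.
CAUSAL_LM_MAPPING = {Qwen2Config: Qwen2ForCausalLM}


def auto_model_for_causal_lm(config):
    """Instantiate the model class registered for this config type."""
    model_cls = CAUSAL_LM_MAPPING.get(type(config))
    if model_cls is None:
        raise ValueError(
            f"Unrecognized configuration class {type(config)} for this "
            f"kind of AutoModel: AutoModelForCausalLM."
        )
    return model_cls(config)
```

So the export fails before any weights are touched: the loader never finds a causal-LM model class registered for `Qwen3OmniMoeConfig`.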

L1Zhichao · Nov 11 '25 08:11

Qwen3-Omni is not supported yet. We expect to add support after the smaller parameter version is released.

wangzhaode · Nov 11 '25 08:11

Nice, @wangzhaode

inisis · Nov 11 '25 10:11