Failed to export Qwen3-Omni as an ONNX model
It raises a ValueError:
Unrecognized configuration class <class 'transformers.models.qwen3_omni_moe.configuration_qwen3_omni_moe.Qwen3OmniMoeConfig'> for this kind of AutoModel: AutoModelForCausalLM. Model type should be one of ApertusConfig, ArceeConfig, AriaTextConfig, BambaConfig, ...
...
    raise RuntimeError("Load model failed for ", model_path)
RuntimeError: ('Load model failed for ', 'Qwen/Qwen3-Omni-30B-A3B-Instruct')
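For context on why the first error appears: AutoModelForCausalLM selects a concrete model class by looking up the configuration class in an internal registry, and Qwen3OmniMoeConfig is not in the causal-LM registry, so dispatch fails before any export can start. A minimal self-contained sketch of this dispatch pattern (all class names below are stand-ins for illustration, not the actual transformers implementation):

```python
# Stand-in config and model classes (hypothetical, for illustration only).
class Qwen3OmniMoeConfig:      # mimics a config with no causal-LM mapping entry
    pass

class LlamaConfig:             # mimics a config that IS registered
    pass

class LlamaForCausalLM:        # mimics the model class mapped to LlamaConfig
    def __init__(self, config):
        self.config = config

# The registry an Auto* class consults: config class -> model class.
_CAUSAL_LM_MAPPING = {LlamaConfig: LlamaForCausalLM}

class AutoModelForCausalLM:
    @classmethod
    def from_config(cls, config):
        model_cls = _CAUSAL_LM_MAPPING.get(type(config))
        if model_cls is None:
            # Unregistered config -> the kind of ValueError shown above.
            raise ValueError(
                f"Unrecognized configuration class {type(config)} "
                f"for this kind of AutoModel: {cls.__name__}."
            )
        return model_cls(config)

# Registered config: dispatch succeeds.
ok_model = AutoModelForCausalLM.from_config(LlamaConfig())
print(type(ok_model).__name__)  # -> LlamaForCausalLM

# Unregistered config: dispatch raises ValueError.
try:
    AutoModelForCausalLM.from_config(Qwen3OmniMoeConfig())
except ValueError as err:
    caught = err
print(caught)
```

In the real library, Qwen3-Omni is a multimodal model, so even once supported it would likely load through a dedicated model class rather than AutoModelForCausalLM.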
Note: I have already deployed the Qwen3-Omni model and successfully run the inference demo. @wangzhaode @cdliang11 @inisis
Qwen3-Omni is not supported yet. We expect to add support after a smaller-parameter version is released.
Nice avatar @wangzhaode