Compatibility with qwen3 think and no_think modes
I'll label this as a good first issue and see if anyone is interested in contributing. The end result would be a config example file for setting up qwen3 models.
Examples are here: https://github.com/Canner/WrenAI/tree/main/wren-ai-service/docs/config_examples
Since qwen3 supports both thinking and non-thinking modes within the same model, how should it be configured? @cyyeh
@flyrun9527
We are using litellm under the hood, so I suggest you check if litellm supports qwen3 models first.
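If it does, a quick way to try it out is to point the llm section of one of the config examples at a qwen3 model string that litellm accepts. A rough sketch, assuming a locally hosted model via Ollama (the schema is copied loosely from the other config examples and the model tag is an assumption, so double-check against them):

```yaml
# Hypothetical sketch only -- follow the linked config examples for the exact schema.
type: llm
provider: litellm_llm
models:
  # litellm's ollama_chat/ prefix targets a locally served Ollama model;
  # the qwen3:8b tag is an assumption -- use whichever tag you actually pulled.
  - model: ollama_chat/qwen3:8b
    api_base: http://host.docker.internal:11434
    timeout: 600
    kwargs:
      n: 1
      temperature: 0
```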
@flyrun9527 Someone just contributed an open-router config example! qwen3 is available via open-router, so I suggest you try open-router first:
https://github.com/Canner/WrenAI/blob/main/wren-ai-service/docs/config_examples/config.open_router.yaml
@flyrun9527 Someone contributed a qwen3 config example using open-router! Please take a look: https://github.com/Canner/WrenAI/blob/main/wren-ai-service/docs/config_examples/config.qwen3.yaml
You can also check the relevant documentation here: https://github.com/Canner/WrenAI/tree/main/wren-ai-service/docs/config_examples#qwen3-think-and-no_think-configuration
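For anyone who lands here later, one way to express this kind of setup is to register the same qwen3 model twice under different aliases, one used where thinking is wanted and one where it is not. The sketch below is a hypothetical outline only (the open-router model id, aliases, and field values are my assumptions); the linked config.qwen3.yaml and docs above are the authoritative reference:

```yaml
# Hypothetical sketch only -- see config.qwen3.yaml above for the real example.
type: llm
provider: litellm_llm
models:
  # Thinking variant: qwen3 emits its reasoning by default.
  - model: openrouter/qwen/qwen3-32b   # assumed open-router model id
    alias: qwen3-think
    timeout: 600
    kwargs:
      temperature: 0
  # Non-thinking variant: same model under a second alias.
  - model: openrouter/qwen/qwen3-32b
    alias: qwen3-no-think
    timeout: 600
    kwargs:
      temperature: 0
      # How thinking is actually disabled depends on how the model is served,
      # e.g. an enable_thinking chat-template kwarg on vLLM-style servers or
      # qwen3's /no_think soft switch in the prompt -- see the linked docs.
```

Qwen3 itself toggles between the two behaviours via its enable_thinking chat-template flag or the /think and /no_think soft switches in the prompt, so which knob the no-think entry uses depends on the serving provider.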
I'll close this issue now; please reopen it if you run into any problems.