How can we use Qwen3 and Alibaba's embedding models?
version:0.24.1
Maybe it's not supported yet
https://github.com/BerriAI/litellm/issues/9198
@xiatiandegaga could you upload your config.yaml? I'd like to take a look.
@xiatiandegaga since you declared `qwen3-fast` as an alias in your llm config, you should use `litellm_llm.qwen3-fast` in the pipe definitions instead of `litellm_llm.default`. You may also need to add the `openai/` prefix to the embedding model.
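To illustrate, here is a minimal sketch of what that comment suggests. The alias and the pipe reference are the point; the model id, `api_base`, and other field names follow WrenAI's example configs and are illustrative, so they may need adjusting for your version:

```yaml
# llm section: declare the model under an alias
type: llm
provider: litellm_llm
models:
  - alias: qwen3-fast                  # the alias referenced below
    model: openai/qwen3-model-name     # placeholder id; openai/ prefix = OpenAI-compatible route
    api_base: https://example.com/v1   # placeholder endpoint
    timeout: 120
---
# pipeline section: reference the alias, not litellm_llm.default
type: pipeline
pipes:
  - name: sql_generation
    llm: litellm_llm.qwen3-fast
```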
It doesn't work when I add the `openai/` prefix for the embedding model. I'll wait for a UI for configuration, like Supersonic has, lol.
@xiatiandegaga May I ask if you have tried using this configuration file? https://github.com/Canner/WrenAI/blob/main/wren-ai-service/docs/config_examples/config.qwen3.yaml
Qwen3 is the most popular model in China, with a large number of users; we need to be able to use it directly.
The embedder in config.qwen3.yaml uses an OpenAI embedding model, and in that example Qwen3 is accessed through an OpenRouter proxy; DashScope is never used directly anywhere in the sample config. So the config files in the current WrenAI version are confusing, and we don't even know how to use them... Yet the WrenAI docs explicitly state that all Qwen models are supported via DashScope... It's really hard to understand.
For dashscope models, you could try following the docs here: https://docs.litellm.ai/docs/providers/dashscope
Basically, we are simply using LiteLLM underneath.
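Since DashScope also exposes an OpenAI-compatible endpoint, another hedged sketch is to point the embedder at it directly. The `api_base` below is DashScope's documented compatible-mode URL; the surrounding field names follow WrenAI's example configs and may differ in your version:

```yaml
type: embedder
provider: litellm_llm
models:
  - alias: default
    model: openai/text-embedding-v4   # openai/ prefix routes through LiteLLM's OpenAI-compatible path
    api_base: https://dashscope.aliyuncs.com/compatible-mode/v1
    timeout: 120
```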
If you replace WrenAI's default embedding model with Qwen text-embedding-v3, an exception is thrown when you ask a question, because the Qwen text-embedding-v3 embedding model only supports 1024. Error details:

```
PRQBT:job 2 have changes, returning question count: 0, updating
sendEvent home_generate_project_recommendation_questions_failed {
  projectId: 2,
  projectType: 'MSSQL',
  status: 'FAILED',
  questions: [],
  error: {
    code: 'OTHERS',
    message: 'An error occurred during question recommendation generation: litellm.BadRequestError: DashscopeException - <400> InternalError.Algo.InvalidParameter: Range of input length should be [1, 30720]',
    shortMessage: 'Internal server error'
  }
} AI false
```
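One way to stay under an input-length limit like the one in that error is to split text into bounded chunks before embedding it. A minimal character-based sketch (a token-based splitter would track the API's actual limit more faithfully; the `max_chars` value here is arbitrary, not DashScope's limit):

```python
def chunk_text(text: str, max_chars: int = 8192) -> list[str]:
    """Split text into consecutive chunks of at most max_chars characters."""
    return [text[i:i + max_chars] for i in range(0, len(text), max_chars)]

# Each chunk can then be embedded separately and stored per chunk.
chunks = chunk_text("x" * 20000, max_chars=8192)
print([len(c) for c in chunks])  # [8192, 8192, 3616]
```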
Successfully integrated WrenAI with Alibaba Cloud's DashScope service, using the latest AI models:
LLM: qwen-plus, a powerful large language model. Embedding model: text-embedding-v4, the latest embedding model. For details, see https://github.com/lovrabet-ai/wrenai-qwen/blob/main/wrenai-dashscope-integration-guide.md
thx!
Thanks!