Can the OPENAI section also be used with Ollama?
"OPENAI_API_KEY": "sk-xxx", "OPENAI_BASE_URL": "https://api.deepseek.com", "OPENAI_MODEL": "deepseek-chat",
@musistudio I have the same problem. My current configuration is below. After enabling logging, the error is:

```
Error in OpenAI API call: {"status":400,"headers":{"content-length":"168","content-type":"application/json","date":"Mon, 23 Jun 2025 05:36:58 GMT","server":"uvicorn"}}
```

Can't the OPENAI section be pointed at my own vLLM deployment of DeepSeek?
{ "OPENAI_API_KEY": "xxxxxx", "OPENAI_BASE_URL": "http://10.10.10.10:8000/v1", "OPENAI_MODEL": "deepseek-ai/DeepSeek-R1-0528", "Providers": [ { "name": "mydeepseek", "api_base_url": "http://10.10.10.10:8000/v1", "api_key": "xxxxxxxxx", "models": ["deepseek-ai/DeepSeek-R1-0528"] } ], "Router": { "background": "mydeepseek,deepseek-ai/DeepSeek-R1-0528", "think": "mydeepseek,deepseek-ai/DeepSeek-R1-0528", "longContext": "mydeepseek,deepseek-ai/DeepSeek-R1-0528" }, "usePlugins": ["toolcall-improvement"] }