
Optimize support for the Qwen (Tongyi Qianwen) API to enable streaming (paragraph-by-paragraph answers)

Open · XSR-WatchPioneer opened this issue 1 year ago · 1 comment

I can enable the Qwen model through a third-party provider compatible with the OpenAI API, but cross-domain (CORS) must be enabled, otherwise it does not work properly.

However, once cross-domain is enabled, the model's answers can no longer be streamed; nothing is displayed until the entire answer has been generated.

Configuration for Qwen:

  • URL: fill in https://dashscope.aliyuncs.com/compatible-mode/v1.

  • Provider: select the third-party option.

  • Check the cross-domain "CORS" option.
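For reference, the configuration above corresponds to an ordinary OpenAI-compatible chat-completions request against the DashScope compatible-mode endpoint. A minimal sketch, assuming the model name `qwen-plus` and the `buildChatRequest` helper for illustration (neither is part of obsidian-copilot):

```typescript
// Sketch: build an OpenAI-compatible chat-completions request for the
// DashScope compatible-mode endpoint. buildChatRequest is illustrative,
// not an obsidian-copilot internal.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

function buildChatRequest(apiKey: string, messages: ChatMessage[], stream: boolean) {
  return {
    url: "https://dashscope.aliyuncs.com/compatible-mode/v1/chat/completions",
    init: {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: `Bearer ${apiKey}`,
      },
      // stream: true asks the server for SSE chunks instead of one JSON body.
      body: JSON.stringify({ model: "qwen-plus", messages, stream }),
    },
  };
}
```

Issuing `fetch(req.url, req.init)` from a browser-like context is what triggers the CORS requirement in the first place; Obsidian's `requestUrl` bypasses CORS, but at the cost of buffering the whole response, as explained below.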

XSR-WatchPioneer · Sep 12 '24 08:09

@XSR-WatchPioneer This cannot be resolved at this time.

See:

Say goodbye to CORS errors for both chat models and embeddings! The new model table in settings now lets you turn on "CORS" for individual chat models if you see CORS issues with them. And embedding models are immune to CORS errors by default! Caveat: this is powered by the Obsidian API's requestUrl, which does not support streaming of LLM responses. So streaming is disabled whenever you have CORS on in Copilot settings. Please upvote this feature request to let Obsidian know your need for streaming!
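The limitation comes down to transport: an OpenAI-compatible server with `stream: true` sends Server-Sent Events, one `data: {…}` line per token delta, which a fetch-based reader can render incrementally, whereas `requestUrl` only hands back the completed body. A minimal sketch of extracting the incremental text (the helper name is an assumption; the `data:` / `[DONE]` wire format is the standard OpenAI-compatible streaming shape):

```typescript
// Extract incremental content tokens from OpenAI-style SSE lines.
// Each data line looks like: data: {"choices":[{"delta":{"content":"Hi"}}]}
// and the stream ends with:  data: [DONE]
function extractDeltas(sseChunk: string): string[] {
  const tokens: string[] = [];
  for (const line of sseChunk.split("\n")) {
    const trimmed = line.trim();
    if (!trimmed.startsWith("data:")) continue; // skip blank/comment lines
    const payload = trimmed.slice(5).trim();
    if (payload === "[DONE]") break;            // end-of-stream sentinel
    const content = JSON.parse(payload).choices?.[0]?.delta?.content;
    if (content) tokens.push(content);          // render each token as it arrives
  }
  return tokens;
}
```

With fetch, these chunks arrive while the model is still generating, so the UI can append each token; with `requestUrl`, the same lines only become available after the response is complete, which is why enabling CORS in Copilot disables streaming.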

Emt-lin · Sep 12 '24 11:09