obsidian-copilot
[FR] Support kimi.ai
https://platform.moonshot.cn/console/api-keys
Another powerful online LLM.
https://platform.moonshot.cn/docs/intro#%E4%B8%BB%E8%A6%81%E6%A6%82%E5%BF%B5
It is compatible with the OpenAI API format. I think it would be better to provide an option for these OpenAI-compatible APIs; that way, there's no need to adapt each provider individually.
Provide input boxes for the API key, the model name, and the endpoint.
It could follow the approach used by the Text Generator plugin.
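To illustrate why one generic option is enough: for any OpenAI-compatible provider, only the base URL, API key, and model name change, while the chat-completions payload stays the same. A minimal sketch in Python (the Moonshot endpoint is from the links above; the API key is a placeholder and the model name `moonshot-v1-8k` is an assumed example of a provider-specific model):

```python
import json
import urllib.request


def build_chat_request(base_url: str, api_key: str, model: str,
                       prompt: str) -> urllib.request.Request:
    """Build a standard OpenAI-style chat-completions request.

    The same code works for any compatible provider; only the base URL,
    key, and model name vary.
    """
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        url=f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )


# Pointing the same code at Moonshot instead of OpenAI is just configuration:
req = build_chat_request(
    "https://api.moonshot.cn/v1",  # provider endpoint
    "sk-...",                      # placeholder API key
    "moonshot-v1-8k",              # assumed provider-specific model name
    "Hello",
)
# urllib.request.urlopen(req) would send it; omitted here.
```

With input boxes for those three values, the plugin would not need per-provider adapters at all.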
Have you tried the endpoint override? Does it fail with CORS as well?
No, it works normally in Text Generator. Perhaps it's a network issue?
I had a CORS problem until I set up a proxy server.
Access to fetch at 'https://api.moonshot.cn/v1/chat/completions' from origin 'app://obsidian.md' has been blocked by CORS policy: Response to preflight request doesn't pass access control check: No 'Access-Control-Allow-Origin' header is present on the requested resource. If an opaque response serves your needs, set the request's mode to 'no-cors' to fetch the resource with CORS disabled.
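That preflight failure is exactly what a local proxy works around: the app talks to localhost, and the proxy forwards the request to api.moonshot.cn while adding the missing Access-Control-Allow-Origin header. A rough stdlib-only sketch (the port, and answering preflights locally, are my choices here, not necessarily how any particular proxy setup does it):

```python
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

TARGET = "https://api.moonshot.cn"  # upstream API, from the error above


def cors_headers(origin: str = "*") -> dict:
    """Headers that make the browser's CORS check pass."""
    return {
        "Access-Control-Allow-Origin": origin,
        "Access-Control-Allow-Methods": "GET, POST, OPTIONS",
        "Access-Control-Allow-Headers": "Authorization, Content-Type",
    }


class ProxyHandler(BaseHTTPRequestHandler):
    def do_OPTIONS(self):
        # Answer the preflight request locally instead of forwarding it.
        self.send_response(204)
        for k, v in cors_headers().items():
            self.send_header(k, v)
        self.end_headers()

    def do_POST(self):
        # Forward the body and auth header upstream, then relay the
        # response back with CORS headers attached.
        body = self.rfile.read(int(self.headers.get("Content-Length", 0)))
        req = urllib.request.Request(
            TARGET + self.path,
            data=body,
            headers={
                "Content-Type": self.headers.get("Content-Type",
                                                 "application/json"),
                "Authorization": self.headers.get("Authorization", ""),
            },
            method="POST",
        )
        with urllib.request.urlopen(req) as upstream:
            self.send_response(upstream.status)
            for k, v in cors_headers().items():
                self.send_header(k, v)
            self.end_headers()
            self.wfile.write(upstream.read())


# To run: HTTPServer(("localhost", 8787), ProxyHandler).serve_forever()
# then point the plugin's endpoint override at http://localhost:8787/v1
```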
Additionally, it would be great to allow users to save multiple custom models.
Any OpenAI replacement should work without CORS errors with the new local proxy setting from https://github.com/logancyang/obsidian-copilot/pull/495
This does work, but custom model names cannot be used.
@wwjCMP this field doesn't work for you? When you fill this field, you must pick any OpenAI model in the dropdown as a placeholder.
If it still doesn't work, please open an issue and show how it fails, I can take a look.
In fact, I can make it work by entering the name of an OpenAI model, such as GPT-4. That way I can use a reverse proxy to reach third-party services compatible with OpenAI. However, I cannot specify the model provided by the third-party service, such as kimi-v2.
OK, I can now use a custom model name by changing this setting at the same time. However, only one third-party model can be added.

