
[FR] Support kimi.ai

Open · wwjCMP opened this issue 1 year ago • 11 comments

https://platform.moonshot.cn/console/api-keys

Another powerful online LLM.

wwjCMP · Mar 25 '24

https://platform.moonshot.cn/docs/intro#%E4%B8%BB%E8%A6%81%E6%A6%82%E5%BF%B5

wwjCMP · Mar 25 '24

It is compatible with the OpenAI API format. I think it would be better to provide a generic option for these OpenAI-compatible APIs, so there is no need to adapt each provider individually.

Provide an API key input box, a model name field, and an endpoint override box.

This could follow the approach of the Text Generator plugin.

wwjCMP · Mar 25 '24
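
For reference, Moonshot exposes the same chat/completions request shape as OpenAI, so the generic option described above only needs a configurable base URL, API key, and model name. A minimal sketch, assuming a Node-style fetch and a placeholder MOONSHOT_API_KEY environment variable (the model id has to come from the provider's own model list):

```typescript
// Minimal sketch of calling an OpenAI-compatible endpoint directly.
// MOONSHOT_API_KEY is a placeholder environment variable; the model id is
// passed through as a plain string, so any id the provider lists will do.
async function chatCompletion(baseUrl: string, model: string, prompt: string): Promise<string> {
  const response = await fetch(`${baseUrl}/chat/completions`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.MOONSHOT_API_KEY}`,
    },
    body: JSON.stringify({
      model,
      messages: [{ role: "user", content: prompt }],
    }),
  });
  if (!response.ok) throw new Error(`HTTP ${response.status}: ${await response.text()}`);
  const data = await response.json();
  return data.choices[0].message.content;
}

// Example (hypothetical values): chatCompletion("https://api.moonshot.cn/v1", "kimi-v2", "Hello");
```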

Have you tried the endpoint override? Does it fail with CORS as well?

logancyang · Mar 26 '24

Have you tried the endpoint override? Does it fail with CORS as well?

No, it works normally in the Text Generator plugin. Perhaps it's due to network issues?

wwjCMP · Mar 27 '24

Have you tried the endpoint override? Does it fail with CORS as well?

No, it works normally in the Text Generator plugin. Perhaps it's due to network issues?

I had a CORS problem until I set up a proxy server.

Access to fetch at 'https://api.moonshot.cn/v1/chat/completions' from origin 'app://obsidian.md' has been blocked by CORS policy: Response to preflight request doesn't pass access control check: No 'Access-Control-Allow-Origin' header is present on the requested resource. If an opaque response serves your needs, set the request's mode to 'no-cors' to fetch the resource with CORS disabled.

BaoLiqi · Mar 27 '24
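
The preflight fails because api.moonshot.cn does not return an Access-Control-Allow-Origin header for the app://obsidian.md origin, so the renderer blocks the request before it is sent. The proxy workaround mentioned above amounts to something like the following sketch (the port, header list, and forwarding logic are illustrative assumptions, not the plugin's actual proxy):

```typescript
// Illustrative local CORS proxy (not the plugin's implementation).
// It answers OPTIONS preflights itself and forwards other requests upstream.
import http from "node:http";

const UPSTREAM = "https://api.moonshot.cn"; // target API host
const PORT = 8787;                          // arbitrary local port

http
  .createServer(async (req, res) => {
    // CORS headers so the app://obsidian.md origin is accepted.
    res.setHeader("Access-Control-Allow-Origin", "*");
    res.setHeader("Access-Control-Allow-Headers", "Content-Type, Authorization");
    res.setHeader("Access-Control-Allow-Methods", "GET, POST, OPTIONS");
    if (req.method === "OPTIONS") {
      res.writeHead(204).end();
      return;
    }

    // Buffer the request body and relay it upstream with the original auth header.
    const chunks: Buffer[] = [];
    for await (const chunk of req) chunks.push(chunk as Buffer);
    const upstream = await fetch(UPSTREAM + (req.url ?? "/"), {
      method: req.method,
      headers: {
        "Content-Type": req.headers["content-type"] ?? "application/json",
        Authorization: req.headers["authorization"] ?? "",
      },
      body: chunks.length > 0 ? Buffer.concat(chunks) : undefined,
    });
    res.writeHead(upstream.status, { "Content-Type": "application/json" });
    res.end(Buffer.from(await upstream.arrayBuffer()));
  })
  .listen(PORT);
```

With something like this running, the endpoint override would point at http://localhost:8787/v1, so from Obsidian's point of view the request never crosses origins.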

Additionally, it would be great to allow users to save multiple custom models.

wwjCMP · Mar 30 '24

Any OpenAI replacement should work without CORS errors with the new local proxy setting from https://github.com/logancyang/obsidian-copilot/pull/495

logancyang · Aug 09 '24

Any OpenAI replacement should work without CORS errors with the new local proxy setting from #495

This does work, but custom model names cannot be used.

wwjCMP · Aug 15 '24

@wwjCMP this field doesn't work for you? When you fill this field, you must pick any OpenAI model in the dropdown as a placeholder. (screenshot: SCR-20240814-taqm)

If it still doesn't work, please open an issue and show how it fails, and I can take a look.

logancyang · Aug 15 '24

@wwjCMP this field doesn't work for you? When you fill this field, you must pick any OpenAI model in the dropdown as a placeholder. (screenshot: SCR-20240814-taqm)

If it still doesn't work, please open an issue and show how it fails, and I can take a look.

In fact, I can make it work by entering the name of an OpenAI model, such as GPT-4. That way I can use a reverse proxy to reach third-party services that are OpenAI-compatible. However, I cannot specify the model offered by the third-party service, for example setting the model to kimi-v2.

wwjCMP · Aug 15 '24
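
For context, the request body simply carries the model name as an opaque string, so once a free-text model field is exposed alongside the endpoint override, a third-party id passes straight through. A sketch using the openai SDK against an OpenAI-compatible base URL (the key variable and the kimi-v2 id are only the examples from this thread, not verified model names):

```typescript
// Sketch: the model field is forwarded verbatim, so any id accepted by the
// OpenAI-compatible provider can be used. Values below are placeholders.
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "https://api.moonshot.cn/v1",      // or a local/reverse proxy endpoint
  apiKey: process.env.MOONSHOT_API_KEY ?? "", // placeholder env var
});

const completion = await client.chat.completions.create({
  model: "kimi-v2", // custom model name as mentioned above; use the provider's actual id
  messages: [{ role: "user", content: "Hello from Obsidian Copilot" }],
});
console.log(completion.choices[0].message.content);
```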

@wwjCMP this field doesn't work for you? When you fill this field, you must pick any OpenAI model in the dropdown as a placeholder. (screenshot: SCR-20240814-taqm)

If it still doesn't work, please open an issue and show how it fails, and I can take a look.

OK, now I can use a custom model name by changing this at the same time. But only one third-party model can be added.

wwjCMP · Aug 15 '24