Implement Mistral Platform

Open bfoujols opened this issue 1 year ago • 14 comments

Is it possible to add a new "Mistral Platform" as a separate option in the models dropdown? Could the actual model be specified in the settings, with the 3 endpoints (tiny, small, medium)?

Website docs: https://console.mistral.ai

bfoujols avatar Jan 17 '24 15:01 bfoujols

@bfoujols I'm also looking at it, just got access to the medium model myself. Added to the roadmap!

logancyang avatar Jan 17 '24 20:01 logancyang

Thanks Logan, I'm keen for this too. I believe that because Mistral implements the OpenAI API format, you can just straight up replace 'https://api.openai.com/v1' with 'https://api.mistral.ai/v1' and everything should work, from listing models to getting completions. Thanks again,
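The swap described above is just a base-URL change: the request shape is identical for both providers. A minimal sketch of what an OpenAI-style chat request looks like against either host (the `buildChatRequest` helper is hypothetical, not part of obsidian-copilot):

```typescript
// Build an OpenAI-style chat completion request against a swappable base URL.
// `buildChatRequest` is a hypothetical illustration, not plugin code.
interface ChatRequest {
  url: string;
  init: { method: string; headers: Record<string, string>; body: string };
}

function buildChatRequest(
  baseUrl: string,
  apiKey: string,
  model: string,
  prompt: string
): ChatRequest {
  return {
    // Same path an OpenAI client would hit, just a different host.
    url: `${baseUrl.replace(/\/$/, "")}/chat/completions`,
    init: {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: `Bearer ${apiKey}`,
      },
      body: JSON.stringify({
        model,
        messages: [{ role: "user", content: prompt }],
      }),
    },
  };
}

// Swapping https://api.openai.com/v1 for https://api.mistral.ai/v1:
const req = buildChatRequest(
  "https://api.mistral.ai/v1",
  "YOUR_KEY",
  "mistral-small-latest",
  "Hello"
);
console.log(req.url); // https://api.mistral.ai/v1/chat/completions
```

In theory only the host differs; as the comments below show, CORS is where this breaks down in practice inside Obsidian.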

peterlionelnewman avatar Mar 10 '24 13:03 peterlionelnewman

@peterlionelnewman yes, so in theory we can just put the Mistral API key in the OpenAI key field and do this override in the advanced settings SCR-20240310-nmsg

But then it hits a CORS issue SCR-20240310-nnbv

Claude 3 also has this issue, and I used a local proxy server to get around it. Will add Mistral this way soon too.
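The local-proxy workaround mentioned above boils down to a small server that answers the browser's CORS preflight itself and forwards everything else upstream. A hedged sketch of the idea (hypothetical code, not the plugin's actual proxy; the upstream URL is an assumption):

```typescript
// Tiny local CORS proxy sketch: answer OPTIONS preflights locally with
// permissive headers, forward real requests upstream. Hypothetical, for
// illustration only. Requires Node 18+ for the global fetch.
import * as http from "node:http";

const UPSTREAM = "https://api.mistral.ai"; // assumed upstream host

function corsHeaders(): Record<string, string> {
  return {
    "Access-Control-Allow-Origin": "*",
    "Access-Control-Allow-Headers": "Authorization, Content-Type",
    "Access-Control-Allow-Methods": "GET, POST, OPTIONS",
  };
}

const server = http.createServer(async (req, res) => {
  if (req.method === "OPTIONS") {
    // Satisfy the browser's preflight without touching the upstream API.
    res.writeHead(204, corsHeaders());
    res.end();
    return;
  }
  // Buffer the incoming body and relay the request upstream.
  const chunks: Buffer[] = [];
  for await (const chunk of req) chunks.push(chunk as Buffer);
  const upstream = await fetch(UPSTREAM + (req.url ?? "/"), {
    method: req.method,
    headers: {
      "Content-Type": "application/json",
      Authorization: String(req.headers.authorization ?? ""),
    },
    body: chunks.length ? Buffer.concat(chunks) : undefined,
  });
  res.writeHead(upstream.status, {
    ...corsHeaders(),
    "Content-Type": "application/json",
  });
  res.end(await upstream.text());
});

// server.listen(8787); // then point the client at http://localhost:8787/v1
```

The key point is that the preflight never reaches the upstream API: the proxy replies with `Access-Control-Allow-Origin: *` itself, which is what the Electron renderer inside Obsidian needs to see.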

logancyang avatar Mar 10 '24 22:03 logancyang

On Mac devices, the command below works for me:

OLLAMA_ORIGINS="*" ollama serve

zhuhaoxlj avatar May 17 '24 01:05 zhuhaoxlj

Bump!

knuurr avatar Oct 24 '24 00:10 knuurr

@knuurr have you tried adding Mistral as a custom model? https://www.obsidiancopilot.com/en/docs/settings#adding-custom-models

logancyang avatar Oct 24 '24 05:10 logancyang

@logancyang actually I did test it just now.

My requests fail with a 422 error.

This is the setup I'm using. I filled in the fields for illustration purposes, but the model is set up this way. [image]

This is the chat window: [image]

This is from Network Tab. I think it's ok to ignore vector errors as I disable vault RAG.

obraz

I tested some other configurations, using either the base URL or the full URL like /v1/chat/completions, but I can't get it working.

Not to mention I use another desktop chat app, Chatbox, for other purposes, where I also created a custom model (Mistral), and everything seems to work there.

[image]

Do you maybe have any clue? Obsidian integration would be great for my use case.

knuurr avatar Oct 26 '24 23:10 knuurr

+1

wwjCMP avatar Nov 01 '24 00:11 wwjCMP

I can repro; even with CORS on it doesn't work. Is Mistral's API not OpenAI compatible as they claim? I've been using Mistral models via OpenRouter, though.

logancyang avatar Nov 01 '24 01:11 logancyang

Please add the Mistral-large model option.

ivankrdenas avatar Nov 20 '24 14:11 ivankrdenas

Confirming I tested Mistral today as an OpenAI-compatible model, trying combos of endpoints:
https://api.mistral.ai/v1/chat/completions
https://api.mistral.ai/v1/chat
https://api.mistral.ai/v1

Model name: mistral-large-latest

Results in "Error: Connection error." in chat. Developer Tools > Network says:

Screen Shot 2024-12-14 at 8 06 43 PM

Then I thought perhaps since I'm testing with FREE models, they are limited. All model names: https://docs.mistral.ai/getting-started/models/models_overview/ The free model listed is pixtral-12b-2409. This page mentions the correct endpoint for that free model: https://mistral.ai/news/pixtral-12b/#la-plateforme

Tried with CORS set to "true" and "false"...

Still no luck... ☹️ Going to post in the Mistral Discord
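The three URLs tried above differ only in how much of the path they include; an OpenAI-style client usually expects the bare /v1 base URL and appends /chat/completions itself, so the deeper variants end up doubling the path. A small normalizer sketch (hypothetical helper, not plugin code) that reduces any of the tried variants to the base URL:

```typescript
// Reduce any of the commonly-tried endpoint variants to the bare /v1 base
// URL that an OpenAI-style client expects. Hypothetical helper for
// illustration only.
function normalizeBaseUrl(url: string): string {
  return url
    .replace(/\/chat\/completions\/?$/, "") // full endpoint pasted in
    .replace(/\/chat\/?$/, "")              // partial path pasted in
    .replace(/\/$/, "");                    // stray trailing slash
}

console.log(normalizeBaseUrl("https://api.mistral.ai/v1/chat/completions"));
// https://api.mistral.ai/v1
```

If the client then appends /chat/completions itself, all three variants resolve to the same request; this rules out the URL as the variable when debugging the connection error.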

jhmonroe avatar Dec 15 '24 02:12 jhmonroe

@jhmonroe we are using the LangChain OpenAI client and it's not working with Mistral out of the box.

But this PR introduced the Mistral client from LangChain, which should work (?): https://github.com/logancyang/obsidian-copilot/pull/841/files Just waiting for the author to update it so it can be merged.

logancyang avatar Dec 15 '24 02:12 logancyang

Exciting! Will stay tuned. We are at a very exciting moment... this week I have been experimenting with Cursor and Cline (for VS Code)... amazing how much they can do now with agentic abilities. It will be crazy when agents are in Obsidian 🤯

jhmonroe avatar Dec 15 '24 04:12 jhmonroe

Being able to use the embedding model would be nice too.
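For reference, Mistral's embeddings endpoint follows the same OpenAI-style shape as chat, just with a different path and payload. A hedged sketch of the request shape (the helper is hypothetical; mistral-embed is the embedding model name per Mistral's docs):

```typescript
// Build an OpenAI-style embeddings request against Mistral's endpoint.
// Hypothetical illustration, not plugin code.
function buildEmbeddingsRequest(baseUrl: string, apiKey: string, texts: string[]) {
  return {
    url: `${baseUrl.replace(/\/$/, "")}/embeddings`,
    init: {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: `Bearer ${apiKey}`,
      },
      // Assumed payload shape: embedding model name plus an array of inputs.
      body: JSON.stringify({ model: "mistral-embed", input: texts }),
    },
  };
}

const embReq = buildEmbeddingsRequest(
  "https://api.mistral.ai/v1",
  "YOUR_KEY",
  ["note one", "note two"]
);
console.log(embReq.url); // https://api.mistral.ai/v1/embeddings
```

So supporting it in the plugin would presumably mean the same base-URL plumbing as chat, with /embeddings as the path and the vault-indexing code consuming the returned vectors.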

douglaslassance avatar Apr 12 '25 17:04 douglaslassance