Logan Yang


@bfoujols I'm also looking at it, just got access to the medium model myself. Added to the roadmap!

@peterlionelnewman yes, in theory you can just put the Mistral API key in the OpenAI key field and set this override in the advanced settings. But then it hits a CORS issue...

@knuurr have you tried adding Mistral as a custom model? https://www.obsidiancopilot.com/en/docs/settings#adding-custom-models

I can repro; even with the CORS setting enabled it doesn't work. Is Mistral's API not OpenAI-compatible as they claim? I've been using Mistral models via OpenRouter though.

@jhmonroe we are using the LangChain OpenAI client and it doesn't work with Mistral out of the box. But this PR introduced the Mistral client from LangChain, which should work (?) https://github.com/logancyang/obsidian-copilot/pull/841/files Just...
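
For reference, here's a rough sketch of the two approaches in LangChain JS. The package names are real (`@langchain/openai`, `@langchain/mistralai`), but the model names, env vars, and base URL are placeholders for illustration, not the plugin's actual config:

```ts
// Sketch only: comparing the two LangChain JS clients discussed above.
import { ChatOpenAI } from "@langchain/openai";
import { ChatMistralAI } from "@langchain/mistralai";

// Option A: reuse the OpenAI-compatible client and point it at Mistral's
// endpoint via a base URL override (what the "OpenAI proxy base URL" setting does).
const viaOpenAIClient = new ChatOpenAI({
  model: "mistral-small-latest",          // placeholder model name
  apiKey: process.env.MISTRAL_API_KEY,    // Mistral key in the OpenAI key field
  configuration: { baseURL: "https://api.mistral.ai/v1" },
});

// Option B: the dedicated Mistral client from LangChain, as introduced in the PR.
const viaMistralClient = new ChatMistralAI({
  model: "mistral-small-latest",          // placeholder model name
  apiKey: process.env.MISTRAL_API_KEY,
});

// Both expose the same chat interface:
const res = await viaMistralClient.invoke("Say hello in one sentence.");
console.log(res.content);
```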

@marcusbai no need to do that manually, the OpenAI proxy base URL in the advanced settings does it for you.

Without a screenshot of the note and of the console with debug mode on, it's hard to test on my side. Could you provide the screenshots? LM Studio server mode shouldn't depend...

It's partly because 1. the model is not tuned for RAG, and 2. you forgot to set a big context window explicitly in Ollama (check the local Copilot guide). Since this...
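
As a rough illustration (not the plugin's code), the context window can be raised per request by passing `num_ctx` in `options` to Ollama's chat API; the endpoint, model name, and 32768 value below are assumptions for a typical local setup:

```ts
// Sketch only: raising the Ollama context window per request with num_ctx.
const response = await fetch("http://localhost:11434/api/chat", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    model: "llama3.1",                    // placeholder local model
    messages: [{ role: "user", content: "Summarize my notes on RAG." }],
    stream: false,
    options: { num_ctx: 32768 },          // without this, Ollama falls back to a small default window
  }),
});
const data = await response.json();
console.log(data.message?.content);
```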

Yes, this has been on my roadmap for a while. I'll prioritize it after I ship the new modes.