please add mistral api provider
Mistral AI has its own API, which is very similar to OpenAI's. It would be very convenient to be able to connect to their API. https://docs.mistral.ai/api/
How can I help?
@dailydaniel I'd be happy to take this on - will provide updates here 👍
@dailydaniel I think Mistral's API is OpenAI API compatible, so you should be able to use the OpenAI provider but with your Mistral details:
export OPENAI_API_KEY="your-mistral-api-key"
export OPENAI_API_BASE="https://api.mistral.ai/v1"
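If you want to sanity-check the "OpenAI-compatible" claim without any provider code, here's a minimal sketch that builds (but does not send) an OpenAI-style chat-completions request aimed at Mistral's endpoint. The model name `mistral-large-2411` is the one mentioned later in this thread; the payload fields follow the OpenAI chat-completions schema, which is the compatibility in question:

```python
# Sketch: construct an OpenAI-style chat-completions request against
# Mistral's API base, using only the standard library. Sending it with
# a real key (urllib.request.urlopen(req)) would test compatibility.
import json
import os
import urllib.request

MISTRAL_BASE = "https://api.mistral.ai/v1"
api_key = os.environ.get("OPENAI_API_KEY", "your-mistral-api-key")

# Same request body shape the OpenAI provider would produce.
payload = {
    "model": "mistral-large-2411",
    "messages": [{"role": "user", "content": "Hello"}],
}

req = urllib.request.Request(
    f"{MISTRAL_BASE}/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    },
    method="POST",
)

print(req.full_url)  # https://api.mistral.ai/v1/chat/completions
```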
@colesmcintosh Thanks
As far as I can tell, the Mistral API is not compatible with OpenAI's: they have their own SDK for Python, and their API does not work through the OpenAI library (at least it didn't a couple of months ago).
@dailydaniel I just tried it and can confirm it is not compatible... will try to create a PR to add Mistral as a provider
@colesmcintosh do you know by any chance what the blockers are exactly?
I'm using a couple of tools that rely on the /v1/chat/completions endpoint and use the OpenAI provider code with just a Mistral AI key and the host substituted to https://api.mistral.ai. They work perfectly fine.
Is there any goose code that invalidates such compatibility?
Hmm. I just went ahead and did this:
$ export OPENAI_HOST="https://api.mistral.ai"
$ export OPENAI_API_KEY="…"
$ goose configure # default OPENAI_BASE_PATH set to "v1/chat/completions", model "mistral-large-2411"
And it worked. I checked config without env vars, also worked.
But yes, for clarity it would be good to have a separate Mistral AI entry, given how great their models are.
Yes, what @vaygr described is the best bet for support! More docs at: https://block.github.io/goose/docs/getting-started/providers#using-custom-openai-endpoints