
Add preliminary MistralAI support

Open iMilnb opened this issue 9 months ago • 3 comments

This small patch provides the ability to use another API endpoint declared in an environment variable, and adds Mistral models.

Here is the python example from MistralAI: https://github.com/mistralai/client-python/blob/0d55818364fd2a7ca67401dd550632f8ca70e5af/examples/chatbot_with_streaming.py#L15

Successfully tested with MistralAI.
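
For context, a minimal sketch of the idea using go-openai's existing client configuration. The environment variable names below are illustrative only, not necessarily what the patch introduces:

```go
package main

import (
	"os"

	openai "github.com/sashabaranov/go-openai"
)

func main() {
	// NOTE: these env var names are hypothetical, not part of the library or the patch.
	cfg := openai.DefaultConfig(os.Getenv("MISTRAL_API_KEY"))
	if base := os.Getenv("OPENAI_API_BASE"); base != "" {
		cfg.BaseURL = base // e.g. the MistralAI endpoint instead of api.openai.com
	}
	client := openai.NewClientWithConfig(cfg)
	_ = client // use the client as usual; only the endpoint and model names differ
}
```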

iMilnb avatar May 02 '24 12:05 iMilnb

Codecov Report

Attention: Patch coverage is 0% with 2 lines in your changes missing coverage. Please review.

Project coverage is 98.51%. Comparing base (774fc9d) to head (26912c2). Report is 15 commits behind head on master.

Files       Patch %   Lines
client.go   0.00%     1 Missing and 1 partial :warning:
Additional details and impacted files
@@            Coverage Diff             @@
##           master     #732      +/-   ##
==========================================
+ Coverage   98.46%   98.51%   +0.05%     
==========================================
  Files          24       24              
  Lines        1364     1142     -222     
==========================================
- Hits         1343     1125     -218     
+ Misses         15       10       -5     
- Partials        6        7       +1     

:umbrella: View full report in Codecov by Sentry.

codecov[bot] avatar May 02 '24 12:05 codecov[bot]

I think this PR signals a mature direction to start thinking about moving this project in.

kmesiab avatar May 19 '24 17:05 kmesiab

I think this PR signals a mature direction to start thinking about moving this project in.

Actually, I think the model declaration needs to be revamped. go-openai would certainly work with Ollama just like Mistral, but adding models to the completion.go file is not sustainable. The const list should be extendable via an environment variable or a config file.
I can work on this if you agree with the idea.
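
A rough sketch of what such an extension could look like, assuming a comma-separated environment variable (the variable name GO_OPENAI_EXTRA_MODELS is hypothetical, not part of the library):

```go
package main

import (
	"fmt"
	"os"
	"strings"
)

// extraModels reads additional model names from a comma-separated
// environment variable, e.g. "mistral-small-latest,mistral-large-latest".
// The variable name is hypothetical.
func extraModels() []string {
	raw := os.Getenv("GO_OPENAI_EXTRA_MODELS")
	if raw == "" {
		return nil
	}
	var models []string
	for _, m := range strings.Split(raw, ",") {
		if m = strings.TrimSpace(m); m != "" {
			models = append(models, m)
		}
	}
	return models
}

func main() {
	fmt.Println(extraModels())
}
```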

iMilnb avatar May 20 '24 09:05 iMilnb

@iMilnb Thank you for the PR! We're definitely not merging that, as an implicit dependency on an environment variable is not what we want. (And I'm getting strong xz vibes from this PR, tbh.)

Regarding the list of consts: there is no need to rely on it, the Model parameter is just a string, and you can pass any model name you want.
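
For example (a sketch; the API key and model name are placeholders), any model identifier the endpoint accepts can be passed directly, with no matching constant in completion.go:

```go
package main

import (
	"context"
	"fmt"
	"log"

	openai "github.com/sashabaranov/go-openai"
)

func main() {
	// Placeholder API key; point the client at whichever endpoint you need.
	client := openai.NewClient("your-api-key")

	resp, err := client.CreateChatCompletion(context.Background(), openai.ChatCompletionRequest{
		// Model is a plain string, so any model name works here.
		Model: "mistral-small-latest",
		Messages: []openai.ChatCompletionMessage{
			{Role: openai.ChatMessageRoleUser, Content: "Hello!"},
		},
	})
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(resp.Choices[0].Message.Content)
}
```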

sashabaranov avatar May 20 '24 13:05 sashabaranov

I think this PR signals a mature direction to start thinking about moving this project in.

Yes, let me clarify. Expanding the library to support a growing number of compatible models is a solid direction for a library that is becoming mature.

The actual implementation should be carefully considered, since this is a foundational piece of the library's architecture, but the direction itself is promising.

However, I do echo @sashabaranov's sentiment that the currently suggested implementation is less than ideal.

kevin-mesiab avatar May 20 '24 15:05 kevin-mesiab

Regarding the list of consts: there is no need to rely on it.

Just trying to understand the design philosophy here. Is the goal to prevent the constants list from being polluted with non-OpenAI models?

kevin-mesiab avatar May 20 '24 15:05 kevin-mesiab