go-openai
Add preliminary MistralAI support
This small patch provides the ability to use another API endpoint declared in an environment variable, and adds Mistral models.
Here is the python example from MistralAI: https://github.com/mistralai/client-python/blob/0d55818364fd2a7ca67401dd550632f8ca70e5af/examples/chatbot_with_streaming.py#L15
Successfully tested with MistralAI.
Codecov Report
Attention: Patch coverage is 0% with 2 lines in your changes missing coverage. Please review.
Project coverage is 98.51%. Comparing base (774fc9d) to head (26912c2). Report is 15 commits behind head on master.
| Files | Patch % | Lines |
|---|---|---|
| client.go | 0.00% | 1 Missing and 1 partial :warning: |
Additional details and impacted files
@@            Coverage Diff             @@
##           master     #732      +/-   ##
==========================================
+ Coverage   98.46%   98.51%   +0.05%
==========================================
  Files          24       24
  Lines        1364     1142     -222
==========================================
- Hits         1343     1125     -218
+ Misses         15       10       -5
- Partials        6        7       +1
:umbrella: View full report in Codecov by Sentry.
I think this PR signals a mature direction to start thinking about moving this project forward.
Actually, I think the model declaration needs to be revamped. go-openai would certainly work with Ollama just like Mistral, but adding models in the `completion.go` file is not sustainable. The `const` list should be extendable via an environment variable or a config file. I can work on this if you agree with the idea.
@iMilnb Thank you for the PR! We're definitely not merging that, as implicit dependency on the environment variable is not what we want. (And I'm getting strong xz vibes from this PR tbh).
Regarding the list of consts: there is no need to rely on it. The `Model` parameter is just a string, and you can pass any model name you want.
> I think this PR signals a mature direction to start thinking about moving this project forward.
Yes, let me clarify. Expanding the library to support a growing number of compatible models is a solid direction for a library that is becoming mature. The actual implementation should be carefully considered, since this is a foundational piece of the library's architecture, but it is a promising line of thought.
However, I do echo @sashabaranov's sentiment that the currently suggested implementation is less than ideal.
> Regarding the list of consts: there is no need to rely on it.
Just trying to understand the design philosophy here. Is the goal to prevent the constants list from being polluted with non-OpenAI models?