Enable OpenAI and other compatible APIs
Two changes:
- Added the `OPENAI_API_BASE_URL` environment variable to client setup to enable OpenAI-compatible endpoints.
- Added a `strip()` call to the OpenAI key; some weirdness with Jupyter notebooks requires this.
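Roughly, the two changes amount to something like the sketch below (a minimal illustration; `resolve_openai_config` and the fallback URL are hypothetical names, not the actual code in this PR):

```python
import os

def resolve_openai_config(env):
    """Hypothetical sketch of the client-setup change: take the base URL
    from OPENAI_API_BASE_URL (falling back to the official endpoint) and
    strip the key, since Jupyter notebooks can leave whitespace on it."""
    base_url = env.get("OPENAI_API_BASE_URL", "https://api.openai.com/v1/")
    api_key = env.get("OPENAI_API_KEY", "").strip()
    return base_url, api_key
```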
Tested with Ollama and llama3.
Don't merge this yet. Still debugging
Seems to be working now. It looks like llama3 has some hard-coded output formatting, so continuing from a "```python" prefix in the prompt doesn't work. I added a regex to extract the code from markdown code fences ("```") instead.
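The extraction could look roughly like this (a sketch; `CODE_BLOCK_RE` and `extract_code` are illustrative names, not necessarily what the PR uses):

```python
import re

# Pull out the first fenced code block, with or without a "python" tag,
# since llama3 tends to wrap its answer in markdown fences even when the
# prompt ends with "```python".
CODE_BLOCK_RE = re.compile(r"```(?:python)?\s*\n(.*?)```", re.DOTALL)

def extract_code(completion: str) -> str:
    match = CODE_BLOCK_RE.search(completion)
    # Fall back to the raw text when the model emitted no fences at all.
    return match.group(1).strip() if match else completion.strip()
```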
Seems to be working on my end but would appreciate a second pair of eyes.
Set `OPENAI_API_BASE_URL` to http://localhost:11434/v1/ to point to Ollama and use llama3 as both the instruction and response model. Alternatively, you can try Groq or something else.
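For example, the environment could be set up like this before launching (assuming Ollama's OpenAI-compatible endpoint on its default port; the placeholder key is arbitrary, since Ollama typically ignores it but the client still expects one to be set):

```python
import os

# Point the client at a local Ollama server instead of api.openai.com.
os.environ["OPENAI_API_BASE_URL"] = "http://localhost:11434/v1/"
# Dummy key: Ollama doesn't validate it, but the OpenAI client requires one.
os.environ["OPENAI_API_KEY"] = "ollama"
```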
Note that the client object is not per model type, so there is no way to mix open models with OpenAI models.
Thanks @rapatel0. Could you take a look at some of my comments in the code?
I can't see anything in the review changes on GitHub. Where are the comments?
@rapatel0 Just tagged you.
I might be being a moron here, but I still don't see anything.
Hmm, that's odd! I'm not sure why it's not showing up. I'll just write it here:
- On line 304: could you (1) add a space between `elif` and the string, and (2) use double quotes? Sorry for the nit.
- On line 316: I'm a little hesitant to start adding too much regex to the repo, which is why I've been dragging my feet on https://github.com/handrew/browserpilot/pull/8, though I could be convinced. Could you help motivate why we need this for other models?