Custom URLs for OpenAI API and ollama servers
Added the ability to use a custom LLM with the OpenAI library by passing a base URL. Also added an option to configure the Ollama server when it is not running locally.
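The change described above boils down to letting a configured endpoint override the default one. A minimal sketch of that pattern (the helper name `resolve_base_url` and the config keys are illustrative, not from this PR; the official OpenAI Python client does accept a `base_url` keyword):

```python
# Fall back to the public endpoints when no custom URL is configured.
OPENAI_DEFAULT_BASE_URL = "https://api.openai.com/v1"
OLLAMA_DEFAULT_HOST = "http://localhost:11434"

def resolve_base_url(config_value=None, default=OPENAI_DEFAULT_BASE_URL):
    """Return the configured endpoint if set, otherwise the default."""
    return config_value or default

# The OpenAI client can then be pointed at any compatible server, e.g.:
# from openai import OpenAI
# client = OpenAI(api_key=key,
#                 base_url=resolve_base_url(cfg.get("OPENAI_BASE_URL")))
```

With no value configured the default endpoint is used, so existing setups keep working unchanged.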
Hey @d-popov, this looks good to me! Would you be able to resolve the merge conflicts? I'll merge this once they're resolved. Thanks! ❤️ 🙌
#24
I think this needs an update to the documentation; please consider adding instructions to the README.
for review
Can you revert the Ollama file and update the openai file with just one parameter? Also fetch the latest commits.
@ARajgor I see the functionality is already on master. File reverted. I'm sorry, I don't understand your comment on openai_client.py.
Take the current openai_client file and just add a base_url parameter.
@ARajgor done
Also, revert the Ollama file; the current file works perfectly. And add the OpenAI URL parameter to the sample config file.
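For the sample config change requested above, the new entry would presumably sit next to the existing Ollama endpoint. A hypothetical sketch (key names and section header are assumptions, not taken from this PR):

```toml
[API_ENDPOINTS]
# Custom OpenAI-compatible endpoint; leave as the default for api.openai.com.
OPENAI = "https://api.openai.com/v1"
# Ollama server address, overridable when it is not running locally.
OLLAMA = "http://127.0.0.1:11434"
```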
oh, well...