
Custom URLs for OpenAI API and ollama servers

Open · d-popov opened this pull request 1 year ago · 6 comments

Added the possibility to use a custom LLM with the OpenAI library by passing a baseURL. Also added an option to configure the Ollama server when it is not running locally.

d-popov · Mar 27 '24

Hey @d-popov, this looks good to me. Would you be able to resolve the merge conflicts? I'll merge this after they're resolved. Thanks! ❤️ 🙌

mufeedvh · Mar 28 '24

#24

ARajgor · Mar 31 '24

I think this needs an update to the documentation; please consider adding instructions to the README.

Israel-Laguan · Apr 01 '24

Ready for review.

d-popov · Apr 02 '24

Can you revert the Ollama file and update the OpenAI file with just one parameter? Also fetch the latest commits.

ARajgor · Apr 03 '24

@ARajgor I see the functionality is already on master. File reverted. I'm sorry, I don't understand your comment on openai_client.py.

d-popov · Apr 04 '24

> @ARajgor I see the functionality is already on master. File reverted. I'm sorry, I don't understand your comment on openai_client.py.

Take the current openai_client file and just add a base_url parameter.

ARajgor · Apr 06 '24

@ARajgor done

d-popov avatar Apr 06 '24 21:04 d-popov

also, revert ollama file. the current file works perfectly. also add openai URL parameter in sample.config.file

ARajgor · Apr 07 '24
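The sample-config addition requested above could take a shape like the following; the section and key names are assumptions for illustration, not taken from the repository:

```toml
# Hypothetical sample-config fragment (section and key names assumed):
[API_ENDPOINTS]
OPENAI = "https://api.openai.com/v1"  # any OpenAI-compatible server
OLLAMA = "http://127.0.0.1:11434"     # Ollama host when not running locally
```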

oh, well...

d-popov · Apr 29 '24