Anton Solbjørg
> This is such a bizarre problem that I think stems from LiteLLM relying on OS env vars (which I encouraged them to do early on!) so we can never...
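For context on the env-var issue quoted above: LiteLLM picks up provider credentials from OS environment variables. A minimal sketch of what that reliance looks like (the key value and model name here are placeholders, not anything from the thread):

```python
import os

import litellm

# LiteLLM reads provider credentials from OS environment variables,
# e.g. OPENAI_API_KEY for the OpenAI provider (placeholder value below).
os.environ["OPENAI_API_KEY"] = "sk-..."

response = litellm.completion(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "hello"}],
)
print(response.choices[0].message.content)
```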
I believe this is ready for merging now @KillianLucas
https://github.com/KillianLucas/open-interpreter/issues/964
@KillianLucas I have removed a bunch of unnecessary things. Hopefully this aligns with: > minified to just set "custom_llm_provider" to "openai" It checks in init if the model is...
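If it helps to picture the minified approach, here is a rough sketch of forcing LiteLLM down its OpenAI-compatible path via the `custom_llm_provider` parameter; the model name, api_base, and key are placeholders, not the actual values used in the PR:

```python
import litellm

# Force LiteLLM to treat the endpoint as OpenAI-compatible,
# regardless of what the model is called (placeholder values below).
response = litellm.completion(
    model="my-local-model",
    custom_llm_provider="openai",
    api_base="http://localhost:1234/v1",
    api_key="dummy",
    messages=[{"role": "user", "content": "hello"}],
)
```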
What model are you using?
https://github.com/KillianLucas/open-interpreter/pull/955 Will have a fix for this next release, I think
--profiles shouldn't work unless you are running the git version. We add openai to the model name to use the OpenAI API format; this will change next release. Seems like...
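To illustrate the prefixing mentioned above (a sketch, not the exact code in the repo): prepending "openai/" is what tells LiteLLM to route the request through its OpenAI-compatible format.

```python
def to_openai_format(model: str) -> str:
    """Prefix a model name so LiteLLM routes it via the OpenAI API format.

    Illustrative helper only, not the actual open-interpreter implementation.
    """
    if model.startswith("openai/"):
        return model
    return f"openai/{model}"

print(to_openai_format("local-model"))  # -> "openai/local-model"
```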
Seems like the change is still in a PR, I need to modify some parts before merging it... @RisingVoicesBk I would wait for the update coming soon and upgrade then,...
https://github.com/KillianLucas/open-interpreter/pull/955
Make sure Python and python/Scripts are in PATH
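A quick way to check this from Python itself (a sketch; the printed paths will differ per install):

```python
import shutil

# If either of these prints None, the corresponding directory is not on PATH.
print(shutil.which("python"))  # e.g. ...\python.exe
print(shutil.which("pip"))     # e.g. ...\Scripts\pip.exe
```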