Anton Solbjørg
Can you try setting `interpreter.model = "openai/custom"`?
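Something like this in a Python session, as a minimal sketch (assuming the older top-level `import interpreter` API that matches the attribute above; I believe newer releases move the setting to `interpreter.llm.model`):

```python
# Minimal sketch: point open-interpreter at a custom OpenAI-compatible model string.
# Assumes the older `import interpreter` style API; attribute names vary between versions.
import interpreter

interpreter.model = "openai/custom"  # override the default model string
interpreter.chat("Hello, which model are you running?")  # start a chat using the override
```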
I think Killian had something for the next update
Try this:

```
pip install --force-reinstall open-interpreter
```
Nice catch! We should add all commands to tests @KillianLucas?
Since it has been a few months, I'll transfer this issue to you @BassCoder2808
https://learn.microsoft.com/en-us/azure/ai-services/openai/concepts/models#gpt-35-turbo-model-availability

> interpreter --model azure/GPT3

You should probably use `interpreter --model azure/gpt-35-turbo`. I haven't used Azure myself, though.
I see, this is an issue with merge_delta.py. I'll make a fix soon.
Should be fixed in the next update, or if you install from GitHub as @Fschoeller said.
Run

```
interpreter --config
```

and paste in:

```yaml
llm.model: gpt-4
llm.temperature: 0
offline: false
llm.api_key: ...  # Your API key, if the API requires it
llm.api_base: ...  # The URL...
```
---
title: Azure
---

To use a model from Azure, set the `model` flag to begin with `azure/`:

```bash Terminal
interpreter --model azure/
```

Please follow this guide in the...