
[Bug]: LLM Provider NOT provided

Open · hccnm opened this issue · 4 comments

Is there an existing issue for the same bug?

  • [X] I have checked the troubleshooting document at https://github.com/OpenDevin/OpenDevin/blob/main/docs/guides/Troubleshooting.md
  • [X] I have checked the existing issues.

Describe the bug

I'm using version 0.4.0 and following the guidance in AzureLLMs.md. After configuring config.toml and running make run, I encountered the problems shown in the attached screenshots. Is this a bug, or how should I modify the configuration?

Current Version

0.4.0

Installation and Configuration

LLM_MODEL="azure/gpt4-1106"
LLM_API_KEY="xxxx"
LLM_BASE_URL="https://xxx.openai.azure.com/"
LLM_EMBEDDING_MODEL="azureopenai"
LLM_EMBEDDING_DEPLOYMENT_NAME="embedding2"
LLM_API_VERSION="2024-02-15-preview"
WORKSPACE_BASE="/opendevin/OpenDevin/workspace"
SANDBOX_TYPE="exec"
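
Since OpenDevin routes model calls through litellm, one way to sanity-check the values above is to call litellm directly with the same deployment name, endpoint, key, and API version. A minimal sketch, assuming the placeholder values from the configuration above (substitute your real key, endpoint, and deployment name):

# Quick check that the Azure deployment is reachable with these credentials.
from litellm import completion

response = completion(
    model="azure/gpt4-1106",                   # azure/<your chat deployment name>
    api_base="https://xxx.openai.azure.com/",  # same value as LLM_BASE_URL
    api_version="2024-02-15-preview",          # same value as LLM_API_VERSION
    api_key="xxxx",                            # same value as LLM_API_KEY
    messages=[{"role": "user", "content": "Say hello."}],
)
print(response.choices[0].message.content)

If this call succeeds, the deployment name and credentials are valid, and any remaining problem lies in how OpenDevin itself is picking up the configuration.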

Model and Agent

No response

Reproduction Steps

No response

Logs, Errors, Screenshots, and Additional Context

No response

hccnm · Apr 26 '24 09:04

Can you please check this value, "GPT4-1106"? On the litellm list here, https://litellm.vercel.app/docs/providers/azure, I find "GPT4-1106-preview" and others, and the table indicates that the corresponding value should be "azure/<your chat model deployment name>". In other words, just as you have defined a deployment in your Azure account for embedding, you should have another one for the chat model you want to use, and I'd suggest using that deployment name in LLM_MODEL. By default it might be the same as the chat model's name, though it probably doesn't have to be.
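
For illustration only (this deployment name is hypothetical, not from this issue): if the chat deployment in the Azure portal were named "my-gpt4-deployment", the setting would be:

LLM_MODEL="azure/my-gpt4-deployment"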

You could quickly try 'GPT4-1106-preview', or check the deployments page in your Azure account.

enyst · Apr 26 '24 11:04

In your Azure account there's a "deployments" page/tab, I think, where you can see the names of your deployments. It's that name you need for the chat model. However, I need to add a detail: if it's different from the default model name, which it might be, then:

  • start opendevin (make run, if that's how you prefer)
  • open Settings in the UI, and add your actual deployment name (for the chat model) in the box. It is specific to your account, and it might or might not be in the list you see, but you can add your own value and save it.

enyst · Apr 26 '24 12:04

(screenshot attached) Thank you. Yes, I checked the name of the deployment, and I used the same parameters with litellm directly and it worked fine, so this is confusing to me.

hccnm · Apr 28 '24 03:04

Please make sure to open the web UI and, in Settings, enter the model and save. Even if you passed it as a parameter, save it in the UI. Does it work then?

enyst · Apr 28 '24 08:04

Now it works.

hccnm · Apr 30 '24 09:04