OpenHands
OpenDevin refuses to connect locally - goes to OpenAI
Is there an existing issue for the same bug?
- [X] I have checked the troubleshooting document at https://opendevin.github.io/OpenDevin/modules/usage/troubleshooting
- [X] I have checked the existing issues.
Describe the bug
There was a previously reported issue of OpenDevin demanding an OpenAI token, which was said to be patched, but I don't see the fix. I followed the setup instructions and pointed localhost:port at my local OpenAI-compatible server, yet OpenDevin still tries to connect to OpenAI. The same server (vLLM) works fine with other similar tools, so I know the problem isn't on its side.
Current Version
ghcr.io/opendevin/opendevin:0.4.0
Installation and Configuration
export LLM_API_KEY="KEY"
docker run -e LLM_API_KEY -e WORKSPACE_MOUNT_PATH=$WORKSPACE_BASE -v $WORKSPACE_BASE:/opt/workspace_base -v /var/run/docker.sock:/var/run/docker.sock -p 3000:3000 --add-host host.docker.internal:host-gateway ghcr.io/opendevin/opendevin:0.4.0
Model and Agent
Should be a TheBloke model, but it insists on GPT-3.5
Reproduction Steps
No response
Logs, Errors, Screenshots, and Additional Context
No response
@zeta274 Can you please start the app in the browser, and enter the model in the Settings there? The model name you enter and save in the UI is the model name that will be used.
LLM Provider NOT provided. Pass in the LLM provider you are trying to call. You passed model=meta-llama/Meta-Llama-3-8B-Instruct
@zeta274 that doesn't look like a valid model name to me. You might want something like ollama/Meta-Llama-3-8B-Instruct
Going to close this one as I think it's just a model name issue (and the fact that it needs to be set in the UI)
But feel free to ping this thread if you're still having trouble!
I'm not using Ollama, I'm on VLLM, with the OpenAI-like API.
@zeta274 Can you add the model name in the UI? If it still doesn't work, please tell us exactly what you're passing, both in the command you run and in the UI.
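For an OpenAI-compatible server such as vLLM, LiteLLM (which OpenDevin uses for routing) generally expects an `openai/` prefix on the model name plus a base URL pointing at the server. A minimal sketch of the relevant variables, assuming a vLLM server on port 8000 (the port and key value are assumptions, adjust to your setup):

```shell
# Hypothetical example: the "openai/" prefix tells LiteLLM to treat the
# endpoint as a generic OpenAI-compatible server (e.g. vLLM), rather than
# trying to infer a provider from the bare model name.
export LLM_MODEL="openai/meta-llama/Meta-Llama-3-8B-Instruct"

# Point at the server's OpenAI-compatible endpoint; from inside the
# container, host.docker.internal reaches the host (assumed port 8000).
export LLM_BASE_URL="http://host.docker.internal:8000/v1"

# vLLM typically accepts any key unless it was launched with --api-key.
export LLM_API_KEY="dummy"

echo "Model: $LLM_MODEL"
echo "Base URL: $LLM_BASE_URL"
```

The same model name would then also need to be entered and saved in the Settings UI, since the value saved there is the one actually used.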