
OpenDevin refuses to connect locally - Goes to OpenAI

Open zeta274 opened this issue 9 months ago • 1 comment

Is there an existing issue for the same bug?

  • [X] I have checked the troubleshooting document at https://opendevin.github.io/OpenDevin/modules/usage/troubleshooting
  • [X] I have checked the existing issues.

Describe the bug

There was an earlier issue about OpenDevin demanding an OpenAI API key, and it was reportedly patched, but I don't see the fix. I followed the instructions and set everything up, but OpenDevin still wants to connect to OpenAI, even though I pointed localhost:port at my local OpenAI-compatible server. This same server (vLLM) works fine with other similar software, so I know the problem isn't on its side.

Current Version

ghcr.io/opendevin/opendevin:0.4.0

Installation and Configuration

export LLM_API_KEY="KEY"

docker run \
    -e LLM_API_KEY \
    -e WORKSPACE_MOUNT_PATH=$WORKSPACE_BASE \
    -v $WORKSPACE_BASE:/opt/workspace_base \
    -v /var/run/docker.sock:/var/run/docker.sock \
    -p 3000:3000 \
    --add-host host.docker.internal:host-gateway \
    ghcr.io/opendevin/opendevin:0.4.0

Model and Agent

Should be a TheBloke model, but it insists on GPT-3.5.

Reproduction Steps

No response

Logs, Errors, Screenshots, and Additional Context

No response

zeta274 avatar Apr 30 '24 09:04 zeta274

@zeta274 Can you please start the app in the browser, and enter the model in the Settings there? The model name you enter and save in the UI is the model name that will be used.

enyst avatar Apr 30 '24 18:04 enyst

LLM Provider NOT provided. Pass in the LLM provider you are trying to call. You passed model=meta-llama/Meta-Llama-3-8B-Instruct

zeta274 avatar May 01 '24 17:05 zeta274

@zeta274 that doesn't look like a valid model name to me. You might want something like ollama/Meta-Llama-3-8B-Instruct

rbren avatar May 02 '24 02:05 rbren
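
For context on the suggestion above: OpenDevin routes LLM calls through LiteLLM, which picks the backend from a provider prefix on the model name. For a server that speaks the OpenAI-compatible API (as vLLM does), the usual LiteLLM convention is an `openai/` prefix plus a base URL. The exact environment variable names below (`LLM_BASE_URL`, port 8000) are assumptions for this sketch; check the docs for your OpenDevin version:

```shell
# Sketch, not verified against 0.4.0: point OpenDevin at a local
# OpenAI-compatible vLLM server instead of api.openai.com.
# Assumption: vLLM is listening on port 8000 on the host.
export LLM_BASE_URL="http://host.docker.internal:8000/v1"
export LLM_API_KEY="dummy"   # vLLM typically ignores the key, but one must be set

# Then, in the OpenDevin Settings UI, enter the model with the provider prefix:
#   openai/meta-llama/Meta-Llama-3-8B-Instruct
# rather than the bare name meta-llama/Meta-Llama-3-8B-Instruct.
```

The bare name is what triggers the "LLM Provider NOT provided" error quoted earlier, since LiteLLM cannot tell which backend to call.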

Going to close this one as I think it's just a model name issue (and the fact that it needs to be set in the UI)

rbren avatar May 02 '24 02:05 rbren

But feel free to ping this thread if you're still having trouble!

rbren avatar May 02 '24 02:05 rbren

I'm not using Ollama; I'm on vLLM, with the OpenAI-compatible API.

zeta274 avatar May 02 '24 02:05 zeta274

@zeta274 Can you add the model name in the UI? If it still doesn't work, please tell us exactly what you are passing, both in the run command and in the UI.

enyst avatar May 02 '24 08:05 enyst
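
One more diagnostic worth running before debugging model names: confirm the OpenDevin container can actually reach the vLLM server. The port (8000) is an assumption here; substitute whatever your vLLM instance listens on:

```shell
# List the models the OpenAI-compatible server exposes, from the host:
curl http://localhost:8000/v1/models

# From inside the OpenDevin container, the host is reachable via the
# --add-host mapping from the run command, so use host.docker.internal:
#   curl http://host.docker.internal:8000/v1/models
```

If the second form fails while the first succeeds, the problem is container-to-host networking rather than the model name.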