
[Bug]: Issue when LiteLLM doesn't have pricing for an Ollama model

Open DFin opened this issue 1 year ago • 1 comment

Is there an existing issue for the same bug?

  • [X] I have checked the troubleshooting document at https://opendevin.github.io/OpenDevin/modules/usage/troubleshooting
  • [X] I have checked the existing issues.

Describe the bug

When running OpenDevin with an Ollama model that isn't known to LiteLLM, it causes problems. I'm not familiar with LiteLLM, but it seems to expect a configuration file containing pricing and context window size for each model, and it fails if the model isn't listed there.

16:30:53 - opendevin:ERROR: agent_controller.py:149 - Error in loop
Traceback (most recent call last):
  File "/app/opendevin/controller/agent_controller.py", line 144, in _run
    finished = await self.step(i)
               ^^^^^^^^^^^^^^^^^^
  File "/app/opendevin/controller/agent_controller.py", line 260, in step
    action = self.agent.step(self.state)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/agenthub/codeact_agent/codeact_agent.py", line 231, in step
    cur_cost = completion_cost(completion_response=response)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/litellm/utils.py", line 4453, in completion_cost
    raise e
  File "/app/.venv/lib/python3.12/site-packages/litellm/utils.py", line 4439, in completion_cost
    ) = cost_per_token(
        ^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/litellm/utils.py", line 4254, in cost_per_token
    raise litellm.exceptions.NotFoundError(  # type: ignore
litellm.exceptions.NotFoundError: Model not in model_prices_and_context_window.json. You passed model=ollama/llama3:70b. Register pricing for model - https://docs.litellm.ai/docs/proxy/custom_pricing

I verified that Ollama is reachable from the Docker environment and that curl requests work fine. The model also appears to run, since GPU usage goes to 100% for a while. The error seems to occur at the end, after the model has responded, when the completion cost is calculated.
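As a possible stopgap until this is handled in OpenDevin itself, LiteLLM lets you register pricing for models it doesn't know about (per the link in the error message). A minimal sketch, assuming the only goal is to stop completion_cost from raising; the numbers are placeholders, not real figures for llama3:70b:

import litellm

# Register a pricing/context-window entry for the unknown model so that
# litellm.completion_cost() can find it. The values are placeholders;
# a local Ollama model has no real per-token cost.
litellm.register_model({
    "ollama/llama3:70b": {
        "max_tokens": 8192,
        "input_cost_per_token": 0.0,
        "output_cost_per_token": 0.0,
        "litellm_provider": "ollama",
        "mode": "chat",
    }
})

With an entry registered, the cost lookup resolves and simply reports a cost of 0.0 for the local model.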

Current Version

Latest as of today: ghcr.io/opendevin/opendevin:main

Installation and Configuration

docker run \
    --add-host host.docker.internal:host-gateway \
    -e SANDBOX_USER_ID=$(id -u) \
    -e LLM_API_KEY="ollama" \
    -e LLM_BASE_URL="http://host.docker.internal:11434" \
    -e WORKSPACE_MOUNT_PATH=$WORKSPACE_BASE \
    -v $WORKSPACE_BASE:/opt/workspace_base \
    -v /var/run/docker.sock:/var/run/docker.sock \
    -p 3000:3000 \
    ghcr.io/opendevin/opendevin:main

The model is set to 'ollama/llama3:70b' in the web interface.

Model and Agent

No response

Reproduction Steps

  1. Run OpenDevin with:
docker run \
    --add-host host.docker.internal:host-gateway \
    -e SANDBOX_USER_ID=$(id -u) \
    -e LLM_API_KEY="ollama" \
    -e LLM_BASE_URL="http://host.docker.internal:11434" \
    -e WORKSPACE_MOUNT_PATH=$WORKSPACE_BASE \
    -v $WORKSPACE_BASE:/opt/workspace_base \
    -v /var/run/docker.sock:/var/run/docker.sock \
    -p 3000:3000 \
    ghcr.io/opendevin/opendevin:main
  2. Set the model to ollama/llama3:70b in the web interface.
  3. The error above occurs.

Logs, Errors, Screenshots, and Additional Context

No response

DFin avatar May 09 '24 16:05 DFin

Should be fixed by this PR: https://github.com/OpenDevin/OpenDevin/pull/1666
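For anyone hitting this before the fix lands, one way to keep the agent loop from dying on this error is to treat a missing pricing entry as an unknown (zero) cost. This is only a sketch of that idea, with a hypothetical safe_completion_cost helper; it is not necessarily how the linked PR solves it:

import litellm
from litellm import completion_cost

def safe_completion_cost(response) -> float:
    """Return the completion cost, or 0.0 if LiteLLM has no pricing for the model."""
    try:
        return completion_cost(completion_response=response)
    except litellm.exceptions.NotFoundError:
        # Local models such as ollama/llama3:70b are not listed in
        # model_prices_and_context_window.json, so fall back to zero cost.
        return 0.0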

enyst avatar May 09 '24 18:05 enyst

This should be fixed now!

rbren avatar May 14 '24 18:05 rbren