
[Bug]: passing LLM token limits via env does not work

Open barsuna opened this issue 9 months ago • 1 comments

Is there an existing issue for the same bug?

  • [X] I have checked the troubleshooting document at https://opendevin.github.io/OpenDevin/modules/usage/troubleshooting
  • [X] I have checked the existing issues.

Describe the bug

String values passed via environment variables are not cast to integers

09:02:41 - opendevin:ERROR: agent_controller.py:151 - Error in loop
Traceback (most recent call last):
  File "/app/opendevin/controller/agent_controller.py", line 146, in _run
    finished = await self.step(i)
  File "/app/opendevin/controller/agent_controller.py", line 269, in step
    action = self.agent.step(self.state)
  File "/app/agenthub/monologue_agent/agent.py", line 233, in step
    self._initialize(state.plan.main_goal)
  File "/app/agenthub/monologue_agent/agent.py", line 168, in _initialize
    self._add_initial_thoughts(task)
  File "/app/agenthub/monologue_agent/agent.py", line 221, in _add_initial_thoughts
    self._add_event(action.to_memory())
  File "/app/agenthub/monologue_agent/agent.py", line 139, in _add_event
    if token_count + MAX_TOKEN_COUNT_PADDING > self.llm.max_input_tokens:
TypeError: '>' not supported between instances of 'int' and 'str'
09:02:41 - opendevin:INFO: agent_controller.py:201 - Setting agent state from AgentState.RUNNING to AgentState.ERROR
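A minimal sketch of the failure mode (illustrative only, not OpenDevin's actual config code; the names `token_count` and `MAX_TOKEN_COUNT_PADDING` simply mirror the traceback): values read from `os.environ` are always strings, so comparing an `int` against one raises exactly this `TypeError` unless the value is cast at config-load time.

```python
import os

# Env vars are always strings, even when they look numeric.
os.environ["LLM_MAX_INPUT_TOKENS"] = "8192"
raw = os.environ["LLM_MAX_INPUT_TOKENS"]

token_count = 1000
MAX_TOKEN_COUNT_PADDING = 512

# Reproduces the bug: int > str is not a valid comparison in Python 3.
try:
    token_count + MAX_TOKEN_COUNT_PADDING > raw
except TypeError as e:
    print(e)  # '>' not supported between instances of 'int' and 'str'

# The fix is to cast once when loading the configuration:
max_input_tokens = int(raw)
print(token_count + MAX_TOKEN_COUNT_PADDING > max_input_tokens)  # False
```

This is why the issue only appears when the limits are passed via `-e LLM_MAX_INPUT_TOKENS=...` rather than set as integer defaults in code.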

Current Version

0.5.2

Installation and Configuration

docker run \
    --add-host host.docker.internal:host-gateway \
    -e LLM_API_KEY="ollama" \
    -e LLM_BASE_URL="http://host.docker.internal:11434" \
    -e WORKSPACE_MOUNT_PATH=$WORKSPACE_BASE \
    -e SANDBOX_TYPE=exec \
    -e SANDBOX_USER_ID=$(id -u) \
    -e DEBUG=1 \
    -e LLM_EMBEDDING_MODEL=llama2 \
    -e LLM_MAX_INPUT_TOKENS=8192 \
    -e LLM_MAX_OUTPUT_TOKENS=4096 \
    -v $WORKSPACE_BASE:/opt/workspace_base \
    -v /var/run/docker.sock:/var/run/docker.sock \
    -p 3000:3000 \
    ghcr.io/opendevin/opendevin:0.5.2

Model and Agent

  • Model: llama3-70b
  • Agent: MonologueAgent

Reproduction Steps

Pass LLM_MAX_OUTPUT_TOKENS or LLM_MAX_INPUT_TOKENS as environment variables and give the agent any task.

Logs, Errors, Screenshots, and Additional Context

No response

barsuna avatar May 11 '24 09:05 barsuna

Could you check the latest version? This was refactored in #1552.

SmartManoj avatar May 11 '24 12:05 SmartManoj

Thank you, indeed 0.5.3 no longer has this issue!

barsuna avatar May 14 '24 16:05 barsuna

Thanks for the confirmation!

enyst avatar May 14 '24 18:05 enyst