
Bypass the 99 Steps/Tasks Limit

Open lsiem opened this issue 9 months ago • 4 comments

Is there an existing issue for the same bug?

  • [X] I have checked the troubleshooting document at https://opendevin.github.io/OpenDevin/modules/usage/troubleshooting
  • [X] I have checked the existing issues.

Describe the bug

Hello,

I've encountered a limitation with the planner and monologue agents that significantly impacts the efficiency of the workflow. When using these agents for complex tasks, I frequently hit the ceiling of 99 steps/tasks. This limitation often halts the planning process prematurely, preventing the start of implementation.

The core issue arises when planning reaches this limit and stops, requiring a new agent task to be started in order to continue. However, this approach has a critical drawback: the planning progress is not preserved, leading to a reset and forcing the user to start from scratch.

Is there a possibility to bypass or extend this limit of 99 steps/tasks for both planner and monologue agents? An enhancement in this area would greatly improve the usability and efficiency of these agents for handling more complex scenarios.

Current Version

ghcr.io/opendevin/opendevin:0.4.0

Installation and Configuration

docker run \
    -e LLM_API_KEY \
    -e WORKSPACE_MOUNT_PATH=$WORKSPACE_BASE \
    -v $WORKSPACE_BASE:/opt/workspace_base \
    -v /var/run/docker.sock:/var/run/docker.sock \
    -p 3000:3000 \
    --add-host host.docker.internal=host-gateway \
    -e SANDBOX_TYPE=exec \
    ghcr.io/opendevin/opendevin:0.4.0
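
Note that this command assumes WORKSPACE_BASE is already exported in the host shell; for example (the path here is illustrative):

    export WORKSPACE_BASE=$(pwd)/workspace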

Model and Agent

Model: gpt-4-turbo-2024-04-09
Agents: Planning Agent, Monologue Agent

Reproduction Steps

  1. Run the Docker container with an existing project
  2. Specify a rather complex task, such as rewriting the project from scratch or migrating it to a new framework
  3. Wait until the agent stops working
  4. Check the logs to see whether the agent has reached the 99th task.

Logs, Errors, Screenshots, and Additional Context

There are no error messages or any other clues; the agent simply stops executing.

lsiem avatar May 01 '24 16:05 lsiem

I believe (but haven't verified) that you can push the "play" button to continue. We really need a better UX for this though

rbren avatar May 01 '24 23:05 rbren

I believe (but haven't verified) that you can push the "play" button to continue. We really need a better UX for this though

I attempted the suggested solution, but there was no response and no logs were generated. Upon reloading the page, a dialog box asks whether I wish to proceed; however, clicking 'yes' redirects me to an unfamiliar page and results in an unusual 2xx error code. I have forked the project and am attempting to submit a pull request to resolve this issue.

lsiem avatar May 02 '24 03:05 lsiem

I believe you can set MAX_ITERATIONS to more than 99, to whatever you see fit. It can be in config.toml or in env. https://github.com/OpenDevin/OpenDevin/blob/main/opendevin/config.py#L40
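
For example, a minimal sketch assuming the uppercase, env-style keys read by that config module (300 is an illustrative value, not a recommended one):

    # config.toml
    MAX_ITERATIONS = 300

or as an environment variable at container start, alongside the other flags from the command above:

    docker run -e MAX_ITERATIONS=300 ... ghcr.io/opendevin/opendevin:0.4.0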

enyst avatar May 02 '24 08:05 enyst

I believe you can set MAX_ITERATIONS to more than 99, to whatever you see fit. It can be in config.toml or in env. https://github.com/OpenDevin/OpenDevin/blob/main/opendevin/config.py#L40

Thank you for pointing that out; I completely overlooked the setting. Nonetheless, there is still significant room for improvement in how this situation is handled: ideally, the user should be informed via the frontend and, as already mentioned in #1508, have the option to continue with the previous data across agents.

lsiem avatar May 02 '24 14:05 lsiem

Apparently we have the same problem for MAX_ITERATIONS, MAX_CHARS, and MAX_BUDGET_PER_TASK.

The user might very well want to continue rather than abort.
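
If all three limits need raising at once, the same env-style route should work; a hedged sketch with illustrative values, the remaining flags as in the original command above (the exact key names and defaults should be checked against the config module linked earlier):

    docker run \
        -e MAX_ITERATIONS=300 \
        -e MAX_CHARS=10000000 \
        -e MAX_BUDGET_PER_TASK=4.0 \
        ... \
        ghcr.io/opendevin/opendevin:0.4.0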

li-boxuan avatar May 26 '24 23:05 li-boxuan

@tobitege pointed out here that

This could be interesting for this PR's topic: feat(dynamic_rate_limiter.py): Dynamic tpm quota (multiple projects) https://github.com/BerriAI/litellm/pull/4349

li-boxuan avatar Jun 25 '24 06:06 li-boxuan