
[Bug]: the app seems to hang unexpectedly

Open robertherbaugh opened this issue 1 year ago • 5 comments

Is there an existing issue for the same bug?

  • [X] I have checked the troubleshooting document at https://github.com/OpenDevin/OpenDevin/blob/main/docs/guides/Troubleshooting.md
  • [X] I have checked the existing issues.

Describe the bug

OPENAI_API_KEY is not passed as expected using the startup instructions. Additionally, when passing -e OPENAI_API_KEY, the docker image still does not process it.

Current Version

ghcr.io/opendevin/opendevin:0.4.0

Installation and Configuration

export LLM_API_KEY="sk-..."
export WORKSPACE_BASE="$(pwd)/workspace-directory"
docker run \
    -e LLM_API_KEY \
    -e WORKSPACE_MOUNT_PATH=$WORKSPACE_BASE \
    -v $WORKSPACE_BASE:/opt/workspace_base \
    -v /var/run/docker.sock:/var/run/docker.sock \
    -p 3000:3000 \
    --add-host host.docker.internal=host-gateway \
    ghcr.io/opendevin/opendevin:0.4.0
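(Editor's note: a quick way to sanity-check whether the variable actually reaches the container is to compare docker's name-only `-e NAME` pass-through, which copies the variable from the invoking shell and passes nothing if it is unset there, against the explicit `-e NAME=value` form. The sketch below simulates that behaviour with `env -i` so it runs without docker.)

```shell
# Sketch: `env -i` starts a child with an empty environment, standing in for
# docker's `-e` handling. `-e NAME=value` always sets the value explicitly;
# name-only `-e NAME` passes nothing when the variable is unset in the shell.
export LLM_API_KEY="sk-demo"

# Explicit form: the value is always delivered to the child.
env -i LLM_API_KEY="$LLM_API_KEY" sh -c 'echo "explicit: $LLM_API_KEY"'

# If the export never happened in the current shell, the name-only form
# delivers an empty value, which is what an unset variable looks like here.
env -i LLM_API_KEY="${SOME_UNSET_VAR:-}" sh -c 'echo "name-only, unset: [$LLM_API_KEY]"'
```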

Model and Agent

No response

Reproduction Steps

  1. Export the LLM API key
  2. Export the workspace directory
  3. Use the Docker startup script provided on GitHub.

Logs, Errors, Screenshots, and Additional Context

INFO:     Started server process [1]
INFO:     Waiting for application startup.
INFO:     Application startup complete.
INFO:     Uvicorn running on http://0.0.0.0:3000 (Press CTRL+C to quit)
INFO:     10.30.10.6:49745 - "GET / HTTP/1.1" 307 Temporary Redirect
INFO:     10.30.10.6:49745 - "GET /index.html HTTP/1.1" 200 OK
INFO:     10.30.10.6:49746 - "GET /assets/index-CZQzs2DR.css HTTP/1.1" 200 OK
INFO:     10.30.10.6:49745 - "GET /assets/index-D59teWsw.js HTTP/1.1" 200 OK
03:10:35 - opendevin:ERROR: auth.py:31 - Invalid token
03:10:35 - opendevin:INFO: listen.py:74 - Invalid or missing credentials, generating new session ID: 69aae10c-a24e-4f4f-a6ba-f993526d1ec2
INFO:     10.30.10.6:49745 - "GET /api/auth HTTP/1.1" 200 OK
INFO:     10.30.10.6:49745 - "GET /locales/en-US/translation.json HTTP/1.1" 404 Not Found
INFO:     10.30.10.6:49746 - "GET /locales/en/translation.json HTTP/1.1" 200 OK
INFO:     10.30.10.6:49746 - "GET /favicon-32x32.png HTTP/1.1" 200 OK
INFO:     ('10.30.10.6', 49747) - "WebSocket /ws?token=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzaWQiOiI2OWFhZTEwYy1hMjRlLTRmNGYtYTZiYS1mOTkzNTI2ZDFlYzIifQ.Z94inVNiUAws0YyhMvqeY5vROlV-Ha8547CjU9ACsdk" [accepted]
INFO:     connection open
Starting loop_recv for sid: 69aae10c-a24e-4f4f-a6ba-f993526d1ec2
INFO:     10.30.10.6:49746 - "GET /api/refresh-files HTTP/1.1" 200 OK
INFO:     10.30.10.6:49746 - "GET /api/litellm-models HTTP/1.1" 200 OK
INFO:     10.30.10.6:49745 - "GET /api/messages/total HTTP/1.1" 200 OK
INFO:     10.30.10.6:49745 - "GET /api/messages/total HTTP/1.1" 200 OK
INFO:     10.30.10.6:49745 - "GET /api/agents HTTP/1.1" 200 OK
03:10:36 - opendevin:INFO: agent.py:144 - Creating agent MonologueAgent using LLM gpt-3.5-turbo
03:10:36 - opendevin:INFO: llm.py:51 - Initializing LLM with model: gpt-3.5-turbo
03:10:37 - opendevin:INFO: ssh_box.py:353 - Container stopped
03:10:37 - opendevin:INFO: ssh_box.py:373 - Mounting workspace directory: /home/st-dev-autodev9000/devin-new
03:10:38 - opendevin:INFO: ssh_box.py:396 - Container started
03:10:39 - opendevin:INFO: ssh_box.py:413 - waiting for container to start: 1, container status: running
03:10:39 - opendevin:INFO: ssh_box.py:178 - Connecting to [email protected] via ssh. If you encounter any issues, you can try ssh -v -p 44717 [email protected] with the password '88f35f3a-eac7-43f6-9742-27a703381668' and report the issue on GitHub.

============== STEP 0

03:11:20 - PLAN
<Code I wished Devin to create>
03:11:20 - opendevin:ERROR: agent_controller.py:102 - Error in loop
Traceback (most recent call last):
  File "/app/.venv/lib/python3.12/site-packages/litellm/llms/openai.py", line 414, in completion
    raise e
  File "/app/.venv/lib/python3.12/site-packages/litellm/llms/openai.py", line 350, in completion
    openai_client = OpenAI(
                    ^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/openai/_client.py", line 104, in __init__
    raise OpenAIError(
openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/app/.venv/lib/python3.12/site-packages/litellm/main.py", line 1010, in completion
    raise e
  File "/app/.venv/lib/python3.12/site-packages/litellm/main.py", line 983, in completion
    response = openai_chat_completions.completion(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/litellm/llms/openai.py", line 422, in completion
    raise OpenAIError(status_code=500, message=traceback.format_exc())
litellm.llms.openai.OpenAIError: Traceback (most recent call last):
  File "/app/.venv/lib/python3.12/site-packages/litellm/llms/openai.py", line 414, in completion
    raise e
  File "/app/.venv/lib/python3.12/site-packages/litellm/llms/openai.py", line 350, in completion
    openai_client = OpenAI(
                    ^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/openai/_client.py", line 104, in __init__
    raise OpenAIError(
openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/app/opendevin/controller/agent_controller.py", line 98, in _run
    finished = await self.step(i)
               ^^^^^^^^^^^^^^^^^^
  File "/app/opendevin/controller/agent_controller.py", line 211, in step
    action = self.agent.step(self.state)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/agenthub/monologue_agent/agent.py", line 218, in step
    resp = self.llm.completion(messages=messages)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 289, in wrapped_f
    return self(f, *args, **kw)
           ^^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 379, in __call__
    do = self.iter(retry_state=retry_state)
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 314, in iter
    return fut.result()
           ^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/concurrent/futures/_base.py", line 449, in result
    return self.__get_result()
           ^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/concurrent/futures/_base.py", line 401, in __get_result
    raise self._exception
  File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 382, in __call__
    result = fn(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^
  File "/app/opendevin/llm/llm.py", line 78, in wrapper
    resp = completion_unwrapped(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/litellm/utils.py", line 2977, in wrapper
    raise e
  File "/app/.venv/lib/python3.12/site-packages/litellm/utils.py", line 2875, in wrapper
    result = original_function(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/litellm/main.py", line 2137, in completion
    raise exception_type(
          ^^^^^^^^^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/litellm/utils.py", line 8665, in exception_type
    raise e
  File "/app/.venv/lib/python3.12/site-packages/litellm/utils.py", line 7431, in exception_type
    raise AuthenticationError(
litellm.exceptions.AuthenticationError: OpenAIException - Traceback (most recent call last):
  File "/app/.venv/lib/python3.12/site-packages/litellm/llms/openai.py", line 414, in completion
    raise e
  File "/app/.venv/lib/python3.12/site-packages/litellm/llms/openai.py", line 350, in completion
    openai_client = OpenAI(
                    ^^^^^^^
  File "/app/.venv/lib/python3.12/site-packages/openai/_client.py", line 104, in __init__
    raise OpenAIError(
openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable
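(Editor's note: the chained tracebacks above all bottom out in the same check inside the openai client. A minimal sketch of that resolution logic, using a hypothetical helper name rather than the client's actual internals:)

```python
import os

def resolve_api_key(explicit_key=None):
    """Sketch (hypothetical helper, not OpenDevin or openai-library code) of
    the check the openai client performs at construction: the key must come
    either from the api_key argument or the OPENAI_API_KEY environment
    variable, otherwise it raises before any request is made."""
    key = explicit_key or os.environ.get("OPENAI_API_KEY")
    if not key:
        raise ValueError(
            "The api_key client option must be set either by passing api_key "
            "to the client or by setting the OPENAI_API_KEY environment variable"
        )
    return key
```

This is why the error appears even though the container started fine: the failure is deferred until the agent's first completion call constructs the client.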

robertherbaugh avatar Apr 28 '24 03:04 robertherbaugh

Can you please try `-e LLM_API_KEY=$LLM_API_KEY \` instead, in the docker command?
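(Editor's note: for anyone landing here, a sketch of the full command from the original report with that change applied, wrapped in a function so nothing runs until you call it:)

```shell
# Sketch: pass the key's value explicitly with -e LLM_API_KEY=$LLM_API_KEY
# rather than relying on name-only pass-through from the host shell.
export LLM_API_KEY="sk-..."
export WORKSPACE_BASE="$(pwd)/workspace-directory"

run_opendevin() {
    docker run \
        -e LLM_API_KEY=$LLM_API_KEY \
        -e WORKSPACE_MOUNT_PATH=$WORKSPACE_BASE \
        -v $WORKSPACE_BASE:/opt/workspace_base \
        -v /var/run/docker.sock:/var/run/docker.sock \
        -p 3000:3000 \
        --add-host host.docker.internal=host-gateway \
        ghcr.io/opendevin/opendevin:0.4.0
}
```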

enyst avatar Apr 28 '24 08:04 enyst

That worked for the passthrough of the variable. However, now it starts the task, but then immediately quits the server after Step 1.

INFO:     10.30.10.6:58371 - "GET /api/litellm-models HTTP/1.1" 200 OK
INFO:     10.30.10.6:58373 - "GET /api/messages/total HTTP/1.1" 200 OK
INFO:     10.30.10.6:58373 - "GET /api/agents HTTP/1.1" 200 OK

============== STEP 0

15:21:43 - PLAN
Can you write me a python scrip that will print numbers 0-100
15:21:49 - ACTION
CmdRunAction(command='ls', background=False, action='run')
15:21:49 - OBSERVATION

============== STEP 1

15:21:49 - PLAN
Can you write me a python scrip that will print numbers 0-100

robertherbaugh avatar Apr 28 '24 15:04 robertherbaugh

I'm not sure why it would hang on such a task. I just tried the same prompt with the same version, 0.4.0, and GPT-3.5, and it worked in 3 steps, including the initial `ls` and the final `action=finish`. Did docker quit? Can you inspect the container? Otherwise, can you restart it and see if it hangs again?

Edited to add: also, what operating system are you running on?

enyst avatar Apr 28 '24 23:04 enyst

This PR fixes the app hang when an exception is thrown during agent_controller step execution: https://github.com/OpenDevin/OpenDevin/pull/1445 @enyst
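(Editor's note: for context, the general shape of such a fix, as an illustration rather than the actual diff in that PR, is to catch exceptions raised inside the step loop and surface them as a reported failure instead of letting the task die silently, which is what made the app appear to hang:)

```python
import asyncio

async def run_loop(step, max_steps=3):
    """Sketch of an agent step loop (hypothetical, simplified). `step` is an
    async callable returning True when the task is finished. Exceptions are
    caught and reported so the loop terminates visibly instead of hanging."""
    for i in range(max_steps):
        try:
            finished = await step(i)
        except Exception as exc:
            print(f"step {i} failed: {exc}")  # report the error, end the task
            return False
        if finished:
            return True
    return False
```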

assertion avatar Apr 29 '24 09:04 assertion

Thank you @assertion ! ❤️

The fix is on main. @robertherbaugh, if you wish to try it, it should behave more reasonably.

enyst avatar Apr 29 '24 21:04 enyst

This issue is stale because it has been open for 30 days with no activity. Remove stale label or comment or this will be closed in 7 days.

github-actions[bot] avatar May 30 '24 01:05 github-actions[bot]

This issue was closed because it has been stalled for over 30 days with no activity.

github-actions[bot] avatar Jun 06 '24 01:06 github-actions[bot]