
[Bug]: No response in EventStreamRuntime

Open tjb-tech opened this issue 1 year ago • 7 comments

Is there an existing issue for the same bug?

  • [X] I have checked the troubleshooting document at https://docs.all-hands.dev/modules/usage/troubleshooting
  • [X] I have checked the existing issues.

Describe the bug

I use the following bash script:

export DEBUG=1 
poetry run python ./openhands/core/main.py \
        -i 1 \
        -t "帮我写一个multi-head attent" \
        -c CodeActAgent \
        -l llm.gpt4omini

but I got the following error:

INFO:openhands:DEBUG mode enabled.
19:25:18 - openhands:DEBUG: logger.py:173 - Logging initialized
19:25:18 - openhands:INFO: logger.py:185 - Logging to file in: /home/tjb/llm/OpenHands/logs
19:25:21 - openhands:INFO: config.py:522 - Attempt to load default LLM config from config toml
19:25:21 - openhands:INFO: config.py:532 - Attempt to load group gpt4omini from config toml as llm config
19:25:21 - openhands:INFO: config.py:644 - Loading llm config from gpt4omini
19:25:21 - openhands:INFO: main.py:74 - Initializing runtime: <class 'openhands.runtime.client.runtime.EventStreamRuntime'>
19:25:21 - openhands:DEBUG: runtime.py:128 - EventStreamRuntime `default_73bafe99c3ba3105` config:
AppConfig(llms={'llm': LLMConfig(model='gpt-4o-mini', api_key='******', base_url=None, api_version=None, embedding_model='openai', embedding_base_url=None, embedding_deployment_name=None, aws_access_key_id='******', aws_secret_access_key='******', aws_region_name=None, num_retries=10, retry_multiplier=2, retry_min_wait=3, retry_max_wait=300, timeout=None, max_message_chars=10000, temperature=0, top_p=0.5, custom_llm_provider=None, max_input_tokens=None, max_output_tokens=None, input_cost_per_token=None, output_cost_per_token=None, ollama_base_url=None, drop_params=None), 'gpt4omini': LLMConfig(model='gpt-4o-mini', api_key='******', base_url=None, api_version=None, embedding_model='openai', embedding_base_url=None, embedding_deployment_name=None, aws_access_key_id='******', aws_secret_access_key='******', aws_region_name=None, num_retries=10, retry_multiplier=2, retry_min_wait=3, retry_max_wait=300, timeout=None, max_message_chars=10000, temperature=0, top_p=0.5, custom_llm_provider=None, max_input_tokens=None, max_output_tokens=None, input_cost_per_token=None, output_cost_per_token=None, ollama_base_url=None, drop_params=None)}, agents={'agent': AgentConfig(micro_agent_name=None, memory_enabled=False, memory_max_threads=2, llm_config=None)}, default_agent='CodeActAgent', sandbox=SandboxConfig(api_hostname='localhost', api_key='******', base_container_image='nikolaik/python-nodejs:python3.11-nodejs22', runtime_container_image=None, user_id=1011, timeout=120, enable_auto_lint=False, use_host_network=False, initialize_plugins=True, runtime_extra_deps=None, runtime_startup_env_vars={}, browsergym_eval_env=None), security=SecurityConfig(confirmation_mode=False, security_analyzer=None), runtime='eventstream', file_store='memory', file_store_path='/tmp/file_store', workspace_base='/home/tjb/llm/OpenHands/workspace_tjb', workspace_mount_path='/home/tjb/llm/OpenHands/workspace_tjb', workspace_mount_path_in_sandbox='/workspace', workspace_mount_rewrite=None, 
cache_dir='/tmp/cache', run_as_openhands=True, max_iterations=1, max_budget_per_task=None, e2b_api_key='******', disable_color=False, jwt_secret='******', debug=True, enable_cli_session=False, file_uploads_max_file_size_mb=0, file_uploads_restrict_file_types=False, file_uploads_allowed_extensions=['.*']
19:25:21 - openhands:INFO: runtime_build.py:42 - Using project root: /home/tjb/llm/OpenHands
19:25:23 - openhands:INFO: runtime_build.py:59 - * Creating isolated environment: venv+pip...
* Installing packages in isolated environment:
  - poetry-core
* Getting build dependencies for sdist...
* Building sdist...
Successfully built openhands_ai-0.9.1.tar.gz

19:25:23 - openhands:INFO: runtime_build.py:71 - Source distribution created at /tmp/tmpj369ywr1/openhands_ai-0.9.1.tar.gz
19:25:23 - openhands:INFO: runtime_build.py:82 - Unpacked source code directory: /tmp/tmpj369ywr1/code
19:25:23 - openhands:DEBUG: runtime_build.py:141 - ===== Dockerfile content start =====

# ================================================================
# START: Build Runtime Image from Scratch
# ================================================================
FROM nikolaik/python-nodejs:python3.11-nodejs22





# Install necessary packages and clean up in one layer
RUN apt-get update && \
    apt-get install -y wget sudo apt-utils libgl1-mesa-glx libasound2-plugins git && \
    apt-get clean && \
    rm -rf /var/lib/apt/lists/*

# Create necessary directories
RUN mkdir -p /openhands && \
    mkdir -p /openhands/logs && \
    mkdir -p /openhands/poetry

ENV POETRY_VIRTUALENVS_PATH=/openhands/poetry

RUN if [ ! -d /openhands/miniforge3 ]; then \
    wget --progress=bar:force -O Miniforge3.sh "https://github.com/conda-forge/miniforge/releases/latest/download/Miniforge3-$(uname)-$(uname -m).sh" && \
    bash Miniforge3.sh -b -p /openhands/miniforge3 && \
    rm Miniforge3.sh && \
    chmod -R g+w /openhands/miniforge3 && \
    bash -c ". /openhands/miniforge3/etc/profile.d/conda.sh && conda config --set changeps1 False && conda config --append channels conda-forge"; \
    fi

# Install Python and Poetry
RUN /openhands/miniforge3/bin/mamba install conda-forge::poetry python=3.11 -y
# ================================================================
# END: Build Runtime Image from Scratch
# ================================================================


# ================================================================
# START: Copy Project and Install/Update Dependencies
# ================================================================
RUN if [ -d /openhands/code ]; then rm -rf /openhands/code; fi
COPY ./code /openhands/code

# Install/Update Dependencies
# 1. Install pyproject.toml via poetry
# 2. Install playwright and chromium
# 3. Clear poetry, apt, mamba caches
RUN cd /openhands/code && \
    /openhands/miniforge3/bin/mamba run -n base poetry env use python3.11 && \
    /openhands/miniforge3/bin/mamba run -n base poetry install --only main,runtime --no-interaction --no-root && \
    apt-get update && \
    /openhands/miniforge3/bin/mamba run -n base poetry run pip install playwright && \
    /openhands/miniforge3/bin/mamba run -n base poetry run playwright install --with-deps chromium && \
    export OD_INTERPRETER_PATH=$(/openhands/miniforge3/bin/mamba run -n base poetry run python -c "import sys; print(sys.executable)") && \
      \
    /openhands/miniforge3/bin/mamba run -n base poetry cache clear --all . && \
    chmod -R g+rws /openhands/poetry &&  \
    apt-get clean && rm -rf /var/lib/apt/lists/* /tmp/* /var/tmp/* && \
    /openhands/miniforge3/bin/mamba clean --all

# ================================================================
# END: Copy Project and Install/Update Dependencies
# ================================================================
===== Dockerfile content end =====
19:25:23 - openhands:INFO: runtime_build.py:153 - Input base image: nikolaik/python-nodejs:python3.11-nodejs22
Skip init: False
Extra deps: None
Hash for docker build directory [/tmp/tmpj369ywr1] (contents: ['code', 'Dockerfile']): 53c2605abbf69898b4875d2377494ab7

19:25:23 - openhands:INFO: docker.py:72 - Checking, if image ghcr.io/all-hands-ai/runtime:53c2605abbf69898b4875d2377494ab7 exists locally.
19:25:23 - openhands:INFO: docker.py:74 - Image ghcr.io/all-hands-ai/runtime:53c2605abbf69898b4875d2377494ab7 found locally.
19:25:23 - openhands:INFO: runtime_build.py:241 - Image [ghcr.io/all-hands-ai/runtime:53c2605abbf69898b4875d2377494ab7] already exists so we will reuse it.
19:25:23 - openhands:INFO: runtime.py:183 - Starting container with image: ghcr.io/all-hands-ai/runtime:53c2605abbf69898b4875d2377494ab7 and name: openhands-sandbox-default_73bafe99c3ba3105_b0ac626b-6ba1-4967-8066-d5b7fe1491d3
19:25:23 - openhands:INFO: runtime.py:204 - Mount dir: /workspace
19:25:24 - openhands:INFO: runtime.py:238 - Container started. Server url: http://localhost:39013
19:25:24 - openhands:DEBUG: runtime.py:70 - Runtime `default_73bafe99c3ba3105` config:
AppConfig(llms={'llm': LLMConfig(model='gpt-4o-mini', api_key='******', base_url=None, api_version=None, embedding_model='openai', embedding_base_url=None, embedding_deployment_name=None, aws_access_key_id='******', aws_secret_access_key='******', aws_region_name=None, num_retries=10, retry_multiplier=2, retry_min_wait=3, retry_max_wait=300, timeout=None, max_message_chars=10000, temperature=0, top_p=0.5, custom_llm_provider=None, max_input_tokens=None, max_output_tokens=None, input_cost_per_token=None, output_cost_per_token=None, ollama_base_url=None, drop_params=None), 'gpt4omini': LLMConfig(model='gpt-4o-mini', api_key='******', base_url=None, api_version=None, embedding_model='openai', embedding_base_url=None, embedding_deployment_name=None, aws_access_key_id='******', aws_secret_access_key='******', aws_region_name=None, num_retries=10, retry_multiplier=2, retry_min_wait=3, retry_max_wait=300, timeout=None, max_message_chars=10000, temperature=0, top_p=0.5, custom_llm_provider=None, max_input_tokens=None, max_output_tokens=None, input_cost_per_token=None, output_cost_per_token=None, ollama_base_url=None, drop_params=None)}, agents={'agent': AgentConfig(micro_agent_name=None, memory_enabled=False, memory_max_threads=2, llm_config=None)}, default_agent='CodeActAgent', sandbox=SandboxConfig(api_hostname='localhost', api_key='******', base_container_image='nikolaik/python-nodejs:python3.11-nodejs22', runtime_container_image=None, user_id=1011, timeout=120, enable_auto_lint=False, use_host_network=False, initialize_plugins=True, runtime_extra_deps=None, runtime_startup_env_vars={}, browsergym_eval_env=None), security=SecurityConfig(confirmation_mode=False, security_analyzer=None), runtime='eventstream', file_store='memory', file_store_path='/tmp/file_store', workspace_base='/home/tjb/llm/OpenHands/workspace_tjb', workspace_mount_path='/home/tjb/llm/OpenHands/workspace_tjb', workspace_mount_path_in_sandbox='/workspace', workspace_mount_rewrite=None, 
cache_dir='/tmp/cache', run_as_openhands=True, max_iterations=1, max_budget_per_task=None, e2b_api_key='******', disable_color=False, jwt_secret='******', debug=True, enable_cli_session=False, file_uploads_max_file_size_mb=0, file_uploads_restrict_file_types=False, file_uploads_allowed_extensions=['.*']
19:25:24 - openhands:INFO: runtime.py:157 - Container initialized with plugins: ['agent_skills', 'jupyter']
19:25:24 - openhands:INFO: runtime.py:160 - Container initialized with env vars: None
19:25:24 - openhands:INFO: main.py:149 - Agent Controller Initialized: Running agent CodeActAgent, model gpt-4o-mini, with task: "帮我写一个multi-head attention 模块"
19:25:24 - openhands:DEBUG: stream.py:134 - Adding MessageAction id=0 from USER
19:25:24 - openhands:INFO: agent_controller.py:150 - [Agent Controller default] Starting step loop...
19:25:24 - openhands:INFO: agent_controller.py:179
USER_ACTION
**MessageAction** (source=EventSource.USER)
CONTENT: 帮我写一个multi-head attention 模块
19:25:24 - openhands:DEBUG: agent_controller.py:238 - [Agent Controller default] Setting agent(CodeActAgent) state from AgentState.LOADING to AgentState.RUNNING
19:25:24 - openhands:DEBUG: stream.py:134 - Adding AgentStateChangedObservation id=1 from AGENT
19:25:24 - openhands:DEBUG: stream.py:134 - Adding NullObservation id=2 from USER


==============
CodeActAgent LEVEL 0 LOCAL STEP 0 GLOBAL STEP 0

19:25:25 - openhands:DEBUG: logger.py:238 - Logging to /home/tjb/llm/OpenHands/logs/llm/24-09-04_19-25/prompt_001.log
19:25:27 - openhands:DEBUG: logger.py:238 - Logging to /home/tjb/llm/OpenHands/logs/llm/24-09-04_19-25/response_001.log
19:25:27 - openhands:INFO: llm.py:477 - Cost: 0.00 USD | Accumulated Cost: 0.00 USD
Input tokens: 3942
Output tokens: 55

19:25:27 - openhands:DEBUG: stream.py:134 - Adding IPythonRunCellAction id=3 from AGENT
19:25:27 - openhands:INFO: agent_controller.py:489
ACTION
**IPythonRunCellAction**
THOUGHT: To create a multi-head attention module, I'll start by writing a Python file named `multi_head_attention.py`. This file will contain the implementation of the multi-head attention mechanism. 

Let's create the file first:
CODE:
create_file('multi_head_attention.py')
19:25:27 - openhands:INFO: runtime.py:360 - Awaiting session
19:25:27 - openhands:DEBUG: runtime.py:252 - Getting container logs...
19:25:27 - openhands:INFO: runtime.py:263 - 
------------------------------Container logs:------------------------------
    |INFO:openhands:DEBUG mode enabled.
    |11:25:26 - openhands:DEBUG: logger.py:173 - Logging initialized
    |11:25:26 - openhands:INFO: logger.py:185 - Logging to file in: /openhands/code/logs
------------------------------------------------------------------------------------------
19:25:29 - openhands:INFO: runtime.py:284 - 
------------------------------Container logs:------------------------------
    |Starting action execution API on port 39013
    |11:25:29 - openhands:INFO: client.py:692 - Starting action execution API on port 39013
    |INFO:     Started server process [15]
    |INFO:     Waiting for application startup.
    |11:25:29 - openhands:DEBUG: client.py:139 - Added sudoer successfully. Output: []
    |11:25:29 - openhands:DEBUG: client.py:155 - Added user openhands successfully with UID 1011. Output: []
------------------------------------------------------------------------------------------
19:25:30 - openhands:INFO: runtime.py:284 - 
------------------------------Container logs:------------------------------
    |11:25:29 - openhands:DEBUG: client.py:192 - Bash initialized. Working directory: /workspace. Output:
    |11:25:29 - openhands:INFO: browser_env.py:58 - Starting browser env...
    |INFO:openhands:DEBUG mode enabled.
    |11:25:30 - openhands:DEBUG: logger.py:173 - Logging initialized
    |11:25:30 - openhands:INFO: logger.py:185 - Logging to file in: /openhands/code/logs
------------------------------------------------------------------------------------------
19:25:34 - openhands:INFO: runtime.py:284 - 
------------------------------Container logs:------------------------------
    |11:25:34 - openhands:INFO: browser_env.py:101 - Browser env started.
    |11:25:34 - openhands:INFO: client.py:98 - Initializing plugin: agent_skills
    |11:25:34 - openhands:INFO: __init__.py:49 - Jupyter kernel gateway started at port 58479. Output:
------------------------------------------------------------------------------------------
http://localhost:39013
19:25:37 - openhands:ERROR: runtime.py:303 - Action execution API is not alive. Response: <Response [500]>
19:25:47 - openhands:DEBUG: runtime.py:252 - Getting container logs...
19:25:47 - openhands:INFO: runtime.py:263 - 
------------------------------Container logs:------------------------------
    |11:25:39 - openhands:INFO: client.py:98 - Initializing plugin: jupyter
    |11:25:39 - openhands:DEBUG: client.py:325 - /workspace != None -> reset Jupyter PWD
    |11:25:39 - openhands:DEBUG: client.py:331 - Changed working directory in IPython to: /workspace. Output: **IPythonRunCellObservation**
    |[Code executed successfully with no output]
    |11:25:40 - openhands:INFO: client.py:114 - AgentSkills initialized: **IPythonRunCellObservation**
    |[Code executed successfully with no output]
    |[Jupyter current working directory: /workspace]
    |[Jupyter Python interpreter: /openhands/poetry/openhands-ai-5O4_aCHf-py3.11/bin/python]
    |11:25:40 - openhands:INFO: client.py:197 - Initializing by running 1 bash commands...
    |11:25:40 - openhands:DEBUG: client.py:201 - Executing init command: git config --global user.name "openhands" && git config --global user.email "[email protected]" && alias git="git --no-pager"
    |11:25:40 - openhands:DEBUG: bash.py:37 - BASH PARSING command: git config --global user.name "openhands" && git config --global user.email "[email protected]" && alias git="git --no-pager"
    |11:25:40 - openhands:DEBUG: bash.py:44 - BASH PARSING remaining:
    |11:25:40 - openhands:DEBUG: client.py:247 - Executing command: git config --global user.name "openhands" && git config --global user.email "[email protected]" && alias git="git --no-pager"
    |11:25:40 - openhands:DEBUG: client.py:256 - Executing command for exit code: git config --global user.name "openhands" && git config --global user.email "[email protected]" && alias git="git --no-pager"
    |11:25:40 - openhands:DEBUG: client.py:259 - Exit code Output: 0
    |
    |11:25:40 - openhands:DEBUG: client.py:277 - Command output:
    |[Python Interpreter: /openhands/poetry/openhands-ai-5O4_aCHf-py3.11/bin/python]
    |openhands@43065a282ea3:/workspace $
    |11:25:40 - openhands:DEBUG: client.py:203 - Init command outputs (exit code: 0):
    |[Python Interpreter: /openhands/poetry/openhands-ai-5O4_aCHf-py3.11/bin/python]
    |openhands@43065a282ea3:/workspace $
    |11:25:40 - openhands:INFO: client.py:208 - Bash init commands completed
    |11:25:40 - openhands:INFO: client.py:485 - Runtime client initialized.
    |INFO:     Application startup complete.
    |INFO:     Uvicorn running on http://0.0.0.0:39013 (Press CTRL+C to quit)
------------------------------------------------------------------------------------------
http://localhost:39013
19:25:47 - openhands:ERROR: runtime.py:303 - Action execution API is not alive. Response: <Response [500]>
19:25:57 - openhands:DEBUG: runtime.py:252 - Getting container logs...
19:25:57 - openhands:INFO: runtime.py:263 - 
------------------------------Container logs:------------------------------
    |INFO:     172.17.0.1:59112 - "GET / HTTP/1.1" 404 Not Found
    |INFO:     172.17.0.1:59112 - "GET /favicon.ico HTTP/1.1" 404 Not Found
------------------------------------------------------------------------------------------
http://localhost:39013
19:25:57 - openhands:ERROR: runtime.py:303 - Action execution API is not alive. Response: <Response [500]>

When I open http://localhost:39013, the page shows:

{"detail":"Not Found"}

Current OpenHands version

0.9.1

Installation and Configuration

poetry run python ./openhands/core/main.py \
        -i 1 \
        -t "帮我写一个multi-head attent" \
        -c CodeActAgent \
        -l llm.gpt4omini

Model and Agent

  • Model: gpt-4o-mini
  • Agent: CodeActAgent

Operating System

Linux

Reproduction Steps

No response

Logs, Errors, Screenshots, and Additional Context

No response

tjb-tech avatar Sep 04 '24 11:09 tjb-tech

Could you please help me? @xingyaoww @shubhamofbce

tjb-tech avatar Sep 04 '24 11:09 tjb-tech

Can you try to curl http://localhost:39013/alive to see if it is working?

I open http://localhost:39013/, and the page shows that: {"detail":"Not Found"}

This suggests that the runtime API is working and accessible from your host machine. Can you wait one more minute to see if it connects automatically (the retry logic should eventually reach it)? If not, can you run docker ps to check your setup, paying particular attention to the port mapping of the runtime container?
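The reasoning above can be demonstrated with a small stdlib sketch (the local server here is a stand-in, not the actual runtime): an HTTP error response such as 404 proves the server is reachable, whereas a dead or unmapped port raises a connection error instead.

```python
# Demo: a 404 response means the server is up, just with no route at `/`.
# This local stand-in mimics the runtime answering `{"detail":"Not Found"}`.
import http.server
import threading
import urllib.error
import urllib.request


class NoRootRoute(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_error(404, 'Not Found')

    def log_message(self, *args):
        pass  # keep the demo output quiet


server = http.server.HTTPServer(('127.0.0.1', 0), NoRootRoute)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

try:
    urllib.request.urlopen(f'http://127.0.0.1:{port}/')
except urllib.error.HTTPError as e:
    # The server answered: it is alive, even though the path is unknown.
    print('server is alive, responded with status', e.code)  # status 404
finally:
    server.shutdown()
```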

xingyaoww avatar Sep 04 '24 15:09 xingyaoww

Can you try to curl http://localhost:39013/alive to see if it is working?

I open http://localhost:39013/, and the page shows that: {"detail":"Not Found"}

This suggests that the runtime API is working and accessible from your host machine. Can you wait one more minute to see if it connects automatically (the retry logic should eventually reach it)? If not, can you run docker ps to check your setup, paying particular attention to the port mapping of the runtime container?

I waited, and eventually got the following error:

File "/Users/tangjiabin/Library/Caches/pypoetry/virtualenvs/openhands-ai--lTyJ5OC-py3.11/lib/python3.11/site-packages/tenacity/__init__.py", line 419, in exc_check
    raise retry_exc from fut.exception()
tenacity.RetryError: RetryError[<Future at 0x2861bb110 state=finished raised RuntimeError>]
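For context, tenacity.RetryError means the wrapped health check kept failing until the retry budget ran out. A minimal stdlib sketch of that behaviour (names here are illustrative, not the actual OpenHands code):

```python
# Illustrative sketch of tenacity-style retries: poll a health check and
# raise RetryError once the attempt budget is exhausted.
class RetryError(Exception):
    """Stand-in for tenacity.RetryError."""


def wait_until_alive(check, attempts=3):
    last_exc = None
    for _ in range(attempts):
        try:
            if check():
                return True
        except RuntimeError as exc:  # e.g. the repeated 500 from /alive above
            last_exc = exc
    raise RetryError('health check never succeeded') from last_exc


def proxied_alive_check():
    # Mimics the proxied request that keeps coming back as a 500.
    raise RuntimeError('Action execution API is not alive. Response: <Response [500]>')


try:
    wait_until_alive(proxied_alive_check)
except RetryError as e:
    print('gave up after retries:', e)
```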

tjb-tech avatar Sep 05 '24 07:09 tjb-tech


It seems the error occurs because I set http_proxy and https_proxy to access OpenAI:

export http_proxy="http://127.0.0.1:7890"
export https_proxy="http://127.0.0.1:7890"
export DEBUG=1 
poetry run python ./openhands/core/main.py \
        -i 1 \
        -t "帮我写一个multi-head attent" \
        -c CodeActAgent \
        -l llm.gpt4omini

When I don't set the proxy and use DeepSeek instead, another error occurs:

15:57:56 - openhands:INFO: runtime.py:263 - 
------------------------------Container logs:------------------------------
    |INFO:     192.168.65.1:64346 - "GET /alive HTTP/1.1" 200 OK
    |INFO:     192.168.65.1:64346 - "POST /execute_action HTTP/1.1" 200 OK
------------------------------------------------------------------------------------------
15:57:57 - OBSERVATION
**IPythonRunCellObservation**
[Code executed successfully with no output]
[Jupyter current working directory: /workspace]
[Jupyter Python interpreter: /openhands/poetry/openhands-ai-5O4_aCHf-py3.11/bin/python]

May I ask what the DeepSeek error is, and how to set a proxy for accessing OpenAI while using OpenHands? @xingyaoww

tjb-tech avatar Sep 05 '24 08:09 tjb-tech

Ah, I see - your HTTP/HTTPS proxy also affects the requests sent to the runtime client. I don't think we support setting an HTTP/HTTPS proxy for only LLM calls yet.

I think either you can:

  • Set a routing rule in your proxy so that requests sent to localhost are not proxied (then requests to the runtime container won't be intercepted).
  • Figure out a way to set the proxy directly in llm.py so that only LLM calls are proxied: https://github.com/All-Hands-AI/OpenHands/blob/main/openhands/llm/llm.py. For example, you could add an https_proxy field under LLMConfig and use it inside the LLM class.
  • Set up a LiteLLM Proxy behind your HTTP proxy and have OpenHands connect directly to that LiteLLM Proxy without the HTTPS proxy.

xingyaoww avatar Sep 05 '24 13:09 xingyaoww

Ah, I see - your HTTP/HTTPS proxy also affects the requests sent to the runtime client. I don't think we support setting an HTTP/HTTPS proxy for only LLM calls yet.

I think either you can:

  • Set a routing rule in your proxy so that requests sent to localhost are not proxied (then requests to the runtime container won't be intercepted).
  • Figure out a way to set the proxy directly in llm.py so that only LLM calls are proxied: https://github.com/All-Hands-AI/OpenHands/blob/main/openhands/llm/llm.py. For example, you could add an https_proxy field under LLMConfig and use it inside the LLM class.
  • Set up a LiteLLM Proxy behind your HTTP proxy and have OpenHands connect directly to that LiteLLM Proxy without the HTTPS proxy.

Thank you very much! By the way, do you know why deepseek-chat cannot follow the expected instruction format? It seems powerful enough.

tjb-tech avatar Sep 05 '24 13:09 tjb-tech

This issue is stale because it has been open for 30 days with no activity. Remove stale label or comment or this will be closed in 7 days.

github-actions[bot] avatar Oct 07 '24 02:10 github-actions[bot]

This issue was closed because it has been stalled for over 30 days with no activity.

github-actions[bot] avatar Oct 15 '24 01:10 github-actions[bot]