[Bug]: Stuck at "Waiting for client to become ready..."
### Is there an existing issue for the same bug?
- [X] I have checked the existing issues.
### Describe the bug and reproduction steps
When I run `make docker`, `make build`, and `make run`, it shows "Waiting for client to become ready..." and does not continue.
### OpenHands Installation
Docker command in README
### OpenHands Version
No response
### Operating System
None
### Logs, Errors, Screenshots, and Additional Context
INFO: 172.18.0.1:38276 - "POST /api/conversations HTTP/1.1" 200 OK
INFO: ('172.18.0.1', 38284) - "WebSocket /socket.io/?latest_event_id=-1&conversation_id=a989d29db53f479db2eafaafb461c76d&EIO=4&transport=websocket" [accepted]
INFO: 172.18.0.1:38276 - "GET /api/options/config HTTP/1.1" 200 OK
07:31:30 - openhands:INFO: listen_socket.py:26 - sio:connect: 7r8dqN7ApDcuNCJuAAAF
07:31:30 - openhands:INFO: manager.py:209 - join_conversation:a989d29db53f479db2eafaafb461c76d:7r8dqN7ApDcuNCJuAAAF
07:31:30 - openhands:INFO: manager.py:364 - _get_event_stream:a989d29db53f479db2eafaafb461c76d
07:31:30 - openhands:INFO: manager.py:367 - found_local_agent_loop:a989d29db53f479db2eafaafb461c76d
07:31:31 - openhands:INFO: docker_runtime.py:133 - [runtime a989d29db53f479db2eafaafb461c76d] Starting runtime with image: ghcr.io/all-hands-ai/runtime:0.17-nikolaik
07:31:31 - openhands:INFO: docker_runtime.py:137 - [runtime a989d29db53f479db2eafaafb461c76d] Container started: openhands-runtime-a989d29db53f479db2eafaafb461c76d. VSCode URL: None
07:31:31 - openhands:INFO: docker_runtime.py:145 - [runtime a989d29db53f479db2eafaafb461c76d] Waiting for client to become ready at http://localhost:35402...
INFO: 172.18.0.1:38298 - "GET /api/settings HTTP/1.1" 200 OK
INFO: 172.18.0.1:38308 - "GET /api/options/config HTTP/1.1" 200 OK
07:31:45 - openhands:INFO: manager.py:434 - _cleanup_session:a989d29db53f479db2eafaafb461c76d
INFO: 172.18.0.1:53872 - "GET /api/options/config HTTP/1.1" 200 OK
INFO: 172.18.0.1:53874 - "GET /api/settings HTTP/1.1" 200 OK
INFO: 172.18.0.1:53900 - "GET /api/options/config HTTP/1.1" 200 OK
INFO: 172.18.0.1:53890 - "GET /api/settings HTTP/1.1" 200 OK
INFO: 172.18.0.1:42306 - "GET /api/options/config HTTP/1.1" 200 OK
INFO: 172.18.0.1:42298 - "GET /api/settings HTTP/1.1" 200 OK
^CINFO: Shutting down
07:32:06 - openhands:INFO: listen_socket.py:99 - sio:disconnect:7r8dqN7ApDcuNCJuAAAF
07:32:06 - openhands:INFO: manager.py:413 - disconnect_from_session:7r8dqN7ApDcuNCJuAAAF:a989d29db53f479db2eafaafb461c76d
07:32:06 - openhands:INFO: manager.py:458 - _close_session:a989d29db53f479db2eafaafb461c76d
07:32:06 - openhands:INFO: manager.py:466 - removing connections: []
07:32:06 - openhands:INFO: manager.py:475 - closing_session:a989d29db53f479db2eafaafb461c76d
INFO: Waiting for background tasks to complete. (CTRL+C to force quit)
07:32:08 - openhands:ERROR: session.py:116 - Error creating controller: HTTPConnectionPool(host='localhost', port=36361): Max retries exceeded with url: /alive (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f8acde96930>: Failed to establish a new connection: [Errno 111] Connection refused'))
Traceback (most recent call last):
  File "/root/.cache/pypoetry/virtualenvs/openhands-ai-9TtSrW0h-py3.12/lib/python3.12/site-packages/urllib3/connection.py", line 199, in _new_conn
    sock = connection.create_connection(
  File "/root/.cache/pypoetry/virtualenvs/openhands-ai-9TtSrW0h-py3.12/lib/python3.12/site-packages/urllib3/util/connection.py", line 85, in create_connection
    raise err
  File "/root/.cache/pypoetry/virtualenvs/openhands-ai-9TtSrW0h-py3.12/lib/python3.12/site-packages/urllib3/util/connection.py", line 73, in create_connection
    sock.connect(sa)
ConnectionRefusedError: [Errno 111] Connection refused
The above exception was the direct cause of the following exception:
I'm guessing you are running the main branch and running make build and make run?
What OS are you running?
I'm also experiencing this
Docker command:
docker run \
    -p 3000:3000 \
    -v /var/run/docker.sock:/var/run/docker.sock \
    -v C:\repositories\extensions\youtube-time-manager\:/opt/workspace_base \
    -v ~/.openhands-state:/.openhands-state \
    --env DEBUG=1 \
    --env LOG_ALL_EVENTS=true \
    --env SANDBOX_RUNTIME_CONTAINER_IMAGE=docker.all-hands.dev/all-hands-ai/runtime:main-nikolaik \
    --env WORKSPACE_MOUNT_PATH=C:\repositories\extensions\youtube-time-manager \
    --name open-hands \
    --pull always \
    -t \
    -i \
    --add-host host.docker.internal:host-gateway \
    --rm \
    docker.all-hands.dev/all-hands-ai/openhands:main
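When the app sits on "Waiting for client to become ready...", it can help to check whether the runtime container's action-server port is reachable at all from where OpenHands runs. This is a minimal stdlib sketch of such a check, not OpenHands code; the host and port are whatever the "Waiting for client to become ready at http://..." log line printed:

```python
import socket


def port_open(host: str, port: int, timeout: float = 3.0) -> bool:
    """One-shot reachability check for the runtime's action-server port.

    Returns True if a TCP connection succeeds, False on refusal/timeout.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Connection refused, host unreachable, or timed out.
        return False
```

If this returns False long after the runtime container reports "Container started", the problem is usually port mapping or host-gateway resolution rather than a slow startup.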
I think the relevant part of the Docker log is:
ERROR: Exception in ASGI application
2025-01-02T13:58:50.774356780Z + Exception Group Traceback (most recent call last):
2025-01-02T13:58:50.774364923Z | File "/app/.venv/lib/python3.12/site-packages/starlette/_utils.py", line 76, in collapse_excgroups
2025-01-02T13:58:50.774367938Z | yield
2025-01-02T13:58:50.774370653Z | File "/app/.venv/lib/python3.12/site-packages/starlette/middleware/base.py", line 186, in __call__
2025-01-02T13:58:50.774373266Z | async with anyio.create_task_group() as task_group:
2025-01-02T13:58:50.774375780Z | File "/app/.venv/lib/python3.12/site-packages/anyio/_backends/_asyncio.py", line 763, in __aexit__
2025-01-02T13:58:50.774378796Z | raise BaseExceptionGroup(
2025-01-02T13:58:50.774381208Z | ExceptionGroup: unhandled errors in a TaskGroup (1 sub-exception)
2025-01-02T13:58:50.774383722Z +-+---------------- 1 ----------------
2025-01-02T13:58:50.774386134Z | Traceback (most recent call last):
2025-01-02T13:58:50.774388648Z | File "/app/.venv/lib/python3.12/site-packages/uvicorn/protocols/http/h11_impl.py", line 403, in run_asgi
2025-01-02T13:58:50.774391261Z | result = await app( # type: ignore[func-returns-value]
2025-01-02T13:58:50.774393674Z | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2025-01-02T13:58:50.774407949Z | File "/app/.venv/lib/python3.12/site-packages/uvicorn/middleware/proxy_headers.py", line 60, in __call__
2025-01-02T13:58:50.774411066Z | return await self.app(scope, receive, send)
2025-01-02T13:58:50.774413478Z | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2025-01-02T13:58:50.774415992Z | File "/app/.venv/lib/python3.12/site-packages/engineio/async_drivers/asgi.py", line 75, in __call__
2025-01-02T13:58:50.774418605Z | await self.other_asgi_app(scope, receive, send)
2025-01-02T13:58:50.774421119Z | File "/app/.venv/lib/python3.12/site-packages/fastapi/applications.py", line 1054, in __call__
2025-01-02T13:58:50.774425642Z | await super().__call__(scope, receive, send)
2025-01-02T13:58:50.774428055Z | File "/app/.venv/lib/python3.12/site-packages/starlette/applications.py", line 113, in __call__
2025-01-02T13:58:50.774430568Z | await self.middleware_stack(scope, receive, send)
2025-01-02T13:58:50.774433383Z | File "/app/.venv/lib/python3.12/site-packages/starlette/middleware/errors.py", line 187, in __call__
2025-01-02T13:58:50.774436198Z | raise exc
2025-01-02T13:58:50.774438611Z | File "/app/.venv/lib/python3.12/site-packages/starlette/middleware/errors.py", line 165, in __call__
2025-01-02T13:58:50.774441124Z | await self.app(scope, receive, _send)
2025-01-02T13:58:50.774444844Z | File "/app/.venv/lib/python3.12/site-packages/starlette/middleware/base.py", line 185, in __call__
2025-01-02T13:58:50.774447558Z | with collapse_excgroups():
2025-01-02T13:58:50.774449971Z | File "/usr/local/lib/python3.12/contextlib.py", line 158, in __exit__
2025-01-02T13:58:50.774452484Z | self.gen.throw(value)
2025-01-02T13:58:50.774454897Z | File "/app/.venv/lib/python3.12/site-packages/starlette/_utils.py", line 82, in collapse_excgroups
2025-01-02T13:58:50.774457611Z | raise exc
2025-01-02T13:58:50.774460024Z | File "/app/.venv/lib/python3.12/site-packages/starlette/middleware/base.py", line 187, in __call__
2025-01-02T13:58:50.774462637Z | response = await self.dispatch_func(request, call_next)
2025-01-02T13:58:50.774465050Z | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2025-01-02T13:58:50.774467563Z | File "/app/openhands/server/middleware.py", line 157, in __call__
2025-01-02T13:58:50.774470077Z | response = await self._attach_conversation(request)
2025-01-02T13:58:50.774472489Z | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2025-01-02T13:58:50.774477315Z | File "/app/openhands/server/middleware.py", line 137, in _attach_conversation
2025-01-02T13:58:50.774479929Z | request.state.conversation = await session_manager.attach_to_conversation(
2025-01-02T13:58:50.774482341Z | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2025-01-02T13:58:50.774488071Z | File "/app/openhands/server/session/manager.py", line 197, in attach_to_conversation
2025-01-02T13:58:50.774490786Z | await c.connect()
2025-01-02T13:58:50.774493198Z | File "/app/openhands/server/session/conversation.py", line 43, in connect
2025-01-02T13:58:50.774495712Z | await self.runtime.connect()
2025-01-02T13:58:50.774498124Z | File "/app/openhands/runtime/impl/docker/docker_runtime.py", line 148, in connect
2025-01-02T13:58:50.774500638Z | await call_sync_from_async(self._wait_until_alive)
2025-01-02T13:58:50.774503050Z | File "/app/openhands/utils/async_utils.py", line 18, in call_sync_from_async
2025-01-02T13:58:50.774505564Z | result = await coro
2025-01-02T13:58:50.774508077Z | ^^^^^^^^^^
2025-01-02T13:58:50.774510389Z | File "/usr/local/lib/python3.12/concurrent/futures/thread.py", line 58, in run
2025-01-02T13:58:50.774513003Z | result = self.fn(*self.args, **self.kwargs)
2025-01-02T13:58:50.774515416Z | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2025-01-02T13:58:50.774517828Z | File "/app/openhands/utils/async_utils.py", line 17, in <lambda>
2025-01-02T13:58:50.774520543Z | coro = loop.run_in_executor(None, lambda: fn(*args, **kwargs))
2025-01-02T13:58:50.774523056Z | ^^^^^^^^^^^^^^^^^^^
2025-01-02T13:58:50.774525770Z | File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 336, in wrapped_f
2025-01-02T13:58:50.774528384Z | return copy(f, *args, **kw)
2025-01-02T13:58:50.774530897Z | ^^^^^^^^^^^^^^^^^^^^
2025-01-02T13:58:50.774533310Z | File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 475, in __call__
2025-01-02T13:58:50.774535823Z | do = self.iter(retry_state=retry_state)
2025-01-02T13:58:50.774538236Z | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2025-01-02T13:58:50.774540648Z | File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 376, in iter
2025-01-02T13:58:50.774543162Z | result = action(retry_state)
2025-01-02T13:58:50.774545574Z | ^^^^^^^^^^^^^^^^^^^
2025-01-02T13:58:50.774547987Z | File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 418, in exc_check
2025-01-02T13:58:50.774550500Z | raise retry_exc.reraise()
2025-01-02T13:58:50.774552913Z | ^^^^^^^^^^^^^^^^^^^
2025-01-02T13:58:50.774555326Z | File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 185, in reraise
2025-01-02T13:58:50.774557839Z | raise self.last_attempt.result()
2025-01-02T13:58:50.774560252Z | ^^^^^^^^^^^^^^^^^^^^^^^^^^
2025-01-02T13:58:50.774562664Z | File "/usr/local/lib/python3.12/concurrent/futures/_base.py", line 449, in result
2025-01-02T13:58:50.774581866Z | return self.__get_result()
2025-01-02T13:58:50.774586993Z | ^^^^^^^^^^^^^^^^^^^
2025-01-02T13:58:50.774589506Z | File "/usr/local/lib/python3.12/concurrent/futures/_base.py", line 401, in __get_result
2025-01-02T13:58:50.774592019Z | raise self._exception
2025-01-02T13:58:50.774594432Z | File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 478, in __call__
2025-01-02T13:58:50.774597046Z | result = fn(*args, **kwargs)
2025-01-02T13:58:50.774599659Z | ^^^^^^^^^^^^^^^^^^^
2025-01-02T13:58:50.774602072Z | File "/app/openhands/runtime/impl/docker/docker_runtime.py", line 331, in _wait_until_alive
2025-01-02T13:58:50.774604686Z | self.check_if_alive()
2025-01-02T13:58:50.774607099Z | File "/app/openhands/runtime/impl/action_execution/action_execution_client.py", line 99, in check_if_alive
2025-01-02T13:58:50.774609612Z | with self._send_action_server_request(
2025-01-02T13:58:50.774612025Z | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2025-01-02T13:58:50.774614638Z | File "/app/openhands/runtime/impl/action_execution/action_execution_client.py", line 96, in _send_action_server_request
2025-01-02T13:58:50.774617353Z | return send_request(self.session, method, url, **kwargs)
2025-01-02T13:58:50.774619966Z | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2025-01-02T13:58:50.774622480Z | File "/app/openhands/runtime/utils/request.py", line 28, in send_request
2025-01-02T13:58:50.774625797Z | response = session.request(method, url, timeout=timeout, **kwargs)
2025-01-02T13:58:50.774628411Z | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2025-01-02T13:58:50.774630924Z | File "/app/.venv/lib/python3.12/site-packages/requests/sessions.py", line 589, in request
2025-01-02T13:58:50.774633538Z | resp = self.send(prep, **send_kwargs)
2025-01-02T13:58:50.774635951Z | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2025-01-02T13:58:50.774638263Z | File "/app/.venv/lib/python3.12/site-packages/requests/sessions.py", line 703, in send
2025-01-02T13:58:50.774640877Z | r = adapter.send(request, **kwargs)
2025-01-02T13:58:50.774643189Z | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2025-01-02T13:58:50.774646004Z | File "/app/.venv/lib/python3.12/site-packages/requests/adapters.py", line 682, in send
2025-01-02T13:58:50.774648517Z | raise ConnectionError(err, request=request)
2025-01-02T13:58:50.774651030Z | requests.exceptions.ConnectionError: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response'))
2025-01-02T13:58:50.774653744Z +------------------------------------
2025-01-02T13:58:50.774656157Z
2025-01-02T13:58:50.774662390Z During handling of the above exception, another exception occurred:
2025-01-02T13:58:50.774665004Z
2025-01-02T13:58:50.774667416Z Traceback (most recent call last):
2025-01-02T13:58:50.774669829Z File "/app/.venv/lib/python3.12/site-packages/uvicorn/protocols/http/h11_impl.py", line 403, in run_asgi
2025-01-02T13:58:50.774672443Z result = await app( # type: ignore[func-returns-value]
2025-01-02T13:58:50.774674856Z ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2025-01-02T13:58:50.774677369Z File "/app/.venv/lib/python3.12/site-packages/uvicorn/middleware/proxy_headers.py", line 60, in __call__
2025-01-02T13:58:50.774679882Z return await self.app(scope, receive, send)
2025-01-02T13:58:50.774682295Z ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2025-01-02T13:58:50.774684708Z File "/app/.venv/lib/python3.12/site-packages/engineio/async_drivers/asgi.py", line 75, in __call__
2025-01-02T13:58:50.774687221Z await self.other_asgi_app(scope, receive, send)
2025-01-02T13:58:50.774689634Z File "/app/.venv/lib/python3.12/site-packages/fastapi/applications.py", line 1054, in __call__
2025-01-02T13:58:50.774692247Z await super().__call__(scope, receive, send)
2025-01-02T13:58:50.774695967Z File "/app/.venv/lib/python3.12/site-packages/starlette/applications.py", line 113, in __call__
2025-01-02T13:58:50.774698581Z await self.middleware_stack(scope, receive, send)
2025-01-02T13:58:50.774700993Z File "/app/.venv/lib/python3.12/site-packages/starlette/middleware/errors.py", line 187, in __call__
2025-01-02T13:58:50.774703507Z raise exc
2025-01-02T13:58:50.774706221Z File "/app/.venv/lib/python3.12/site-packages/starlette/middleware/errors.py", line 165, in __call__
2025-01-02T13:58:50.774708835Z await self.app(scope, receive, _send)
2025-01-02T13:58:50.774711247Z File "/app/.venv/lib/python3.12/site-packages/starlette/middleware/base.py", line 185, in __call__
2025-01-02T13:58:50.774713861Z with collapse_excgroups():
2025-01-02T13:58:50.774716274Z File "/usr/local/lib/python3.12/contextlib.py", line 158, in __exit__
2025-01-02T13:58:50.774718687Z self.gen.throw(value)
2025-01-02T13:58:50.774721099Z File "/app/.venv/lib/python3.12/site-packages/starlette/_utils.py", line 82, in collapse_excgroups
2025-01-02T13:58:50.774723613Z raise exc
2025-01-02T13:58:50.774726025Z File "/app/.venv/lib/python3.12/site-packages/starlette/middleware/base.py", line 187, in __call__
2025-01-02T13:58:50.774728539Z response = await self.dispatch_func(request, call_next)
2025-01-02T13:58:50.774730951Z ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2025-01-02T13:58:50.774733464Z File "/app/openhands/server/middleware.py", line 157, in __call__
2025-01-02T13:58:50.774735978Z response = await self._attach_conversation(request)
2025-01-02T13:58:50.774740904Z ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2025-01-02T13:58:50.774743417Z File "/app/openhands/server/middleware.py", line 137, in _attach_conversation
2025-01-02T13:58:50.774745930Z request.state.conversation = await session_manager.attach_to_conversation(
2025-01-02T13:58:50.774748443Z ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2025-01-02T13:58:50.774788454Z File "/app/openhands/server/session/manager.py", line 197, in attach_to_conversation
2025-01-02T13:58:50.774813788Z await c.connect()
2025-01-02T13:58:50.774817206Z File "/app/openhands/server/session/conversation.py", line 43, in connect
2025-01-02T13:58:50.774819719Z await self.runtime.connect()
2025-01-02T13:58:50.774822132Z File "/app/openhands/runtime/impl/docker/docker_runtime.py", line 148, in connect
2025-01-02T13:58:50.774824645Z await call_sync_from_async(self._wait_until_alive)
2025-01-02T13:58:50.774827058Z File "/app/openhands/utils/async_utils.py", line 18, in call_sync_from_async
2025-01-02T13:58:50.774829470Z result = await coro
2025-01-02T13:58:50.774831883Z ^^^^^^^^^^
2025-01-02T13:58:50.774834296Z File "/usr/local/lib/python3.12/concurrent/futures/thread.py", line 58, in run
2025-01-02T13:58:50.774836910Z result = self.fn(*self.args, **self.kwargs)
2025-01-02T13:58:50.774839423Z ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2025-01-02T13:58:50.774841836Z File "/app/openhands/utils/async_utils.py", line 17, in <lambda>
2025-01-02T13:58:50.774844449Z coro = loop.run_in_executor(None, lambda: fn(*args, **kwargs))
2025-01-02T13:58:50.774846862Z ^^^^^^^^^^^^^^^^^^^
2025-01-02T13:58:50.774849375Z File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 336, in wrapped_f
2025-01-02T13:58:50.774851888Z return copy(f, *args, **kw)
2025-01-02T13:58:50.774855105Z ^^^^^^^^^^^^^^^^^^^^
2025-01-02T13:58:50.774857619Z File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 475, in __call__
2025-01-02T13:58:50.774860132Z do = self.iter(retry_state=retry_state)
2025-01-02T13:58:50.774862545Z ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2025-01-02T13:58:50.774864957Z File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 376, in iter
2025-01-02T13:58:50.774867471Z result = action(retry_state)
2025-01-02T13:58:50.774869883Z ^^^^^^^^^^^^^^^^^^^
2025-01-02T13:58:50.774872195Z File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 418, in exc_check
2025-01-02T13:58:50.774874809Z raise retry_exc.reraise()
2025-01-02T13:58:50.774877222Z ^^^^^^^^^^^^^^^^^^^
2025-01-02T13:58:50.774884159Z File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 185, in reraise
2025-01-02T13:58:50.774886873Z raise self.last_attempt.result()
2025-01-02T13:58:50.774889286Z ^^^^^^^^^^^^^^^^^^^^^^^^^^
2025-01-02T13:58:50.774891799Z File "/usr/local/lib/python3.12/concurrent/futures/_base.py", line 449, in result
2025-01-02T13:58:50.774894312Z return self.__get_result()
2025-01-02T13:58:50.774897026Z ^^^^^^^^^^^^^^^^^^^
2025-01-02T13:58:50.774899439Z File "/usr/local/lib/python3.12/concurrent/futures/_base.py", line 401, in __get_result
2025-01-02T13:58:50.774901952Z raise self._exception
2025-01-02T13:58:50.774904365Z File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 478, in __call__
2025-01-02T13:58:50.774906979Z result = fn(*args, **kwargs)
2025-01-02T13:58:50.774909291Z ^^^^^^^^^^^^^^^^^^^
2025-01-02T13:58:50.774911704Z File "/app/openhands/runtime/impl/docker/docker_runtime.py", line 331, in _wait_until_alive
2025-01-02T13:58:50.774914317Z self.check_if_alive()
2025-01-02T13:58:50.774916630Z File "/app/openhands/runtime/impl/action_execution/action_execution_client.py", line 99, in check_if_alive
2025-01-02T13:58:50.774919243Z with self._send_action_server_request(
2025-01-02T13:58:50.774921656Z ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2025-01-02T13:58:50.774924069Z File "/app/openhands/runtime/impl/action_execution/action_execution_client.py", line 96, in _send_action_server_request
2025-01-02T13:58:50.774926783Z return send_request(self.session, method, url, **kwargs)
2025-01-02T13:58:50.774929196Z ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2025-01-02T13:58:50.774931609Z File "/app/openhands/runtime/utils/request.py", line 28, in send_request
2025-01-02T13:58:50.774934122Z response = session.request(method, url, timeout=timeout, **kwargs)
2025-01-02T13:58:50.774936535Z ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2025-01-02T13:58:50.774939048Z File "/app/.venv/lib/python3.12/site-packages/requests/sessions.py", line 589, in request
2025-01-02T13:58:50.774941963Z resp = self.send(prep, **send_kwargs)
2025-01-02T13:58:50.774944476Z ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2025-01-02T13:58:50.774946789Z File "/app/.venv/lib/python3.12/site-packages/requests/sessions.py", line 703, in send
2025-01-02T13:58:50.774949402Z r = adapter.send(request, **kwargs)
2025-01-02T13:58:50.774951714Z ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2025-01-02T13:58:50.774954127Z File "/app/.venv/lib/python3.12/site-packages/requests/adapters.py", line 682, in send
2025-01-02T13:58:50.774956640Z raise ConnectionError(err, request=request)
2025-01-02T13:58:50.774959154Z requests.exceptions.ConnectionError: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response'))
@rbren any changes recently that would have changed this on main?
Huh, when I returned to the OH instance a moment later I noticed that the agent did start executing commands; strange
I think something changed on main that makes OH wait on "Waiting for client" for some time instead of the regular messages. Mine took 2-3 minutes. However, I think the original user may be running into something different as they get errors.
In the original issue, the agent is getting cleaned up, even though there's clearly a websocket connection...not sure what is happening here
this logic did change recently, so good chance there's a bug
Oh no I was wrong, _cleanup_session gets called, but bails out before it gets to _close_session. We should rename that function/log.
It seems like we just timed out while waiting for the container to come up. Maybe we need a little more grace there for slower machines?
@chenglu66 it looks like you hit Ctrl+C about 15 seconds after things started up. Can you try waiting 2-3 minutes for the container to start up? It usually takes about 1 minute.
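For reference, the kind of grace-period logic being discussed above can be sketched with the stdlib alone: poll the runtime's `/alive` endpoint until it answers or a deadline passes. This is an illustrative sketch, not the actual OpenHands implementation (which uses tenacity), and `wait_until_alive` here is a hypothetical helper:

```python
import time
import urllib.error
import urllib.request


def wait_until_alive(url: str, grace_seconds: float = 180.0,
                     interval: float = 2.0) -> bool:
    """Poll `url` until it responds, or until the grace period expires.

    Returns True once a request succeeds, False if the deadline passes
    without the endpoint ever answering.
    """
    deadline = time.monotonic() + grace_seconds
    while time.monotonic() < deadline:
        try:
            with urllib.request.urlopen(url, timeout=5):
                return True
        except (urllib.error.URLError, OSError):
            # Container not accepting connections yet; back off and retry.
            time.sleep(interval)
    return False
```

On a slow machine, raising `grace_seconds` is the knob that would prevent the "Connection refused" retries from being exhausted while the container is still coming up.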
@rbren I pulled 0.18 and after prompting the agent, it hasn't initialized even after 10 minutes
Sorry @avi12 your problem appears to be totally different from OP's.
Seems like you're on Windows -- are you using WSL?
@mamoodi yes, I run the main branch in my Docker; my OS is Windows 10. When I `make run` it shows the waiting message; I tried many times and waited several minutes, but it does not work. However, when I build an image myself and use that image in my Docker, it works.
@rbren I'm not using WSL. As far as I've found, either running the Docker command through PowerShell or letting my IDE run the Docker command directly works just as well.
@avi12 we don't support running on windows without WSL. You might be able to make it work! But it's too error-prone for us to really offer support there
@chenglu66 are you using WSL? or are you also running directly on windows?
It practically works the same in my experience. I also built OH from source via WSL and it often worked.
I am currently getting this issue - running from WSL with the 0.19 Docker image.
I'm getting the same error. It only happens when I try to connect to my local filesystem. Running the docker command without workspace configuration fixes the issue.
This issue is stale because it has been open for 30 days with no activity. Remove stale label or comment or this will be closed in 7 days.
Shut up stale bot
Is this an issue still? I've tried Mac, Ubuntu and Windows (WSL) and they all work fine following the docs: https://docs.all-hands.dev/modules/usage/installation Not sure what's going on for you all.
It is, though intermittent. It comes and goes, and in my testing I couldn't figure out where it comes from.
This issue still exists. I'm running Docker Compose on Debian. Not sure what's causing this, but the agent just sits there and doesn't start. I removed all volumes and images and recreated them, but the issue still persists. I also tried switching from Anthropic to OpenAI, but nothing fixed it.
I am also getting the same error. I tried increasing some request timeouts, but the error still persists. Is there any known solution?
I actually figured it out on my end. It was related to the runtime container. When I deployed using docker-compose, I set the main image to the latest, but I never updated the runtime container image tag. It was still set to 0.21-nikolaik, which was causing my issues. After updating it to 0.27-nikolaik, it ran fine again.
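The mismatch described above (app image on one version, runtime image tag on another) is easy to check mechanically. A small illustrative helper, assuming the convention visible in this thread that the runtime tag is the app version plus a `-nikolaik` suffix (`tags_match` is hypothetical, not an OpenHands function):

```python
def tags_match(app_image: str, runtime_image: str) -> bool:
    """Check that the runtime image tag corresponds to the app image tag.

    E.g. app ':0.27' should pair with runtime ':0.27-nikolaik'.
    """
    app_tag = app_image.rsplit(":", 1)[-1]
    runtime_tag = runtime_image.rsplit(":", 1)[-1]
    return runtime_tag == app_tag or runtime_tag.startswith(app_tag + "-")
```

Running a check like this against a docker-compose file before deploying would have flagged the 0.21/0.27 mismatch immediately.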
Doing another pulse check here, are people still running into this? We have made a lot of improvements around connecting and runtimes.
I just started trying to use OpenHands, and I'm running into the same issue. I started OpenHands similar to the instructions on the website:
docker pull docker.all-hands.dev/all-hands-ai/runtime:0.30-nikolaik
docker run -it --rm --pull=always \
-e SANDBOX_RUNTIME_CONTAINER_IMAGE=docker.all-hands.dev/all-hands-ai/runtime:0.30-nikolaik \
-e LOG_ALL_EVENTS=true \
-v /tmp/run/docker.sock:/var/run/docker.sock \
-v ./openhands-state:/.openhands-state \
-p 3031:3000 \
--add-host host.docker.internal:host-gateway \
--name openhands-app \
docker.all-hands.dev/all-hands-ai/openhands:0.30
Here are the docker logs:
03:59:46 - openhands:INFO: listen_socket.py:65 - oh_event: AgentStateChangedObservation
03:59:46 - openhands:INFO: listen_socket.py:77 - Finished replaying event stream for conversation 462dedad043f495a949a1381f2b12476
03:59:47 - openhands:INFO: docker_runtime.py:140 - [runtime 462dedad043f495a949a1381f2b12476] Starting runtime with image: docker.all-hands.dev/all-hands-ai/runtime:0.30-nikolaik
03:59:47 - openhands:INFO: docker_runtime.py:144 - [runtime 462dedad043f495a949a1381f2b12476] Container started: openhands-runtime-462dedad043f495a949a1381f2b12476. VSCode URL: None
03:59:47 - openhands:INFO: docker_runtime.py:155 - [runtime 462dedad043f495a949a1381f2b12476] Waiting for client to become ready at http://host.docker.internal:31134...
INFO: 172.17.0.1:43198 - "GET /api/github/repositories?sort=pushed&page=5&per_page=100 HTTP/1.1" 200 OK
04:01:47 - openhands:INFO: agent_session.py:175 - Agent session start
04:01:47 - openhands:ERROR: session.py:163 - Error creating agent_session: [Errno 111] Connection refused
Traceback (most recent call last):
File "/app/.venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 101, in map_httpcore_exceptions
yield
File "/app/.venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 250, in handle_request
resp = self._pool.handle_request(req)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/httpcore/_sync/connection_pool.py", line 256, in handle_request
raise exc from None
File "/app/.venv/lib/python3.12/site-packages/httpcore/_sync/connection_pool.py", line 236, in handle_request
response = connection.handle_request(
^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/httpcore/_sync/connection.py", line 101, in handle_request
raise exc
File "/app/.venv/lib/python3.12/site-packages/httpcore/_sync/connection.py", line 78, in handle_request
stream = self._connect(request)
^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/httpcore/_sync/connection.py", line 124, in _connect
stream = self._network_backend.connect_tcp(**kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/httpcore/_backends/sync.py", line 207, in connect_tcp
with map_exceptions(exc_map):
File "/usr/local/lib/python3.12/contextlib.py", line 158, in __exit__
self.gen.throw(value)
File "/app/.venv/lib/python3.12/site-packages/httpcore/_exceptions.py", line 14, in map_exceptions
raise to_exc(exc) from exc
httpcore.ConnectError: [Errno 111] Connection refused
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/app/openhands/server/session/session.py", line 148, in initialize_agent
await self.agent_session.start(
File "/app/openhands/server/session/agent_session.py", line 117, in start
runtime_connected = await self._create_runtime(
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/openhands/server/session/agent_session.py", line 316, in _create_runtime
await self.runtime.connect()
File "/app/openhands/runtime/impl/docker/docker_runtime.py", line 158, in connect
await call_sync_from_async(self._wait_until_alive)
File "/app/openhands/utils/async_utils.py", line 18, in call_sync_from_async
result = await coro
^^^^^^^^^^
File "/usr/local/lib/python3.12/concurrent/futures/thread.py", line 58, in run
result = self.fn(*self.args, **self.kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/openhands/utils/async_utils.py", line 17, in <lambda>
coro = loop.run_in_executor(None, lambda: fn(*args, **kwargs))
^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 336, in wrapped_f
return copy(f, *args, **kw)
^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 475, in __call__
do = self.iter(retry_state=retry_state)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 376, in iter
result = action(retry_state)
^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 418, in exc_check
raise retry_exc.reraise()
^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 185, in reraise
raise self.last_attempt.result()
^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/concurrent/futures/_base.py", line 449, in result
return self.__get_result()
^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/concurrent/futures/_base.py", line 401, in __get_result
raise self._exception
File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 478, in __call__
result = fn(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^
File "/app/openhands/runtime/impl/docker/docker_runtime.py", line 366, in _wait_until_alive
self.check_if_alive()
File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 336, in wrapped_f
return copy(f, *args, **kw)
^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 475, in __call__
do = self.iter(retry_state=retry_state)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 376, in iter
result = action(retry_state)
^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 398, in <lambda>
self._add_action_func(lambda rs: rs.outcome.result())
^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/concurrent/futures/_base.py", line 449, in result
return self.__get_result()
^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/concurrent/futures/_base.py", line 401, in __get_result
raise self._exception
File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 478, in __call__
result = fn(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^
File "/app/openhands/runtime/impl/action_execution/action_execution_client.py", line 123, in check_if_alive
response = self._send_action_server_request(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/openhands/runtime/impl/action_execution/action_execution_client.py", line 115, in _send_action_server_request
return send_request(self.session, method, url, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 336, in wrapped_f
return copy(f, *args, **kw)
^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 475, in __call__
do = self.iter(retry_state=retry_state)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 376, in iter
result = action(retry_state)
^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 398, in <lambda>
self._add_action_func(lambda rs: rs.outcome.result())
^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/concurrent/futures/_base.py", line 449, in result
return self.__get_result()
^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/concurrent/futures/_base.py", line 401, in __get_result
raise self._exception
File "/app/.venv/lib/python3.12/site-packages/tenacity/__init__.py", line 478, in __call__
result = fn(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^
File "/app/openhands/runtime/utils/request.py", line 44, in send_request
response = session.request(method, url, timeout=timeout, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/openhands/utils/http_session.py", line 31, in request
return CLIENT.request(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/httpx/_client.py", line 825, in request
return self.send(request, auth=auth, follow_redirects=follow_redirects)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/httpx/_client.py", line 914, in send
response = self._send_handling_auth(
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/httpx/_client.py", line 942, in _send_handling_auth
response = self._send_handling_redirects(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/httpx/_client.py", line 979, in _send_handling_redirects
response = self._send_single_request(request)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/httpx/_client.py", line 1014, in _send_single_request
response = transport.handle_request(request)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/.venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 249, in handle_request
with map_httpcore_exceptions():
File "/usr/local/lib/python3.12/contextlib.py", line 158, in __exit__
self.gen.throw(value)
File "/app/.venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 118, in map_httpcore_exceptions
raise mapped_exc(message) from exc
httpx.ConnectError: [Errno 111] Connection refused
docker.all-hands.dev/all-hands-ai/runtime:0.30-nikolaik
INFO: Started server process [8]
INFO: Waiting for application startup.
03:59:55 - openhands:INFO: action_execution_server.py:177 - No max memory limit set, using all available system memory
04:00:01 - openhands:INFO: browser_env.py:103 - Successfully called env.reset
04:00:01 - openhands:INFO: browser_env.py:120 - Browser env started.
[I 2025-03-30 04:00:02.214 KernelGatewayApp] Writing Jupyter server cookie secret to /root/.local/share/jupyter/runtime/jupyter_cookie_secret
[I 2025-03-30 04:00:02.215 KernelGatewayApp] Jupyter Kernel Gateway 3.0.1 is available at http://0.0.0.0:43979
Server bound to 0.0.0.0:40360 (IPv4)
Extension host agent listening on 40360
[04:00:02] Web UI available at http://localhost:40360?tkn=776f0606-bc96-4c9f-a219-3b5df205b817
[04:00:02] Extension host agent started.
[04:00:02] Started initializing default profile extensions in extensions installation folder. file:///root/.openvscode-server/extensions
[04:00:02] Completed initializing default profile extensions in extensions installation folder. file:///root/.openvscode-server/extensions
[I 2025-03-30 04:00:03.384 KernelGatewayApp] Kernel started: 3f69667c-1f03-49b8-b483-473660d11535
[I 250330 04:00:03 web:2348] 201 POST /api/kernels (127.0.0.1) 167.25ms
[W 2025-03-30 04:00:03.387 KernelGatewayApp] No session ID specified
[I 250330 04:00:03 web:2348] 101 GET /api/kernels/3f69667c-1f03-49b8-b483-473660d11535/channels (127.0.0.1) 549.08ms
[I 2025-03-30 04:00:03.936 KernelGatewayApp] Connecting to kernel 3f69667c-1f03-49b8-b483-473660d11535.
INFO: Application startup complete.
INFO: Uvicorn running on http://0.0.0.0:31134 (Press CTRL+C to quit)
It seems that the openhands-app container can't access ports on the host:
❯ docker exec -it openhands-app curl host.docker.internal:31134
curl: (7) Failed to connect to host.docker.internal port 31134 after 0 ms: Couldn't connect to server
❯ curl localhost:31134
{"detail":"Not Found"}%
EDIT: I just found this post and resolved the issue by replacing host-gateway with the external IP address of my host: --add-host host.docker.internal:<external_ip>. This seems to be because I'm running rootless Docker on Linux instead of rootful Docker. Now this works:
❯ docker exec -it openhands-app curl host.docker.internal:31134
{"detail":"Not Found"}%
Feedback: I tried replacing the IP, but the issue persists:
docker run -it --rm --pull=always \
-e SANDBOX_RUNTIME_CONTAINER_IMAGE=docker.all-hands.dev/all-hands-ai/runtime:0.30-nikolaik \
-e LOG_ALL_EVENTS=true \
-v /var/run/docker.sock:/var/run/docker.sock \
-v ~/.openhands-state:/.openhands-state \
-p 3000:3000 \
--add-host=host.docker.internal:172.xx.xx.1 \
--name openhands-app \
docker.all-hands.dev/all-hands-ai/openhands:0.30
Also on Windows through WSL [Ubuntu].
(172.xx.xx.1 was replaced with my actual ip obtained by ip route | awk 'NR==1 {print $3}')
I'll keep tabs on this thread and try again once resolved.
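For reference, the awk one-liner above prints the third field of the first ip route line, which is the gateway IP only when that line is a "default via ..." entry; on some setups the first line is a subnet route instead, so it's worth eyeballing the output. A minimal illustration of what it extracts:

```shell
# First line of `ip route` on a typical WSL/Linux box:
#   default via 172.20.0.1 dev eth0
# Field 3 is the gateway address:
ip route | awk 'NR==1 {print $3}'
```

Under WSL that gateway is normally the Windows host's address on the virtual switch; under rootless Docker it may instead be a Docker bridge address that containers cannot use to reach the host.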
172.x.x.x looks like a private IP: https://en.m.wikipedia.org/wiki/Private_network
Addresses in that range are usually used by Docker's own networking. Could you try other IPs on your machine?
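A quick way to enumerate candidate host IPs to try in --add-host (a sketch; which address works depends on your setup):

```shell
# All addresses assigned to the host; skip ones belonging to docker0/br-*
# interfaces, since 172.x addresses there are Docker's own bridges:
hostname -I

# The source IP used for outbound traffic is often the one other
# machines (and containers) can actually reach; extract it from
# `ip route get`:
ip route get 1.1.1.1 | sed -n 's/.*src \([0-9.]*\).*/\1/p'
```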
I just tried the README instructions using the latest docker-ce inside the latest WSL, and it doesn't work. The logs print "unable to derive the IP value for host-gateway". I tried replacing "host-gateway" with various hardcoded IPs (172.19.0.1, 192.168.x.x), without luck.
I also tried bypassing Docker networks entirely by setting network_mode=host, but then the spawned runtime container seems unable to reach the main container.