
[Bug]: Problematic OpenHands always complains and it won't work!

Open ManiProjs opened this issue 9 months ago • 36 comments

Is there an existing issue for the same bug?

  • [x] I have checked the existing issues.

Describe the bug and reproduction steps

I haven't been able to start OH successfully since I installed it on my system! It always gives an error (Psst... there will always be something)

One of them is in the Logs section below.

OpenHands Installation

Docker command in README

OpenHands Version

0.24

Operating System

macOS

Logs, Errors, Screenshots, and Additional Context

INFO: 192.168.65.1:27411 - "GET /api/settings HTTP/1.1" 200 OK 13:58:58 - openhands:ERROR: session.py:144 - Error creating agent_session: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response')) Traceback (most recent call last): File "/app/.venv/lib/python3.12/site-packages/urllib3/connectionpool.py", line 787, in urlopen response = self._make_request( ^^^^^^^^^^^^^^^^^^^ File "/app/.venv/lib/python3.12/site-packages/urllib3/connectionpool.py", line 534, in _make_request response = conn.getresponse() ^^^^^^^^^^^^^^^^^^ File "/app/.venv/lib/python3.12/site-packages/urllib3/connection.py", line 516, in getresponse httplib_response = super().getresponse() ^^^^^^^^^^^^^^^^^^^^^ File "/usr/local/lib/python3.12/http/client.py", line 1428, in getresponse response.begin() File "/usr/local/lib/python3.12/http/client.py", line 331, in begin version, status, reason = self._read_status() ^^^^^^^^^^^^^^^^^^^ File "/usr/local/lib/python3.12/http/client.py", line 300, in _read_status raise RemoteDisconnected("Remote end closed connection without" http.client.RemoteDisconnected: Remote end closed connection without response

During handling of the above exception, another exception occurred:

Traceback (most recent call last): File "/app/.venv/lib/python3.12/site-packages/requests/adapters.py", line 667, in send resp = conn.urlopen( ^^^^^^^^^^^^^ File "/app/.venv/lib/python3.12/site-packages/urllib3/connectionpool.py", line 841, in urlopen retries = retries.increment( ^^^^^^^^^^^^^^^^^^ File "/app/.venv/lib/python3.12/site-packages/urllib3/util/retry.py", line 474, in increment raise reraise(type(error), error, _stacktrace) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/app/.venv/lib/python3.12/site-packages/urllib3/util/util.py", line 38, in reraise raise value.with_traceback(tb) File "/app/.venv/lib/python3.12/site-packages/urllib3/connectionpool.py", line 787, in urlopen response = self._make_request( ^^^^^^^^^^^^^^^^^^^ File "/app/.venv/lib/python3.12/site-packages/urllib3/connectionpool.py", line 534, in _make_request response = conn.getresponse() ^^^^^^^^^^^^^^^^^^ File "/app/.venv/lib/python3.12/site-packages/urllib3/connection.py", line 516, in getresponse httplib_response = super().getresponse() ^^^^^^^^^^^^^^^^^^^^^ File "/usr/local/lib/python3.12/http/client.py", line 1428, in getresponse response.begin() File "/usr/local/lib/python3.12/http/client.py", line 331, in begin version, status, reason = self._read_status() ^^^^^^^^^^^^^^^^^^^ File "/usr/local/lib/python3.12/http/client.py", line 300, in _read_status raise RemoteDisconnected("Remote end closed connection without" urllib3.exceptions.ProtocolError: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response'))

During handling of the above exception, another exception occurred:

Traceback (most recent call last): File "/app/openhands/server/session/session.py", line 131, in initialize_agent await self.agent_session.start( File "/app/openhands/server/session/agent_session.py", line 102, in start await self._create_runtime( File "/app/openhands/server/session/agent_session.py", line 230, in _create_runtime await self.runtime.connect() File "/app/openhands/runtime/impl/docker/docker_runtime.py", line 157, in connect await call_sync_from_async(self._wait_until_alive) File "/app/openhands/utils/async_utils.py", line 18, in call_sync_from_async result = await coro ^^^^^^^^^^ File "/usr/local/lib/python3.12/concurrent/futures/thread.py", line 58, in run result = self.fn(*self.args, **self.kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/app/openhands/utils/async_utils.py", line 17, in coro = loop.run_in_executor(None, lambda: fn(*args, **kwargs)) ^^^^^^^^^^^^^^^^^^^ File "/app/.venv/lib/python3.12/site-packages/tenacity/init.py", line 336, in wrapped_f return copy(f, *args, **kw) ^^^^^^^^^^^^^^^^^^^^ File "/app/.venv/lib/python3.12/site-packages/tenacity/init.py", line 475, in call do = self.iter(retry_state=retry_state) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/app/.venv/lib/python3.12/site-packages/tenacity/init.py", line 376, in iter result = action(retry_state) ^^^^^^^^^^^^^^^^^^^ File "/app/.venv/lib/python3.12/site-packages/tenacity/init.py", line 418, in exc_check raise retry_exc.reraise() ^^^^^^^^^^^^^^^^^^^ File "/app/.venv/lib/python3.12/site-packages/tenacity/init.py", line 185, in reraise raise self.last_attempt.result() ^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/usr/local/lib/python3.12/concurrent/futures/_base.py", line 449, in result return self.__get_result() ^^^^^^^^^^^^^^^^^^^ File "/usr/local/lib/python3.12/concurrent/futures/_base.py", line 401, in __get_result raise self._exception File "/app/.venv/lib/python3.12/site-packages/tenacity/init.py", line 478, in call result = fn(*args, **kwargs) ^^^^^^^^^^^^^^^^^^^ File "/app/openhands/runtime/impl/docker/docker_runtime.py", line 347, in _wait_until_alive self.check_if_alive() File "/app/openhands/runtime/impl/action_execution/action_execution_client.py", line 104, in check_if_alive with self._send_action_server_request( ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/app/openhands/runtime/impl/action_execution/action_execution_client.py", line 101, in _send_action_server_request return send_request(self.session, method, url, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/app/.venv/lib/python3.12/site-packages/tenacity/init.py", line 336, in wrapped_f return copy(f, *args, **kw) ^^^^^^^^^^^^^^^^^^^^ File "/app/.venv/lib/python3.12/site-packages/tenacity/init.py", line 475, in call do = self.iter(retry_state=retry_state) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/app/.venv/lib/python3.12/site-packages/tenacity/init.py", line 376, in iter result = action(retry_state) ^^^^^^^^^^^^^^^^^^^ File "/app/.venv/lib/python3.12/site-packages/tenacity/init.py", line 398, in self._add_action_func(lambda rs: rs.outcome.result()) ^^^^^^^^^^^^^^^^^^^ File "/usr/local/lib/python3.12/concurrent/futures/_base.py", line 449, in result return self.__get_result() ^^^^^^^^^^^^^^^^^^^ File "/usr/local/lib/python3.12/concurrent/futures/_base.py", line 401, in __get_result raise self._exception File "/app/.venv/lib/python3.12/site-packages/tenacity/init.py", line 478, in call result = fn(*args, **kwargs) ^^^^^^^^^^^^^^^^^^^ File "/app/openhands/runtime/utils/request.py", line 44, in send_request response = 
session.request(method, url, timeout=timeout, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/app/.venv/lib/python3.12/site-packages/requests/sessions.py", line 589, in request resp = self.send(prep, **send_kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/app/.venv/lib/python3.12/site-packages/requests/sessions.py", line 703, in send r = adapter.send(request, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/app/.venv/lib/python3.12/site-packages/requests/adapters.py", line 682, in send raise ConnectionError(err, request=request) requests.exceptions.ConnectionError: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response')) INFO: 192.168.65.1:19105 - "GET /api/settings HTTP/1.1" 200 OK INFO: 192.168.65.1:64945 - "GET /api/settings HTTP/1.1" 200 OK INFO: 192.168.65.1:33225 - "GET /api/settings HTTP/1.1" 200 OK INFO: 192.168.65.1:25524 - "GET /api/settings HTTP/1.1" 200 OK INFO: 192.168.65.1:50023 - "GET /api/options/config HTTP/1.1" 200 OK INFO: 192.168.65.1:60196 - "GET /api/settings HTTP/1.1" 200 OK INFO: 192.168.65.1:47334 - "GET /api/conversations/5ad2dd8342064c2688746b73851490d7 HTTP/1.1" 200 OK INFO: 192.168.65.1:50023 - "GET /api/settings HTTP/1.1" 200 OK INFO: 192.168.65.1:30272 - "GET /api/settings HTTP/1.1" 200 OK 14:05:42 - openhands:INFO: listen_socket.py:108 - sio:disconnect:9EYy6X2HedCtOLR_AAAM 14:05:42 - openhands:INFO: standalone_conversation_manager.py:244 - disconnect_from_session:9EYy6X2HedCtOLR_AAAM:5ad2dd8342064c2688746b73851490d7 14:05:45 - openhands:INFO: manage_conversations.py:132 - Initializing new conversation 14:05:45 - openhands:INFO: manage_conversations.py:52 - Loading settings 14:05:45 - openhands:INFO: manage_conversations.py:55 - Settings loaded 14:05:45 - openhands:INFO: manage_conversations.py:78 - Loading conversation store 14:05:45 - openhands:INFO: manage_conversations.py:80 - Conversation store loaded 14:05:45 - openhands:INFO: manage_conversations.py:86 - New conversation ID: ce0a520ced304e1dbb8cc3fae2edb829 14:05:45 - openhands:INFO: manage_conversations.py:93 - Saving metadata for conversation ce0a520ced304e1dbb8cc3fae2edb829 14:05:45 - openhands:INFO: manage_conversations.py:103 - Starting agent loop for conversation ce0a520ced304e1dbb8cc3fae2edb829 14:05:45 - openhands:INFO: standalone_conversation_manager.py:192 - maybe_start_agent_loop:ce0a520ced304e1dbb8cc3fae2edb829 14:05:45 - openhands:INFO: standalone_conversation_manager.py:195 - start_agent_loop:ce0a520ced304e1dbb8cc3fae2edb829 14:05:45 - openhands:INFO: standalone_conversation_manager.py:222 - _get_event_stream:ce0a520ced304e1dbb8cc3fae2edb829 14:05:45 - openhands:INFO: standalone_conversation_manager.py:225 - found_local_agent_loop:ce0a520ced304e1dbb8cc3fae2edb829 14:05:45 - openhands:INFO: manage_conversations.py:121 - Finished initializing conversation ce0a520ced304e1dbb8cc3fae2edb829 INFO: 192.168.65.1:64380 - "POST /api/conversations HTTP/1.1" 200 OK INFO: ('192.168.65.1', 16064) - "WebSocket /socket.io/?latest_event_id=-1&conversation_id=ce0a520ced304e1dbb8cc3fae2edb829&EIO=4&transport=websocket" [accepted] INFO: 192.168.65.1:64380 - "GET /api/conversations/ce0a520ced304e1dbb8cc3fae2edb829 HTTP/1.1" 200 OK INFO: 192.168.65.1:36740 - "GET /api/settings HTTP/1.1" 200 OK 14:05:45 - openhands:INFO: listen_socket.py:30 - sio:connect: SNJdDt5Uo3QPlvarAAAO 14:05:45 - openhands:INFO: standalone_conversation_manager.py:92 - join_conversation:ce0a520ced304e1dbb8cc3fae2edb829:SNJdDt5Uo3QPlvarAAAO 14:05:45 - openhands:INFO: 
standalone_conversation_manager.py:222 - _get_event_stream:ce0a520ced304e1dbb8cc3fae2edb829 14:05:45 - openhands:INFO: standalone_conversation_manager.py:225 - found_local_agent_loop:ce0a520ced304e1dbb8cc3fae2edb829 14:05:46 - openhands:INFO: docker_runtime.py:139 - [runtime ce0a520ced304e1dbb8cc3fae2edb829] Starting runtime with image: docker.all-hands.dev/all-hands-ai/runtime:0.21-nikolaik 14:05:52 - openhands:INFO: standalone_conversation_manager.py:256 - _close_session:5ad2dd8342064c2688746b73851490d7 14:05:52 - openhands:INFO: standalone_conversation_manager.py:264 - removing connections: [] 14:05:52 - openhands:INFO: standalone_conversation_manager.py:273 - closing_session:5ad2dd8342064c2688746b73851490d7 14:05:58 - openhands:INFO: docker_runtime.py:143 - [runtime ce0a520ced304e1dbb8cc3fae2edb829] Container started: openhands-runtime-ce0a520ced304e1dbb8cc3fae2edb829. VSCode URL: None 14:05:58 - openhands:INFO: docker_runtime.py:154 - [runtime ce0a520ced304e1dbb8cc3fae2edb829] Waiting for client to become ready at http://host.docker.internal:38700... 14:06:07 - openhands:ERROR: standalone_conversation_manager.py:153 - error_cleaning_stale INFO: 192.168.65.1:51141 - "GET /api/settings HTTP/1.1" 200 OK 14:07:19 - openhands:ERROR: agent_session.py:232 - Runtime initialization failed: Container openhands-runtime-ce0a520ced304e1dbb8cc3fae2edb829 has exited. 14:07:19 - openhands:INFO: agent_controller.py:451 - [Agent Controller ce0a520ced304e1dbb8cc3fae2edb829] Setting agent(CodeActAgent) state from AgentState.LOADING to AgentState.ERROR 14:07:19 - USER_ACTION [Agent Controller ce0a520ced304e1dbb8cc3fae2edb829] MessageAction (source=EventSource.USER) CONTENT: I want to create a Hello World app in Javascript that:

  • Displays Hello World in the middle.
  • Has a button that when clicked, changes the greeting with a bouncing animation to fun versions of Hello.
  • Has a counter for how many times the button has been clicked.
  • Has another button that changes the app's background color. 14:07:19 - openhands:INFO: agent_controller.py:451 - [Agent Controller ce0a520ced304e1dbb8cc3fae2edb829] Setting agent(CodeActAgent) state from AgentState.ERROR to AgentState.RUNNING

============== [Agent Controller ce0a520ced304e1dbb8cc3fae2edb829] LEVEL 0 LOCAL STEP 0 GLOBAL STEP 0

14:07:19 - openhands:ERROR: agent_controller.py:228 - [Agent Controller ce0a520ced304e1dbb8cc3fae2edb829] Error while running the agent (session ID: ce0a520ced304e1dbb8cc3fae2edb829): litellm.NotFoundError: NotFoundError: OpenAIException - 404 page not found. Traceback: Traceback (most recent call last): File "/app/.venv/lib/python3.12/site-packages/litellm/llms/openai/openai.py", line 711, in completion raise e File "/app/.venv/lib/python3.12/site-packages/litellm/llms/openai/openai.py", line 638, in completion self.make_sync_openai_chat_completion_request( File "/app/.venv/lib/python3.12/site-packages/litellm/litellm_core_utils/logging_utils.py", line 145, in sync_wrapper result = func(*args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^ File "/app/.venv/lib/python3.12/site-packages/litellm/llms/openai/openai.py", line 457, in make_sync_openai_chat_completion_request raise e File "/app/.venv/lib/python3.12/site-packages/litellm/llms/openai/openai.py", line 439, in make_sync_openai_chat_completion_request raw_response = openai_client.chat.completions.with_raw_response.create( ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/app/.venv/lib/python3.12/site-packages/openai/_legacy_response.py", line 364, in wrapped return cast(LegacyAPIResponse[R], func(*args, **kwargs)) ^^^^^^^^^^^^^^^^^^^^^ File "/app/.venv/lib/python3.12/site-packages/openai/_utils/_utils.py", line 279, in wrapper return func(*args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^ File "/app/.venv/lib/python3.12/site-packages/openai/resources/chat/completions.py", line 863, in create return self._post( ^^^^^^^^^^^ File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 1283, in post return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 960, in request return self._request( ^^^^^^^^^^^^^^ File "/app/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 1064, in _request raise self._make_status_error_from_response(err.response) from None openai.NotFoundError: 404 page not found

During handling of the above exception, another exception occurred:

Traceback (most recent call last): File "/app/.venv/lib/python3.12/site-packages/litellm/main.py", line 1724, in completion raise e File "/app/.venv/lib/python3.12/site-packages/litellm/main.py", line 1697, in completion response = openai_chat_completions.completion( ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/app/.venv/lib/python3.12/site-packages/litellm/llms/openai/openai.py", line 721, in completion raise OpenAIError( litellm.llms.openai.common_utils.OpenAIError: 404 page not found

During handling of the above exception, another exception occurred:

Traceback (most recent call last): File "/app/openhands/controller/agent_controller.py", line 226, in _step_with_exception_handling await self._step() File "/app/openhands/controller/agent_controller.py", line 662, in _step action = self.agent.step(self.state) ^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/app/openhands/agenthub/codeact_agent/codeact_agent.py", line 391, in step response = self.llm.completion(**params) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/app/.venv/lib/python3.12/site-packages/tenacity/init.py", line 336, in wrapped_f return copy(f, *args, **kw) ^^^^^^^^^^^^^^^^^^^^ File "/app/.venv/lib/python3.12/site-packages/tenacity/init.py", line 475, in call do = self.iter(retry_state=retry_state) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/app/.venv/lib/python3.12/site-packages/tenacity/init.py", line 376, in iter result = action(retry_state) ^^^^^^^^^^^^^^^^^^^ File "/app/.venv/lib/python3.12/site-packages/tenacity/init.py", line 398, in self._add_action_func(lambda rs: rs.outcome.result()) ^^^^^^^^^^^^^^^^^^^ File "/usr/local/lib/python3.12/concurrent/futures/_base.py", line 449, in result return self.__get_result() ^^^^^^^^^^^^^^^^^^^ File "/usr/local/lib/python3.12/concurrent/futures/_base.py", line 401, in __get_result raise self._exception File "/app/.venv/lib/python3.12/site-packages/tenacity/init.py", line 478, in call result = fn(*args, **kwargs) ^^^^^^^^^^^^^^^^^^^ File "/app/openhands/llm/llm.py", line 247, in wrapper resp: ModelResponse = self._completion_unwrapped(*args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/app/.venv/lib/python3.12/site-packages/litellm/utils.py", line 1156, in wrapper raise e File "/app/.venv/lib/python3.12/site-packages/litellm/utils.py", line 1034, in wrapper result = original_function(*args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/app/.venv/lib/python3.12/site-packages/litellm/main.py", line 3085, in completion raise exception_type( ^^^^^^^^^^^^^^^ File "/app/.venv/lib/python3.12/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 2201, in exception_type raise e File "/app/.venv/lib/python3.12/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 399, in exception_type raise NotFoundError( litellm.exceptions.NotFoundError: litellm.NotFoundError: NotFoundError: OpenAIException - 404 page not found

14:07:19 - openhands:INFO: agent_controller.py:451 - [Agent Controller ce0a520ced304e1dbb8cc3fae2edb829] Setting agent(CodeActAgent) state from AgentState.RUNNING to AgentState.ERROR 14:07:19 - openhands:INFO: agent_controller.py:451 - [Agent Controller ce0a520ced304e1dbb8cc3fae2edb829] Setting agent(CodeActAgent) state from AgentState.ERROR to AgentState.ERROR 14:07:19 - openhands:INFO: agent_controller.py:451 - [Agent Controller ce0a520ced304e1dbb8cc3fae2edb829] Setting agent(CodeActAgent) state from AgentState.ERROR to AgentState.RUNNING 14:07:19 - OBSERVATION [Agent Controller ce0a520ced304e1dbb8cc3fae2edb829] NullObservation(content='', observation='null') 14:07:19 - OBSERVATION [Agent Controller ce0a520ced304e1dbb8cc3fae2edb829] AgentStateChangedObservation(content='', agent_state=<AgentState.ERROR: 'error'>, observation='agent_state_changed') 14:07:19 - OBSERVATION [Agent Controller ce0a520ced304e1dbb8cc3fae2edb829] NullObservation(content='', observation='null') 14:07:19 - OBSERVATION [Agent Controller ce0a520ced304e1dbb8cc3fae2edb829] AgentStateChangedObservation(content='', agent_state=<AgentState.RUNNING: 'running'>, observation='agent_state_changed') 14:07:19 - OBSERVATION [Agent Controller ce0a520ced304e1dbb8cc3fae2edb829] AgentStateChangedObservation(content='', agent_state=<AgentState.ERROR: 'error'>, observation='agent_state_changed') 14:07:19 - OBSERVATION [Agent Controller ce0a520ced304e1dbb8cc3fae2edb829] AgentStateChangedObservation(content='', agent_state=<AgentState.RUNNING: 'running'>, observation='agent_state_changed') INFO: 192.168.65.1:30802 - "GET /api/settings HTTP/1.1" 200 OK INFO: 192.168.65.1:26508 - "GET /api/options/config HTTP/1.1" 200 OK INFO: 192.168.65.1:26619 - "GET /api/settings HTTP/1.1" 200 OK INFO: 192.168.65.1:26508 - "GET /api/settings HTTP/1.1" 200 OK 14:08:55 - openhands:ERROR: standalone_conversation_manager.py:79 - Error connecting to conversation ce0a520ced304e1dbb8cc3fae2edb829: Container openhands-runtime-ce0a520ced304e1dbb8cc3fae2edb829 has exited. INFO: 192.168.65.1:59883 - "GET /api/conversations/ce0a520ced304e1dbb8cc3fae2edb829/list-files HTTP/1.1" 404 Not Found 14:10:11 - openhands:ERROR: standalone_conversation_manager.py:79 - Error connecting to conversation ce0a520ced304e1dbb8cc3fae2edb829: Container openhands-runtime-ce0a520ced304e1dbb8cc3fae2edb829 has exited. INFO: 192.168.65.1:59686 - "GET /api/conversations/ce0a520ced304e1dbb8cc3fae2edb829/vscode-url HTTP/1.1" 404 Not Found 14:11:25 - openhands:ERROR: standalone_conversation_manager.py:79 - Error connecting to conversation ce0a520ced304e1dbb8cc3fae2edb829: Container openhands-runtime-ce0a520ced304e1dbb8cc3fae2edb829 has exited. INFO: 192.168.65.1:26508 - "GET /api/conversations/ce0a520ced304e1dbb8cc3fae2edb829/list-files HTTP/1.1" 404 Not Found ^C^C^C^C^C^C^C^C^C c^C^C^C^C^C^C^C^CINFO: Shutting down 14:12:50 - openhands:INFO: listen_socket.py:108 - sio:disconnect:SNJdDt5Uo3QPlvarAAAO 14:12:50 - openhands:INFO: standalone_conversation_manager.py:244 - disconnect_from_session:SNJdDt5Uo3QPlvarAAAO:ce0a520ced304e1dbb8cc3fae2edb829 INFO: Waiting for application shutdown. INFO: Application shutdown complete. 
INFO: Finished server process [10] 14:12:50 - openhands:INFO: standalone_conversation_manager.py:256 - _close_session:ce0a520ced304e1dbb8cc3fae2edb829 14:12:50 - openhands:INFO: standalone_conversation_manager.py:264 - removing connections: [] 14:12:50 - openhands:INFO: standalone_conversation_manager.py:273 - closing_session:ce0a520ced304e1dbb8cc3fae2edb829 14:12:50 - openhands:INFO: agent_controller.py:451 - [Agent Controller ce0a520ced304e1dbb8cc3fae2edb829] Setting agent(CodeActAgent) state from AgentState.RUNNING to AgentState.STOPPED 14:12:50 - openhands:WARNING: stream.py:259 - Callback not found during unsubscribe: ce0a520ced304e1dbb8cc3fae2edb829

ManiProjs avatar Mar 04 '25 14:03 ManiProjs

Thank you for the report! This looks like the cause:

raise OpenAIError(
litellm.llms.openai.common_utils.OpenAIError: 404 page not found

Can you please check the model name in Settings? What model did you use?

enyst avatar Mar 04 '25 21:03 enyst

openai/llama3.2. I use Ollama, which exposes an OpenAI-compatible API (I've set the base URL to http://host.docker.internal:11434, Ollama's base URL).
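
For what it's worth, a quick way to check that Ollama's OpenAI-compatible API is reachable is a small Python snippet like the one below. This is just a sketch: it assumes llama3.2 is pulled in Ollama and is meant to be run from the host, where the hostname is localhost rather than host.docker.internal.

import requests

# Ollama's OpenAI-compatible API lives under /v1; hitting the port without that
# prefix returns "404 page not found", which is consistent with the error above.
base_url = "http://localhost:11434/v1"  # from inside the OpenHands container: http://host.docker.internal:11434/v1

payload = {
    "model": "llama3.2",
    "messages": [{"role": "user", "content": "Say hello"}],
}
resp = requests.post(f"{base_url}/chat/completions", json=payload, timeout=60)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])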

ManiProjs avatar Mar 08 '25 11:03 ManiProjs

Please take a look at this doc for Ollama; the right setup depends on how you run it: https://docs.all-hands.dev/modules/usage/llms/local-llms

The base URL, if you need access from inside a docker container, should be http://host.docker.internal:11434

enyst avatar Mar 08 '25 12:03 enyst

I just found the way to fix it. I must put http://com.docker.internal:11434/v1 instead of http://com.docker.internal:11434. (Sorry for the mistake: earlier I said I put http://localhost:11434 by mistake; what I had actually set was http://com.docker.internal:11434.)
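
In other words, with the /v1 suffix in place, the LiteLLM call that OpenHands makes under the hood should go through. A minimal sketch of that call (assuming llama3.2 is pulled in Ollama; Ollama ignores the API key, but the OpenAI-compatible client wants one set):

from litellm import completion

# The "openai/" prefix tells LiteLLM to speak the OpenAI protocol to a custom base URL.
response = completion(
    model="openai/llama3.2",
    api_base="http://host.docker.internal:11434/v1",
    api_key="dummy",  # placeholder; Ollama does not check it
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)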

ManiProjs avatar Mar 08 '25 13:03 ManiProjs

I have another problem. When I try to build from source, building the Docker image takes forever (on a MacBook Air (M3) with 8GB of RAM). Is it because my system doesn't meet the requirements, or is it a problem with OpenHands? (I will try to use GitHub as the provider)

It prints this thousands of times: "Delete 1041 entries? (yes/no) [yes] Aborted"

ManiProjs avatar Mar 17 '25 15:03 ManiProjs

@ManiProjs Are you using the latest version? Please do upgrade, that sounds like a cache issue with poetry 2.0.x that we solved a while ago.

enyst avatar Mar 17 '25 16:03 enyst

How do I upgrade? (I did a git pull)

ManiProjs avatar Mar 17 '25 16:03 ManiProjs

Yes, and please run make build.

Also, what version of poetry are you using?

enyst avatar Mar 17 '25 16:03 enyst

2.0.1 works well, although higher versions should work too.

enyst avatar Mar 17 '25 16:03 enyst

I use poetry version 2.1.1. When I start building, after some time it prints "Delete 1041 entries? (yes/no) [yes] Aborted" millions of times and Docker prints "[output clipped, log limit 2MiB reached]"

ManiProjs avatar Mar 17 '25 16:03 ManiProjs

@enyst Thank you so much. The build completed after updating. Now it's giving me this error (Docker is running):

➜ OpenHands git:(main) ✗ make run Running the app... Starting backend server... Waiting for the backend to start... Connection to localhost port 3000 [tcp/hbci] succeeded! Backend started successfully. Starting frontend...

[email protected] dev npm run make-i18n && cross-env VITE_MOCK_API=false react-router dev --port 3001 --host 127.0.0.1

[email protected] make-i18n node scripts/make-i18n-translations.cjs

➜ Local: http://127.0.0.1:3001/ ➜ press h + enter to show help 19:55:54 - openhands:WARNING: llm_config.py:128 - Cannot parse [llm] config from toml. Continuing with defaults. 19:55:54 - openhands:INFO: server_config.py:38 - Using config class None INFO: Started server process [85359] INFO: Waiting for application startup. INFO: Application startup complete. INFO: Uvicorn running on http://127.0.0.1:3000 (Press CTRL+C to quit) INFO: 127.0.0.1:55836 - "GET /api/options/config HTTP/1.1" 200 OK INFO: 127.0.0.1:55837 - "GET /api/settings HTTP/1.1" 200 OK 19:56:04 - openhands:INFO: manage_conversations.py:148 - Initializing new conversation 19:56:04 - openhands:INFO: manage_conversations.py:54 - Creating conversation 19:56:04 - openhands:INFO: manage_conversations.py:58 - Loading settings 19:56:04 - openhands:INFO: manage_conversations.py:61 - Settings loaded 19:56:04 - openhands:INFO: manage_conversations.py:85 - Loading conversation store 19:56:04 - openhands:INFO: manage_conversations.py:87 - Conversation store loaded 19:56:04 - openhands:INFO: manage_conversations.py:93 - New conversation ID: cfba1811045d46238c81b1e737200ce0 19:56:04 - openhands:INFO: manage_conversations.py:103 - Saving metadata for conversation cfba1811045d46238c81b1e737200ce0 19:56:04 - openhands:INFO: manage_conversations.py:115 - Starting agent loop for conversation cfba1811045d46238c81b1e737200ce0 19:56:04 - openhands:INFO: standalone_conversation_manager.py:259 - maybe_start_agent_loop:cfba1811045d46238c81b1e737200ce0 19:56:04 - openhands:INFO: standalone_conversation_manager.py:262 - start_agent_loop:cfba1811045d46238c81b1e737200ce0 19:56:04 - openhands:INFO: standalone_conversation_manager.py:312 - get_event_stream:cfba1811045d46238c81b1e737200ce0 19:56:04 - openhands:INFO: standalone_conversation_manager.py:315 - found_local_agent_loop:cfba1811045d46238c81b1e737200ce0 19:56:04 - openhands:INFO: manage_conversations.py:136 - Finished initializing conversation cfba1811045d46238c81b1e737200ce0 19:56:04 - openhands:INFO: session.py:121 - Enabling default condenser: type='llm' llm_config=LLMConfig(model='github/DeepSeek-V3', api_key=''(''), base_url='https://models.inference.ai.azure.com', api_version=None, aws_access_key_id='', aws_secret_access_key='****', aws_region_name=None, openrouter_site_url='https://docs.all-hands.dev/', openrouter_app_name='OpenHands', num_retries=4, retry_multiplier=2, retry_min_wait=5, retry_max_wait=30, timeout=None, max_message_chars=30000, temperature=0.0, top_p=1.0, custom_llm_provider=None, max_input_tokens=4096, max_output_tokens=4096, input_cost_per_token=None, output_cost_per_token=None, ollama_base_url=None, drop_params=True, modify_params=True, disable_vision=None, caching_prompt=True, log_completions=False, log_completions_folder='/Users/farrokharasteh/projs/OpenHands/logs/completions', custom_tokenizer=None, native_tool_calling=None, reasoning_effort='high') keep_first=3 max_size=40 INFO: 127.0.0.1:55842 - "POST /api/conversations HTTP/1.1" 200 OK INFO: ('127.0.0.1', 55847) - "WebSocket /socket.io/?latest_event_id=-1&conversation_id=cfba1811045d46238c81b1e737200ce0&EIO=4&transport=websocket" [accepted] INFO: ('127.0.0.1', 55848) - "WebSocket /socket.io/?latest_event_id=-1&conversation_id=cfba1811045d46238c81b1e737200ce0&EIO=4&transport=websocket" [accepted] INFO: 127.0.0.1:55846 - "GET /api/conversations/cfba1811045d46238c81b1e737200ce0 HTTP/1.1" 200 OK 19:56:05 - openhands:INFO: listen_socket.py:32 - sio:connect: o5KFkm25wwSZwPouAAAC 19:56:05 - openhands:INFO: 
standalone_conversation_manager.py:116 - join_conversation:cfba1811045d46238c81b1e737200ce0:o5KFkm25wwSZwPouAAAC 19:56:05 - openhands:INFO: standalone_conversation_manager.py:312 - get_event_stream:cfba1811045d46238c81b1e737200ce0 19:56:05 - openhands:INFO: standalone_conversation_manager.py:315 - found_local_agent_loop:cfba1811045d46238c81b1e737200ce0 19:56:05 - openhands:INFO: runtime_build.py:182 - Building image: ghcr.io/all-hands-ai/runtime:oh_v0.28.1_int1hazwbuzz6xgj_2oh7gm2mz1gmzvnk 19:56:05 - openhands:INFO: docker_runtime.py:140 - [runtime cfba1811045d46238c81b1e737200ce0] Starting runtime with image: ghcr.io/all-hands-ai/runtime:oh_v0.28.1_int1hazwbuzz6xgj_2oh7gm2mz1gmzvnk 19:56:06 - openhands:INFO: docker_runtime.py:144 - [runtime cfba1811045d46238c81b1e737200ce0] Container started: openhands-runtime-cfba1811045d46238c81b1e737200ce0. VSCode URL: None 19:56:06 - openhands:INFO: docker_runtime.py:155 - [runtime cfba1811045d46238c81b1e737200ce0] Waiting for client to become ready at http://localhost:38939... 19:56:10 - openhands:ERROR: agent_session.py:266 - Runtime initialization failed: Container openhands-runtime-cfba1811045d46238c81b1e737200ce0 has exited. 19:56:10 - openhands:INFO: base.py:318 - [runtime cfba1811045d46238c81b1e737200ce0] Selected repo: None, loading microagents from /workspace/.openhands/microagents (inside runtime) 19:56:10 - openhands:INFO: agent_controller.py:507 - [Agent Controller cfba1811045d46238c81b1e737200ce0] Setting agent(CodeActAgent) state from AgentState.LOADING to AgentState.ERROR 19:56:10 - openhands:INFO: session.py:259 - Agent status error 19:56:10 - openhands:INFO: session.py:200 - Agent status error 19:56:10 - openhands:INFO: agent_session.py:161 - Agent session start 19:56:10 - openhands:ERROR: session.py:149 - Error creating agent_session: HTTPConnectionPool(host='localhost', port=38939): Max retries exceeded with url: /execute_action (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x12457e360>: Failed to establish a new connection: [Errno 61] Connection refused')) Traceback (most recent call last): File "/Users/farrokharasteh/Library/Caches/pypoetry/virtualenvs/openhands-ai-zrIbYYc-py3.12/lib/python3.12/site-packages/urllib3/connection.py", line 198, in new_conn sock = connection.create_connection( ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/Users/farrokharasteh/Library/Caches/pypoetry/virtualenvs/openhands-ai-zrIbYYc-py3.12/lib/python3.12/site-packages/urllib3/util/connection.py", line 85, in create_connection raise err File "/Users/farrokharasteh/Library/Caches/pypoetry/virtualenvs/openhands-ai-zrIbYYc-py3.12/lib/python3.12/site-packages/urllib3/util/connection.py", line 73, in create_connection sock.connect(sa) ConnectionRefusedError: [Errno 61] Connection refused

The above exception was the direct cause of the following exception:

Traceback (most recent call last): File "/Users/farrokharasteh/Library/Caches/pypoetry/virtualenvs/openhands-ai-zrIbYYc_-py3.12/lib/python3.12/site-packages/urllib3/connectionpool.py", line 787, in urlopen response = self.make_request( ^^^^^^^^^^^^^^^^^^^ File "/Users/farrokharasteh/Library/Caches/pypoetry/virtualenvs/openhands-ai-zrIbYYc-py3.12/lib/python3.12/site-packages/urllib3/connectionpool.py", line 493, in make_request conn.request( File "/Users/farrokharasteh/Library/Caches/pypoetry/virtualenvs/openhands-ai-zrIbYYc-py3.12/lib/python3.12/site-packages/urllib3/connection.py", line 445, in request self.endheaders() File "/opt/local/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/http/client.py", line 1333, in endheaders self._send_output(message_body, encode_chunked=encode_chunked) File "/opt/local/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/http/client.py", line 1093, in send_output self.send(msg) File "/opt/local/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/http/client.py", line 1037, in send self.connect() File "/Users/farrokharasteh/Library/Caches/pypoetry/virtualenvs/openhands-ai-zrIbYYc-py3.12/lib/python3.12/site-packages/urllib3/connection.py", line 276, in connect self.sock = self.new_conn() ^^^^^^^^^^^^^^^^ File "/Users/farrokharasteh/Library/Caches/pypoetry/virtualenvs/openhands-ai-zrIbYYc-py3.12/lib/python3.12/site-packages/urllib3/connection.py", line 213, in _new_conn raise NewConnectionError( urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPConnection object at 0x12457e360>: Failed to establish a new connection: [Errno 61] Connection refused

The above exception was the direct cause of the following exception:

Traceback (most recent call last): File "/Users/farrokharasteh/Library/Caches/pypoetry/virtualenvs/openhands-ai-zrIbYYc_-py3.12/lib/python3.12/site-packages/requests/adapters.py", line 667, in send resp = conn.urlopen( ^^^^^^^^^^^^^ File "/Users/farrokharasteh/Library/Caches/pypoetry/virtualenvs/openhands-ai-zrIbYYc_-py3.12/lib/python3.12/site-packages/urllib3/connectionpool.py", line 841, in urlopen retries = retries.increment( ^^^^^^^^^^^^^^^^^^ File "/Users/farrokharasteh/Library/Caches/pypoetry/virtualenvs/openhands-ai-zrIbYYc_-py3.12/lib/python3.12/site-packages/urllib3/util/retry.py", line 519, in increment raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=38939): Max retries exceeded with url: /execute_action (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x12457e360>: Failed to establish a new connection: [Errno 61] Connection refused'))

During handling of the above exception, another exception occurred:

Traceback (most recent call last): File "/Users/farrokharasteh/projs/OpenHands/openhands/server/session/session.py", line 135, in initialize_agent await self.agent_session.start( File "/Users/farrokharasteh/projs/OpenHands/openhands/server/session/agent_session.py", line 134, in start self.memory = await self.create_memory( ^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/Users/farrokharasteh/projs/OpenHands/openhands/server/session/agent_session.py", line 357, in create_memory microagents: list[BaseMicroAgent] = await call_sync_from_async( ^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/Users/farrokharasteh/projs/OpenHands/openhands/utils/async_utils.py", line 18, in call_sync_from_async result = await coro ^^^^^^^^^^ File "/opt/local/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/concurrent/futures/thread.py", line 59, in run result = self.fn(*self.args, **self.kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/Users/farrokharasteh/projs/OpenHands/openhands/utils/async_utils.py", line 17, in coro = loop.run_in_executor(None, lambda: fn(*args, **kwargs)) ^^^^^^^^^^^^^^^^^^^ File "/Users/farrokharasteh/projs/OpenHands/openhands/runtime/base.py", line 325, in get_microagents_from_selected_repo obs = self.read( ^^^^^^^^^^ File "/Users/farrokharasteh/projs/OpenHands/openhands/runtime/impl/action_execution/action_execution_client.py", line 286, in read return self.send_action_for_execution(action) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/Users/farrokharasteh/projs/OpenHands/openhands/runtime/impl/action_execution/action_execution_client.py", line 263, in send_action_for_execution with self.send_action_server_request( ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/Users/farrokharasteh/projs/OpenHands/openhands/runtime/impl/action_execution/action_execution_client.py", line 104, in send_action_server_request return send_request(self.session, method, url, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/Users/farrokharasteh/Library/Caches/pypoetry/virtualenvs/openhands-ai-zrIbYYc-py3.12/lib/python3.12/site-packages/tenacity/init.py", line 336, in wrapped_f return copy(f, *args, **kw) ^^^^^^^^^^^^^^^^^^^^ File "/Users/farrokharasteh/Library/Caches/pypoetry/virtualenvs/openhands-ai-zrIbYYc-py3.12/lib/python3.12/site-packages/tenacity/init.py", line 475, in call do = self.iter(retry_state=retry_state) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/Users/farrokharasteh/Library/Caches/pypoetry/virtualenvs/openhands-ai-zrIbYYc-py3.12/lib/python3.12/site-packages/tenacity/init.py", line 376, in iter result = action(retry_state) ^^^^^^^^^^^^^^^^^^^ File "/Users/farrokharasteh/Library/Caches/pypoetry/virtualenvs/openhands-ai-zrIbYYc-py3.12/lib/python3.12/site-packages/tenacity/init.py", line 398, in self._add_action_func(lambda rs: rs.outcome.result()) ^^^^^^^^^^^^^^^^^^^ File "/opt/local/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/concurrent/futures/_base.py", line 449, in result return self.__get_result() ^^^^^^^^^^^^^^^^^^^ File "/opt/local/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/concurrent/futures/base.py", line 401, in get_result raise self.exception File "/Users/farrokharasteh/Library/Caches/pypoetry/virtualenvs/openhands-ai-zrIbYYc-py3.12/lib/python3.12/site-packages/tenacity/init.py", line 478, in call result = fn(*args, **kwargs) ^^^^^^^^^^^^^^^^^^^ File "/Users/farrokharasteh/projs/OpenHands/openhands/runtime/utils/request.py", line 44, in send_request response = session.request(method, url, timeout=timeout, **kwargs) 
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/Users/farrokharasteh/Library/Caches/pypoetry/virtualenvs/openhands-ai-zrIbYYc-py3.12/lib/python3.12/site-packages/requests/sessions.py", line 589, in request resp = self.send(prep, **send_kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/Users/farrokharasteh/Library/Caches/pypoetry/virtualenvs/openhands-ai-zrIbYYc-py3.12/lib/python3.12/site-packages/requests/sessions.py", line 703, in send r = adapter.send(request, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/Users/farrokharasteh/Library/Caches/pypoetry/virtualenvs/openhands-ai-zrIbYYc-py3.12/lib/python3.12/site-packages/requests/adapters.py", line 700, in send raise ConnectionError(e, request=request) requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=38939): Max retries exceeded with url: /execute_action (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x12457e360>: Failed to establish a new connection: [Errno 61] Connection refused'))

ManiProjs avatar Mar 17 '25 16:03 ManiProjs

Could you delete your old openhands containers and images and try again?

enyst avatar Mar 17 '25 16:03 enyst

Problem still exists 😭

ManiProjs avatar Mar 17 '25 16:03 ManiProjs

Could you please confirm: did you set settings like these in the UI Settings window?

llm_config=LLMConfig(model='github/DeepSeek-V3', api_key=''(''), base_url='https://models.inference.ai.azure.com/',

enyst avatar Mar 17 '25 16:03 enyst

Yes (I set API key)

ManiProjs avatar Mar 17 '25 16:03 ManiProjs

Have you selected a github repo from the home page?

enyst avatar Mar 17 '25 16:03 enyst

No. I used the recommendations shown on the homepage

ManiProjs avatar Mar 17 '25 16:03 ManiProjs

I am curious why it's looking for a... (I don't know what to call addresses that have localhost as their domain; I'll call them Servers, with a capital S) on some random ports (they usually start with 3). It should be looking for a Server on port 3000 (or some other known port).

I git pull frequently, so maybe someone has modified the code.

ManiProjs avatar Mar 17 '25 17:03 ManiProjs

Forget it. I kissed building from source goodbye. I pulled OpenHands from GitHub Packages

ManiProjs avatar Mar 17 '25 17:03 ManiProjs

Great, if it works for you, good to hear!

Just to answer the question: the agent runs inside a Docker sandbox container, and the communication between the main application and the inside of the sandbox happens over HTTP, on that port. It's an action execution server in Docker. It's for security: the agent does whatever the LLM says, and LLMs are perfectly capable of modifying files, deleting them, etc. You probably don't want that to happen directly on your machine! 😅

(I still suspect the problem there was with Docker communication/initialization. Maybe restarting Docker Desktop would take care of it.)
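
To make that concrete, the backend does something along the lines of the sketch below against the sandbox container. It is only illustrative: the real payload schema lives in openhands/runtime, the port is picked at runtime (38700 and 38939 in the logs above), and the action dict here is hypothetical.

import requests

# The runtime container exposes an action execution server on a host port.
# The backend POSTs serialized actions to it; a plain GET from a browser or
# curl will not match these routes.
runtime_url = "http://localhost:38939"  # hypothetical port, taken from the log above

action = {"action": "run", "args": {"command": "echo hello"}}  # hypothetical payload shape
resp = requests.post(f"{runtime_url}/execute_action", json=action, timeout=30)
print(resp.status_code, resp.text)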

enyst avatar Mar 17 '25 18:03 enyst

When I use curl to access /execute_action, it gives me this weird HTML 🤣:

<!DOCTYPE html><html lang="en"><head><meta charSet="utf-8"/><meta name="viewport" content="width=device-width, initial-scale=1"/><title>OpenHands</title><meta name="description" content="Let&#x27;s Start Building!"/><link rel="modulepreload" href="/assets/manifest-3044e59f.js"/><link rel="modulepreload" href="/assets/entry.client-DNAXCeCt.js"/><link rel="modulepreload" href="/assets/chunk-K6CSEXPM-oxKRe8gl.js"/><link rel="modulepreload" href="/assets/settings-context-DxY9gF0_.js"/><link rel="modulepreload" href="/assets/react-redux-C2e_S6_C.js"/><link rel="modulepreload" href="/assets/index-BH-AKPIL.js"/><link rel="modulepreload" href="/assets/open-hands-axios-ChWdMQne.js"/><link rel="modulepreload" href="/assets/infiniteQueryBehavior-DR6hsGfW.js"/><link rel="modulepreload" href="/assets/store-CsPrx2CU.js"/><link rel="modulepreload" href="/assets/use-config-CTpMT_Oe.js"/><link rel="modulepreload" href="/assets/open-hands-CerUI6hN.js"/><link rel="modulepreload" href="/assets/index-DC282svN.js"/><link rel="modulepreload" href="/assets/preload-helper-D7HrI6pR.js"/><link rel="modulepreload" href="/assets/i18nInstance-DBIXdvxg.js"/><link rel="modulepreload" href="/assets/initial-query-slice-vu2vfy5H.js"/><link rel="modulepreload" href="/assets/browser-slice-CJzqAyuI.js"/><link rel="modulepreload" href="/assets/agent-state-u5yf9HVO.js"/><link rel="modulepreload" href="/assets/root-DCbqQEAC.js"/><link rel="modulepreload" href="/assets/with-props-DamRiGtk.js"/><link rel="stylesheet" href="/assets/root-DoQcXSxY.css"/></head><body><script>
              console.log(
                "πŸ’Ώ Hey developer πŸ‘‹. You can provide a way better UX than this " +
                "when your app is loading JS modules and/or running `clientLoader` " +
                "functions. Check out https://remix.run/route/hydrate-fallback " +
                "for more information."
              );
            </script><script>window.__reactRouterContext = {"basename":"/","future":{"unstable_middleware":false,"unstable_optimizeDeps":false,"unstable_splitRouteModules":false,"unstable_viteEnvironmentApi":false},"ssr":false,"isSpaMode":true};window.__reactRouterContext.stream = new ReadableStream({start(controller){window.__reactRouterContext.streamController = controller;}}).pipeThrough(new TextEncoderStream());</script><script type="module" async="">import "/assets/manifest-3044e59f.js";
import * as route0 from "/assets/root-DCbqQEAC.js";

  window.__reactRouterRouteModules = {"root":route0};

import("/assets/entry.client-DNAXCeCt.js");</script><div id="_rht_toaster" style="position:fixed;z-index:9999;top:16px;left:16px;right:16px;bottom:16px;pointer-events:none"></div><!--$--><script>window.__reactRouterContext.streamController.enqueue("[{\"_1\":2,\"_6\":-5,\"_7\":-5},\"loaderData\",{\"_3\":-5,\"_4\":-5,\"_5\":-5},\"root\",\"routes/_oh/route\",\"routes/_oh._index/route\",\"actionData\",\"errors\"]\n");</script><!--$--><script>window.__reactRouterContext.streamController.close();</script><!--/$--><!--/$--></body></html>

EDIT: Another error when using the built-from-source version: openhands.runtime.utils.request.RequestHTTPError: 405 Client Error: Method Not Allowed for url: http://localhost:35291/execute_action Details: Method Not Allowed

ManiProjs avatar Mar 17 '25 18:03 ManiProjs

Sorry, I don't follow: when does it give that error, and what else did you change? It's not supposed to be called directly.

enyst avatar Mar 17 '25 18:03 enyst

Forget that. Now it's giving Connection refused. By the way, I kissed building from source goodbye. Now it's giving this error: BadRequestError: litellm.BadRequestError: OpenAIException - Error code: 400 - {'error': {'code': 'Bad Request', 'message': '{"object":"error","message":"400","type":"Failed to parse fc related info to json format!","param":null,"code":400}', 'status': 400}}

Note that I am using the Docker image available in GitHub Packages.

ManiProjs avatar Mar 17 '25 18:03 ManiProjs

Ok, that's a problem with the LLM. For Azure, I think the model needs to be what Azure calls "deployment of the model": model='github/DeepSeek-V3' should be the deployment name. What is the name of your deployment of v3 on Azure?
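
For context, the usual LiteLLM pattern for an Azure deployment looks roughly like the sketch below (all names are placeholders, not your actual settings):

from litellm import completion

# For Azure OpenAI-style endpoints the model is "azure/<deployment-name>",
# plus the resource endpoint, API version and key. All values are placeholders.
response = completion(
    model="azure/<your-deployment-name>",
    api_base="https://<your-resource>.openai.azure.com",
    api_version="2024-08-01-preview",
    api_key="<azure-api-key>",
    messages=[{"role": "user", "content": "ping"}],
)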

enyst avatar Mar 17 '25 18:03 enyst

I use GitHub Models. I don't use Azure.

ManiProjs avatar Mar 17 '25 18:03 ManiProjs

Could you please explain what you mean by "GitHub Models"?

I see this in your settings:

base_url='https://models.inference.ai.azure.com/'

Where does Azure come from?

enyst avatar Mar 17 '25 18:03 enyst

GitHub Models is a new GitHub feature. Go to https://github.com/marketplace/models

Click on "Select a model":

Image

For example, I use GPT-4o mini:

Image

It opens a playground

Image

Now, if you want to use it in any project (including OpenHands), you click on "Use this model":

Image

It shows instructions for how to use it in multiple programming languages:

Image
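
For reference, the Python variant of those instructions boils down to something like the snippet below (a sketch from memory, so the exact code GitHub shows may differ; the endpoint matches the base_url already visible in the settings above):

import os
from openai import OpenAI

# GitHub Models are served from an Azure-hosted inference endpoint;
# the API key is a GitHub personal access token (PAT).
client = OpenAI(
    base_url="https://models.inference.ai.azure.com",
    api_key=os.environ["GITHUB_TOKEN"],  # assumption: PAT exported as GITHUB_TOKEN
)
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)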

ManiProjs avatar Mar 17 '25 18:03 ManiProjs

GitHub Models! GitHub's new feature. Go to https://github.com/marketplace/models

According to that link, they are hosted on Azure:

Image

I'm not sure how using them directly on GitHub works. Could you help me understand which API key you used? I didn't see anything other than a token or Azure.

enyst avatar Mar 17 '25 18:03 enyst

I use my GitHub Personal Access Token (PAT). I don't recommend using a fine-grained token.

ManiProjs avatar Mar 17 '25 18:03 ManiProjs

Ah, I see, you tried to use the token as the API key. I have no idea if that works with litellm.

OpenHands uses the LiteLLM library to support over a hundred models and providers. I don't know if litellm supports these models yet. Maybe we can check there or ask them in an issue?

https://github.com/BerriAI/litellm
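
If LiteLLM does have a github/ provider (the model='github/DeepSeek-V3' entry in the settings above suggests OpenHands at least passes the prefix through), a standalone test would look something like the sketch below. The GITHUB_API_KEY variable name and the github/ prefix are assumptions to verify against the LiteLLM provider docs:

import os
from litellm import completion

# Assumption: LiteLLM's GitHub Models provider reads the PAT from GITHUB_API_KEY.
os.environ["GITHUB_API_KEY"] = "<github-pat>"

response = completion(
    model="github/gpt-4o-mini",
    messages=[{"role": "user", "content": "ping"}],
)
print(response.choices[0].message.content)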

enyst avatar Mar 17 '25 18:03 enyst