
All connection attempts failed. -- I have looked through all the other issues that describe essentially the same problem, and nothing has fixed this. What am I doing wrong?

Open Reaper176 opened this issue 6 months ago • 3 comments

Here is a printout of the error:

All connection attempts failed

Traceback (most recent call last):
  File "/opt/venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 101, in map_httpcore_exceptions
    yield
  File "/opt/venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 394, in handle_async_request
    resp = await self._pool.handle_async_request(req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/venv/lib/python3.12/site-packages/httpcore/_async/connection_pool.py", line 256, in handle_async_request
    raise exc from None
  File "/opt/venv/lib/python3.12/site-packages/httpcore/_async/connection_pool.py", line 236, in handle_async_request
    response = await connection.handle_async_request(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/venv/lib/python3.12/site-packages/httpcore/_async/connection.py", line 101, in handle_async_request
    raise exc
  File "/opt/venv/lib/python3.12/site-packages/httpcore/_async/connection.py", line 78, in handle_async_request
    stream = await self._connect(request)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/venv/lib/python3.12/site-packages/httpcore/_async/connection.py", line 124, in _connect
    stream = await self._network_backend.connect_tcp(**kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/venv/lib/python3.12/site-packages/httpcore/_backends/auto.py", line 31, in connect_tcp
    return await self._backend.connect_tcp(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/venv/lib/python3.12/site-packages/httpcore/_backends/anyio.py", line 113, in connect_tcp
    with map_exceptions(exc_map):
         ^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.12/contextlib.py", line 158, in __exit__
    self.gen.throw(value)
  File "/opt/venv/lib/python3.12/site-packages/httpcore/_exceptions.py", line 14, in map_exceptions
    raise to_exc(exc) from exc
httpcore.ConnectError: All connection attempts failed

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/a0/agent.py", line 322, in monologue
    prompt = await self.prepare_prompt(loop_data=self.loop_data)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/a0/agent.py", line 408, in prepare_prompt
    await self.call_extensions("message_loop_prompts_after", loop_data=loop_data)
  File "/a0/agent.py", line 790, in call_extensions
    await cls(agent=self).execute(**kwargs)
  File "/a0/python/extensions/message_loop_prompts_after/_91_recall_wait.py", line 13, in execute
    await task
  File "/usr/lib/python3.12/asyncio/futures.py", line 289, in __await__
    yield self  # This tells Task to wait for completion.
    ^^^^^^^^^^
  File "/usr/lib/python3.12/asyncio/tasks.py", line 385, in __wakeup
    future.result()
  File "/usr/lib/python3.12/asyncio/futures.py", line 202, in result
    raise self._exception.with_traceback(self._exception_tb)
  File "/usr/lib/python3.12/asyncio/tasks.py", line 314, in __step_run_and_handle_result
    result = coro.send(None)
             ^^^^^^^^^^^^^^^
  File "/a0/python/extensions/message_loop_prompts_after/_50_recall_memories.py", line 60, in search_memories
    query = await self.agent.call_utility_model(
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/a0/agent.py", line 614, in call_utility_model
    async for chunk in (prompt | model).astream({}):
  File "/opt/venv/lib/python3.12/site-packages/langchain_core/runnables/base.py", line 3465, in astream
    async for chunk in self.atransform(input_aiter(), config, **kwargs):
  File "/opt/venv/lib/python3.12/site-packages/langchain_core/runnables/base.py", line 3447, in atransform
    async for chunk in self._atransform_stream_with_config(
  File "/opt/venv/lib/python3.12/site-packages/langchain_core/runnables/base.py", line 2322, in _atransform_stream_with_config
    chunk = await coro_with_context(py_anext(iterator), context)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.12/asyncio/futures.py", line 289, in __await__
    yield self  # This tells Task to wait for completion.
    ^^^^^^^^^^
  File "/usr/lib/python3.12/asyncio/tasks.py", line 385, in __wakeup
    future.result()
  File "/usr/lib/python3.12/asyncio/futures.py", line 202, in result
    raise self._exception.with_traceback(self._exception_tb)
  File "/usr/lib/python3.12/asyncio/tasks.py", line 314, in __step_run_and_handle_result
    result = coro.send(None)
             ^^^^^^^^^^^^^^^
  File "/opt/venv/lib/python3.12/site-packages/langchain_core/runnables/base.py", line 3414, in _atransform
    async for output in final_pipeline:
  File "/opt/venv/lib/python3.12/site-packages/langchain_core/runnables/base.py", line 1489, in atransform
    async for output in self.astream(final, config, **kwargs):
  File "/opt/venv/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 592, in astream
    async for chunk in self._astream(
  File "/opt/venv/lib/python3.12/site-packages/langchain_ollama/chat_models.py", line 755, in _astream
    async for stream_resp in self._acreate_chat_stream(messages, stop, **kwargs):
  File "/opt/venv/lib/python3.12/site-packages/langchain_ollama/chat_models.py", line 575, in _acreate_chat_stream
    async for part in await self._async_client.chat(**chat_params):

>>>  33 stack lines skipped <<<

  File "/opt/venv/lib/python3.12/site-packages/httpx/_client.py", line 1657, in _send_handling_auth
    response = await self._send_handling_redirects(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/venv/lib/python3.12/site-packages/httpx/_client.py", line 1694, in _send_handling_redirects
    response = await self._send_single_request(request)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/venv/lib/python3.12/site-packages/httpx/_client.py", line 1730, in _send_single_request
    response = await transport.handle_async_request(request)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 393, in handle_async_request
    with map_httpcore_exceptions():
         ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.12/contextlib.py", line 158, in __exit__
    self.gen.throw(value)
  File "/opt/venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 118, in map_httpcore_exceptions
    raise mapped_exc(message) from exc
httpx.ConnectError: All connection attempts failed

Also, a screenshot of my settings. Each of the other fields is filled out in a similar way. Please note that localhost and 127.0.0.1 both produce this same error.

[screenshot: Agent Zero model settings]

ollama is in fact running

[screenshot: ollama running]

I am able to connect to ollama through OpenWebUI with no issues. I will also note that I have tried this both with OpenWebUI running and with it stopped, just to make sure the two are not somehow conflicting.

I have also set OLLAMA_BASE_URL="http://127.0.0.1:11434" in the .env
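(Worth checking for anyone debugging the same thing: with the dockerized agent-zero, 127.0.0.1 inside the container refers to the container itself, not the host, so that URL cannot reach an ollama running on the host. A quick way to confirm, assuming the container is named agent-zero — adjust to whatever docker ps reports — and that curl exists in the image:)

docker ps --format '{{.Names}}'                                       # find the actual container name
docker exec -it agent-zero curl -s http://127.0.0.1:11434/api/tags    # hits the container itself; expected to fail
docker exec -it agent-zero curl -s http://172.17.0.1:11434/api/tags   # hits the host via the docker0 gateway (works once ollama listens beyond localhost, as described in the reply below)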

I have bashed my head against this for a few hours and am now thoroughly confused. I even watched a few videos in languages I could not understand, just to try to get an idea of what was working for other people. Any help would be greatly appreciated.

Reaper176 avatar Jun 11 '25 15:06 Reaper176

Hi Reaper176, I assume you are using the dockerized version of agent-zero, and that ollama is running on your computer rather than inside agent-zero's Docker container? If so, you have to do the following to point agent-zero at the ollama instance running on the host OS (assuming you are on a Linux-based OS).

  1. Get the IP address of the Docker bridge interface on the host machine using ifconfig docker0, and copy the IP address next to inet (a one-liner for this is sketched after the screenshot).

[screenshot: ifconfig output showing the docker0 inet address]
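(For reference, a one-liner that prints just that address — a sketch assuming the iproute2 ip tool, which most modern distros ship:)

ip -4 addr show docker0 | awk '/inet /{sub(/\/.*/, "", $2); print $2}'   # typically prints 172.17.0.1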

  2. By default, the ollama port is only exposed on localhost. To expose it on other interfaces, add the following under the [Service] section:

[Service]
Environment="OLLAMA_HOST=0.0.0.0:11434"

in /etc/systemd/system/ollama.service

[screenshot: ollama.service with the OLLAMA_HOST environment line]

After that, run sudo systemctl daemon-reload && sudo systemctl restart ollama.service
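(A side note: instead of editing the unit file in place, a drop-in override survives package upgrades — a sketch using systemctl edit:)

sudo systemctl edit ollama.service
# in the editor that opens, add:
#   [Service]
#   Environment="OLLAMA_HOST=0.0.0.0:11434"
sudo systemctl daemon-reload && sudo systemctl restart ollama.service
systemctl show ollama.service | grep OLLAMA_HOST   # confirm the variable is set on the running unit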

Now you can check that ollama is available on the Docker interface IP you copied in step 1: curl http://172.17.0.1:11434/. If this command does not work, make sure the port is allowed through the OS firewall (example rules below).

[screenshot: curl response from http://172.17.0.1:11434]
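(Example firewall rules — which one applies depends on the distro; ufw for Ubuntu-style systems, firewalld for Fedora-style ones:)

sudo ufw allow from 172.17.0.0/16 to any port 11434 proto tcp   # ufw: let the default Docker bridge subnet reach ollama
sudo firewall-cmd --add-port=11434/tcp --permanent && sudo firewall-cmd --reload   # firewalld equivalent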

Finally, set OLLAMA_BASE_URL="http://172.17.0.1:11434" in the .env file, restart from the web UI, and it will work. (A quick end-to-end check before restarting is sketched below.)
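(A sketch of that check, assuming the container is named agent-zero and that the app lives in /a0, as the traceback paths suggest:)

docker exec -it agent-zero curl -s http://172.17.0.1:11434/api/version   # ollama should answer with its version
docker exec -it agent-zero grep OLLAMA /a0/.env                          # confirm the container sees the new base URL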

unamewiki avatar Jun 13 '25 15:06 unamewiki

@unamewiki I do not understand how to proceed with your instructions.

This is my ollama.service

[screenshot: ollama.service contents]

and the curl command shows that it is available.

[screenshot: curl output]

I did try reloading and restarting ollama, just on the off chance that it had not picked something up, but no change. :(

Also, I did this just to check:

[screenshot]

Still the same error:

httpx.ConnectError: All connection attempts failed

Probably not helpful, but OpenWebUI (also running in Docker) has no problem seeing ollama.

[screenshot: OpenWebUI connected to ollama]

Reaper176 avatar Jun 15 '25 17:06 Reaper176

Note to the dev: the circled area looks like a really good place to put a "url = x.x.x.x:port" field.

[screenshot: settings UI with the circled area]

Reaper176 avatar Jun 15 '25 17:06 Reaper176