
AgentZero can't connect to Ollama

Open Theblackcat98 opened this issue 5 months ago • 9 comments

I am getting a connection error; it seems that for some reason the container cannot reach the Ollama IP address. NOTE: Ollama is reachable on my local network (verified from another device, which gets a proper response, so Ollama is running). I have tried both http://localhost:11434 and http://192.168.1.199:11434.
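Two things are worth checking here before blaming the container runtime (the commands below are a diagnostic sketch; the IP address is the one from this report): inside a container, localhost refers to the container itself, not the host, and by default Ollama binds only to 127.0.0.1 on the host, which makes connections from any non-local address fail with exactly this kind of "connection refused".

```shell
# On the host: check which address Ollama is listening on.
# "127.0.0.1:11434" means only host-local clients can connect;
# a container dialing 192.168.1.199:11434 would then be refused.
ss -tln | grep 11434

# If it is bound to 127.0.0.1 only, make it listen on all interfaces
# (systemd install assumed; adjust for your setup):
#   sudo systemctl edit ollama      # add: Environment="OLLAMA_HOST=0.0.0.0"
#   sudo systemctl restart ollama

# From inside the container: remember that localhost will NOT reach the host.
curl -s --connect-timeout 2 http://192.168.1.199:11434/api/version || echo "unreachable"
```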

Error Log


litellm.APIConnectionError: OllamaException - Cannot connect to host 192.168.1.199:11434 ssl:default [Connect call failed ('192.168.1.199', 11434)]

Traceback (most recent call last):
  File "/opt/venv/lib/python3.12/site-packages/aiohttp/connector.py", line 1268, in _wrap_create_connection
    sock = await aiohappyeyeballs.start_connection(
  File "/opt/venv/lib/python3.12/site-packages/aiohappyeyeballs/impl.py", line 122, in start_connection
    raise first_exception
  File "/opt/venv/lib/python3.12/site-packages/aiohappyeyeballs/impl.py", line 73, in start_connection
    sock = await _connect_sock(
  File "/opt/venv/lib/python3.12/site-packages/aiohappyeyeballs/impl.py", line 208, in _connect_sock
    await loop.sock_connect(sock, address)
  File "/usr/lib/python3.12/asyncio/selector_events.py", line 651, in sock_connect
    return await fut
  File "/usr/lib/python3.12/asyncio/futures.py", line 289, in __await__
    yield self  # This tells Task to wait for completion.
  File "/usr/lib/python3.12/asyncio/tasks.py", line 385, in __wakeup
    future.result()
  File "/usr/lib/python3.12/asyncio/futures.py", line 202, in result
    raise self._exception.with_traceback(self._exception_tb)
  File "/usr/lib/python3.12/asyncio/selector_events.py", line 691, in _sock_connect_cb
    raise OSError(err, f'Connect call failed {address}')
ConnectionRefusedError: [Errno 111] Connect call failed ('192.168.1.199', 11434)

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/opt/venv/lib/python3.12/site-packages/litellm/llms/custom_httpx/aiohttp_transport.py", line 59, in map_aiohttp_exceptions
    yield
  File "/opt/venv/lib/python3.12/site-packages/litellm/llms/custom_httpx/aiohttp_transport.py", line 213, in handle_async_request
    response = await client_session.request(
  File "/opt/venv/lib/python3.12/site-packages/aiohttp/client.py", line 1488, in __aenter__
    self._resp: _RetType = await self._coro
  File "/opt/venv/lib/python3.12/site-packages/aiohttp/client.py", line 770, in _request
    resp = await handler(req)
  File "/opt/venv/lib/python3.12/site-packages/aiohttp/client.py", line 725, in _connect_and_send_request
    conn = await self._connector.connect(
  File "/opt/venv/lib/python3.12/site-packages/aiohttp/connector.py", line 642, in connect
    proto = await self._create_connection(req, traces, timeout)
  File "/opt/venv/lib/python3.12/site-packages/aiohttp/connector.py", line 1209, in _create_connection
    _, proto = await self._create_direct_connection(req, traces, timeout)
  File "/opt/venv/lib/python3.12/site-packages/aiohttp/connector.py", line 1581, in _create_direct_connection
    raise last_exc
  File "/opt/venv/lib/python3.12/site-packages/aiohttp/connector.py", line 1550, in _create_direct_connection
    transp, proto = await self._wrap_create_connection(
  File "/opt/venv/lib/python3.12/site-packages/aiohttp/connector.py", line 1291, in _wrap_create_connection
    raise client_error(req.connection_key, exc) from exc
aiohttp.client_exceptions.ClientConnectorError: Cannot connect to host 192.168.1.199:11434 ssl:default [Connect call failed ('192.168.1.199', 11434)]

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/opt/venv/lib/python3.12/site-packages/litellm/llms/custom_httpx/llm_http_handler.py", line 111, in _make_common_async_call
    response = await async_httpx_client.post(
  File "/opt/venv/lib/python3.12/site-packages/litellm/litellm_core_utils/logging_utils.py", line 135, in async_wrapper
    result = await func(*args, **kwargs)
  File "/opt/venv/lib/python3.12/site-packages/litellm/llms/custom_httpx/http_handler.py", line 288, in post
    return await self.single_connection_post_request(
  File "/opt/venv/lib/python3.12/site-packages/litellm/llms/custom_httpx/http_handler.py", line 513, in single_connection_post_request
    response = await client.send(req, stream=stream)
  File "/opt/venv/lib/python3.12/site-packages/httpx/_client.py", line 1629, in send
    response = await self._send_handling_auth(
  File "/opt/venv/lib/python3.12/site-packages/httpx/_client.py", line 1657, in _send_handling_auth
    response = await self._send_handling_redirects(
  File "/opt/venv/lib/python3.12/site-packages/httpx/_client.py", line 1694, in _send_handling_redirects
    response = await self._send_single_request(request)
  File "/opt/venv/lib/python3.12/site-packages/httpx/_client.py", line 1730, in _send_single_request
    response = await transport.handle_async_request(request)
  File "/opt/venv/lib/python3.12/site-packages/litellm/llms/custom_httpx/aiohttp_transport.py", line 206, in handle_async_request
    with map_aiohttp_exceptions():
  File "/usr/lib/python3.12/contextlib.py", line 158, in __exit__
    self.gen.throw(value)
  File "/opt/venv/lib/python3.12/site-packages/litellm/llms/custom_httpx/aiohttp_transport.py", line 73, in map_aiohttp_exceptions
    raise mapped_exc(message) from exc
httpx.ConnectError: Cannot connect to host 192.168.1.199:11434 ssl:default [Connect call failed ('192.168.1.199', 11434)]

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/opt/venv/lib/python3.12/site-packages/litellm/main.py", line 538, in acompletion
    response = await init_response
  File "/opt/venv/lib/python3.12/site-packages/litellm/llms/custom_httpx/llm_http_handler.py", line 600, in acompletion_stream_function
    completion_stream, _response_headers = await self.make_async_call_stream_helper(
  File "/opt/venv/lib/python3.12/site-packages/litellm/llms/custom_httpx/llm_http_handler.py", line 659, in make_async_call_stream_helper
    response = await self._make_common_async_call(
  File "/opt/venv/lib/python3.12/site-packages/litellm/llms/custom_httpx/llm_http_handler.py", line 138, in _make_common_async_call
    raise self._handle_error(e=e, provider_config=provider_config)
  File "/opt/venv/lib/python3.12/site-packages/litellm/llms/custom_httpx/llm_http_handler.py", line 2403, in _handle_error
    raise provider_config.get_error_class(
litellm.llms.ollama.common_utils.OllamaError: Cannot connect to host 192.168.1.199:11434 ssl:default [Connect call failed ('192.168.1.199', 11434)]

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/a0/agent.py", line 323, in monologue
    prompt = await self.prepare_prompt(loop_data=self.loop_data)
  File "/a0/agent.py", line 411, in prepare_prompt
    await self.call_extensions("message_loop_prompts_after", loop_data=loop_data)
  File "/a0/agent.py", line 831, in call_extensions
    await cls(agent=self).execute(**kwargs)
  File "/a0/python/extensions/message_loop_prompts_after/_91_recall_wait.py", line 13, in execute
    await task
  File "/usr/lib/python3.12/asyncio/futures.py", line 289, in __await__
    yield self  # This tells Task to wait for completion.
  File "/usr/lib/python3.12/asyncio/tasks.py", line 385, in __wakeup
    future.result()

41 stack lines skipped <<<

  File "/a0/models.py", line 272, in unified_call
    _completion = await acompletion(
  File "/opt/venv/lib/python3.12/site-packages/litellm/utils.py", line 1552, in wrapper_async
    raise e
  File "/opt/venv/lib/python3.12/site-packages/litellm/utils.py", line 1410, in wrapper_async
    result = await original_function(*args, **kwargs)
  File "/opt/venv/lib/python3.12/site-packages/litellm/main.py", line 557, in acompletion
    raise exception_type(
  File "/opt/venv/lib/python3.12/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 2293, in exception_type
    raise e
  File "/opt/venv/lib/python3.12/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 2262, in exception_type
    raise APIConnectionError(
litellm.exceptions.APIConnectionError: litellm.APIConnectionError: OllamaException - Cannot connect to host 192.168.1.199:11434 ssl:default [Connect call failed ('192.168.1.199', 11434)]


Theblackcat98 avatar Jul 18 '25 12:07 Theblackcat98

To get agent-zero to communicate outside the Docker container, point it at Ollama via http://host.docker.internal:11434, assuming Ollama is installed on your local system. After that, you'll also need to increase the model's context length, e.g. num_ctx=100000; Ollama's default context length is not big enough to fully fit the AgentZero template.
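To sanity-check that setup from inside the agent-zero container, something like the following should work; the model name is only an example, and on plain Linux Docker, host.docker.internal resolves only if the container was started with --add-host=host.docker.internal:host-gateway:

```shell
# 1) Can the container reach the host's Ollama at all?
curl -s --connect-timeout 2 http://host.docker.internal:11434/api/tags

# 2) Ollama's default context window is small; a larger one can be
#    requested per call via the "options" field (model name is an example):
curl -s --connect-timeout 2 http://host.docker.internal:11434/api/generate -d '{
  "model": "llama3.1",
  "prompt": "hello",
  "options": { "num_ctx": 100000 }
}'
```

Setting num_ctx in a Modelfile (or in agent-zero's model kwargs) achieves the same thing persistently instead of per request.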

qwerty108109 avatar Jul 20 '25 00:07 qwerty108109

That would be true if I was running Docker... I'm using Agent-Zero with Podman, which led me to search whether Podman has an alternative to host.docker.internal.

IT DOES!

It's http://host.containers.internal, so I tried http://host.containers.internal:11434 and it works!
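For reference, the equivalent check from inside a Podman container (host.containers.internal is provided by Podman itself; on older Podman versions you may need the host's LAN IP instead):

```shell
# From inside the Podman container: this hostname resolves to the host machine.
curl -s --connect-timeout 2 http://host.containers.internal:11434/api/tags
```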

Theblackcat98 avatar Jul 20 '25 05:07 Theblackcat98

Well, I'm back.

Leaving this log here for now.

LOGS

User message:

1

Initializing VectorDB...
Found 2 knowledge files in /a0/knowledge/default/main, processing...
Processed 10 documents from 2 files.
Processed 0 documents from 0 files.
Processed 0 documents from 0 files.
Processed 0 documents from 0 files.
Processed 0 documents from 0 files.
Processed 0 documents from 0 files.
Processed 0 documents from 0 files.
Processed 0 documents from 0 files.
Found 1 knowledge files in /a0/instruments, processing...
Processed 1 documents from 1 files.
Task exception was never retrieved
future: <Task finished name='Task-151' coro=<RecallSolutions.search_solutions() done, defined at /a0/python/extensions/message_loop_prompts_after/_51_recall_solutions.py:27> exception=litellm.APIConnectionError: OllamaException - [Errno 111] Connection refused>
Traceback (most recent call last):
  File "/opt/venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 101, in map_httpcore_exceptions
    yield
  File "/opt/venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 250, in handle_request
    resp = self._pool.handle_request(req)
  File "/opt/venv/lib/python3.12/site-packages/httpcore/_sync/connection_pool.py", line 256, in handle_request
    raise exc from None
  File "/opt/venv/lib/python3.12/site-packages/httpcore/_sync/connection_pool.py", line 236, in handle_request
    response = connection.handle_request(
  File "/opt/venv/lib/python3.12/site-packages/httpcore/_sync/connection.py", line 101, in handle_request
    raise exc
  File "/opt/venv/lib/python3.12/site-packages/httpcore/_sync/connection.py", line 78, in handle_request
    stream = self._connect(request)
  File "/opt/venv/lib/python3.12/site-packages/httpcore/_sync/connection.py", line 124, in _connect
    stream = self._network_backend.connect_tcp(**kwargs)
  File "/opt/venv/lib/python3.12/site-packages/httpcore/_backends/sync.py", line 207, in connect_tcp
    with map_exceptions(exc_map):
  File "/usr/lib/python3.12/contextlib.py", line 158, in __exit__
    self.gen.throw(value)
  File "/opt/venv/lib/python3.12/site-packages/httpcore/_exceptions.py", line 14, in map_exceptions
    raise to_exc(exc) from exc
httpcore.ConnectError: [Errno 111] Connection refused

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/opt/venv/lib/python3.12/site-packages/litellm/main.py", line 4012, in embedding
    response = ollama_embeddings_fn(  # type: ignore
  File "/opt/venv/lib/python3.12/site-packages/litellm/llms/ollama/completion/handler.py", line 114, in ollama_embeddings
    response = litellm.module_level_client.post(url=api_base, json=data)
  File "/opt/venv/lib/python3.12/site-packages/litellm/llms/custom_httpx/http_handler.py", line 782, in post
    raise e
  File "/opt/venv/lib/python3.12/site-packages/litellm/llms/custom_httpx/http_handler.py", line 761, in post
    response = self.client.send(req, stream=stream)
  File "/opt/venv/lib/python3.12/site-packages/httpx/_client.py", line 914, in send
    response = self._send_handling_auth(
  File "/opt/venv/lib/python3.12/site-packages/httpx/_client.py", line 942, in _send_handling_auth
    response = self._send_handling_redirects(
  File "/opt/venv/lib/python3.12/site-packages/httpx/_client.py", line 979, in _send_handling_redirects
    response = self._send_single_request(request)
  File "/opt/venv/lib/python3.12/site-packages/httpx/_client.py", line 1014, in _send_single_request
    response = transport.handle_request(request)
  File "/opt/venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 249, in handle_request
    with map_httpcore_exceptions():
  File "/usr/lib/python3.12/contextlib.py", line 158, in __exit__
    self.gen.throw(value)
  File "/opt/venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 118, in map_httpcore_exceptions
    raise mapped_exc(message) from exc
httpx.ConnectError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3.12/asyncio/tasks.py", line 314, in __step_run_and_handle_result
    result = coro.send(None)
  File "/a0/python/extensions/message_loop_prompts_after/_51_recall_solutions.py", line 63, in search_solutions
    db = await Memory.get(self.agent)
  File "/a0/python/helpers/memory.py", line 69, in get
    db, created = Memory.initialize(
  File "/a0/python/helpers/memory.py", line 173, in initialize
    index = faiss.IndexFlatIP(len(embedder.embed_query("example")))
  File "/opt/venv/lib/python3.12/site-packages/langchain/embeddings/cache.py", line 194, in embed_query
    return self.underlying_embeddings.embed_query(text)
  File "/a0/models.py", line 360, in embed_query
    resp = embedding(model=self.model_name, input=[text], **self.kwargs)
  File "/opt/venv/lib/python3.12/site-packages/litellm/utils.py", line 1306, in wrapper
    raise e
  File "/opt/venv/lib/python3.12/site-packages/litellm/utils.py", line 1181, in wrapper
    result = original_function(*args, **kwargs)
  File "/opt/venv/lib/python3.12/site-packages/litellm/main.py", line 4242, in embedding
    raise exception_type(
  File "/opt/venv/lib/python3.12/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 2293, in exception_type
    raise e
  File "/opt/venv/lib/python3.12/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 2262, in exception_type
    raise APIConnectionError(
litellm.exceptions.APIConnectionError: litellm.APIConnectionError: OllamaException - [Errno 111] Connection refused

Task exception was never retrieved
future: <Task finished name='Task-4531' coro=<RecallSolutions.search_solutions() done, defined at /a0/python/extensions/message_loop_prompts_after/_51_recall_solutions.py:27> exception=litellm.APIConnectionError: OllamaException - Client error '404 Not Found' for url 'http://host.containers.internal:11434/api/embed' For more information check: https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/404>
Traceback (most recent call last):
  File "/opt/venv/lib/python3.12/site-packages/litellm/main.py", line 4012, in embedding
    response = ollama_embeddings_fn(  # type: ignore
  File "/opt/venv/lib/python3.12/site-packages/litellm/llms/ollama/completion/handler.py", line 114, in ollama_embeddings
    response = litellm.module_level_client.post(url=api_base, json=data)
  File "/opt/venv/lib/python3.12/site-packages/litellm/llms/custom_httpx/http_handler.py", line 780, in post
    raise e
  File "/opt/venv/lib/python3.12/site-packages/litellm/llms/custom_httpx/http_handler.py", line 762, in post
    response.raise_for_status()
  File "/opt/venv/lib/python3.12/site-packages/httpx/_models.py", line 829, in raise_for_status
    raise HTTPStatusError(message, request=request, response=self)
httpx.HTTPStatusError: Client error '404 Not Found' for url 'http://host.containers.internal:11434/api/embed' For more information check: https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/404

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3.12/asyncio/tasks.py", line 314, in __step_run_and_handle_result
    result = coro.send(None)
  File "/a0/python/extensions/message_loop_prompts_after/_51_recall_solutions.py", line 63, in search_solutions
    db = await Memory.get(self.agent)
  File "/a0/python/helpers/memory.py", line 69, in get
    db, created = Memory.initialize(
  File "/a0/python/helpers/memory.py", line 173, in initialize
    index = faiss.IndexFlatIP(len(embedder.embed_query("example")))
  File "/opt/venv/lib/python3.12/site-packages/langchain/embeddings/cache.py", line 194, in embed_query
    return self.underlying_embeddings.embed_query(text)
  File "/a0/models.py", line 360, in embed_query
    resp = embedding(model=self.model_name, input=[text], **self.kwargs)
  File "/opt/venv/lib/python3.12/site-packages/litellm/utils.py", line 1306, in wrapper
    raise e
  File "/opt/venv/lib/python3.12/site-packages/litellm/utils.py", line 1181, in wrapper
    result = original_function(*args, **kwargs)
  File "/opt/venv/lib/python3.12/site-packages/litellm/main.py", line 4242, in embedding
    raise exception_type(
  File "/opt/venv/lib/python3.12/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 2293, in exception_type
    raise e
  File "/opt/venv/lib/python3.12/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 2262, in exception_type
    raise APIConnectionError(
litellm.exceptions.APIConnectionError: litellm.APIConnectionError: OllamaException - Client error '404 Not Found' for url 'http://host.containers.internal:11434/api/embed' For more information check: https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/404

(the same pair of tracebacks repeats verbatim for Task-9074 and Task-10481)

Response:

Okay, the [EXTRAS] info shows the current datetime as 2025-07-19 23:37:05-07:00. I need to figure out what the user is asking for here. The message "1" is pretty vague, so maybe they want me to list tasks or check the scheduler status?

First, I should check if there are any existing tasks in the scheduler. The user might not have created any tasks yet, but I need to confirm. The available tools include scheduler:list_tasks, which can list all tasks. Let me use that to see what's currently scheduled.

Wait, the user didn't specify any particular task, so maybe they just need a general overview. Alternatively, maybe they want to create a new task. But since the message is just "1", it's unclear. The best approach is to check first. Let me call the scheduler:list_tasks tool without any filters to get a comprehensive list.

{
  "thoughts": [
    "The user provided a simple message '1' which might indicate a request for task information",
    "I should check the scheduler status to see if there are any existing tasks",
    "Using scheduler:list_tasks to get a comprehensive list of tasks"
  ],
  "headline": "Checking scheduler status for task information",
  "tool_name": "scheduler:list_tasks",
  "tool_args": {}
}

/a0/python/tools/scheduler.py:152: SyntaxWarning: invalid escape sequence '\d'
  cron_regex = "^((((\d+,)+\d+|(\d+(/|-|#)\d+)|\d+L?|\*(/\d+)?|L(-\d+)?|\?|[A-Z]{3}(-[A-Z]{3})?) ?){5,7})$"

A0: Using tool 'scheduler'

A0: Response from tool 'scheduler' []

Response: Okay, the user's message was "1", which is pretty vague. I tried to list all scheduled tasks using the scheduler:list_tasks tool, but there were no results. That means there are no existing tasks in the system right now. Since the user didn't specify any particular task or action, I need to figure out what they might need next.

The [EXTRAS] info shows the current datetime as 2025-07-19 23:37:41-07:00. Maybe the user wants to create a new task? Or perhaps they need help with something else related to the scheduler. Since there are no tasks scheduled, I should ask them to clarify what they want to do. They might need to create a task, check task creation options, or maybe they have a different request entirely.

I should respond by letting them know there are no tasks currently scheduled and ask for more details on what they need assistance with. That way, I can guide them to provide more specific instructions or help them create a new task if that's what they want.

{
  "thoughts": [
    "No tasks found in the scheduler",
    "User needs to clarify what action they want to take",
    "Current datetime: 2025-07-19 23:37:41-07:00"
  ],
  "headline": "No scheduled tasks found, request clarification",
  "tool_name": "response",
  "tool_args": {
    "text": "No tasks are currently scheduled. Could you please clarify what action you'd like me to take? For example, you can ask me to create a new task, check task details, or provide information about the scheduler."
  }
}

Theblackcat98 avatar Jul 20 '25 07:07 Theblackcat98
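A note on the 404 for /api/embed in the log above: /api/embed is the newer batch embedding endpoint; older Ollama releases only expose /api/embeddings, so a 404 there usually suggests the host's Ollama is simply too old for the call litellm is making (the exact version cutoff is an assumption here, check the Ollama release notes; the embedding model name below is only an example). A quick way to check:

```shell
# Ask the server which version it runs and whether the endpoint exists.
curl -s --connect-timeout 2 http://host.containers.internal:11434/api/version
curl -s --connect-timeout 2 -o /dev/null -w "%{http_code}\n" \
  http://host.containers.internal:11434/api/embed \
  -d '{"model": "all-minilm", "input": ["test"]}'

# If /api/embed returns 404 while /api/embeddings works, upgrading Ollama
# on the host should resolve it, e.g. with the official install script:
#   curl -fsSL https://ollama.com/install.sh | sh
```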

More Embedding-Related Errors and Bugs

Traceback (most recent call last):
  File "/opt/venv/lib/python3.12/site-packages/litellm/main.py", line 4012, in embedding
    response = ollama_embeddings_fn(  # type: ignore
  File "/opt/venv/lib/python3.12/site-packages/litellm/llms/ollama/completion/handler.py", line 117, in ollama_embeddings
    return _process_ollama_embedding_response(
  File "/opt/venv/lib/python3.12/site-packages/litellm/llms/ollama/completion/handler.py", line 50, in _process_ollama_embedding_response
    logging_obj.debug(
AttributeError: 'Logging' object has no attribute 'debug'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/a0/agent.py", line 323, in monologue
    prompt = await self.prepare_prompt(loop_data=self.loop_data)
  File "/a0/agent.py", line 411, in prepare_prompt
    await self.call_extensions("message_loop_prompts_after", loop_data=loop_data)
  File "/a0/agent.py", line 831, in call_extensions
    await cls(agent=self).execute(**kwargs)
  File "/a0/python/extensions/message_loop_prompts_after/_91_recall_wait.py", line 13, in execute
    await task
  File "/usr/lib/python3.12/asyncio/futures.py", line 289, in __await__
    yield self  # This tells Task to wait for completion.
  File "/usr/lib/python3.12/asyncio/tasks.py", line 385, in __wakeup
    future.result()
  File "/usr/lib/python3.12/asyncio/futures.py", line 202, in result
    raise self._exception.with_traceback(self._exception_tb)
  File "/usr/lib/python3.12/asyncio/tasks.py", line 316, in __step_run_and_handle_result
    result = coro.throw(exc)
  File "/a0/python/extensions/message_loop_prompts_after/_50_recall_memories.py", line 69, in search_memories
    memories = await db.search_similarity_threshold(
  File "/a0/python/helpers/memory.py", line 305, in search_similarity_threshold
    return await self.db.asearch(
  File "/opt/venv/lib/python3.12/site-packages/langchain_core/vectorstores/base.py", line 380, in asearch
    docs_and_similarities = await self.asimilarity_search_with_relevance_scores(
  File "/opt/venv/lib/python3.12/site-packages/langchain_core/vectorstores/base.py", line 603, in asimilarity_search_with_relevance_scores
    docs_and_similarities = await self._asimilarity_search_with_relevance_scores(
  File "/opt/venv/lib/python3.12/site-packages/langchain_community/vectorstores/faiss.py", line 1327, in _asimilarity_search_with_relevance_scores
    docs_and_scores = await self.asimilarity_search_with_score(
  File "/opt/venv/lib/python3.12/site-packages/langchain_community/vectorstores/faiss.py", line 549, in asimilarity_search_with_score
    embedding = await self._aembed_query(query)
  File "/opt/venv/lib/python3.12/site-packages/langchain_community/vectorstores/faiss.py", line 272, in _aembed_query
    return await self.embedding_function.aembed_query(text)
  File "/opt/venv/lib/python3.12/site-packages/langchain/embeddings/cache.py", line 217, in aembed_query
    return await self.underlying_embeddings.aembed_query(text)
  File "/opt/venv/lib/python3.12/site-packages/langchain_core/embeddings/embeddings.py", line 78, in aembed_query
    return await run_in_executor(None, self.embed_query, text)
  File "/opt/venv/lib/python3.12/site-packages/langchain_core/runnables/config.py", line 622, in run_in_executor
    return await asyncio.get_running_loop().run_in_executor(
  File "/usr/lib/python3.12/asyncio/futures.py", line 289, in __await__
    yield self  # This tells Task to wait for completion.
  File "/usr/lib/python3.12/asyncio/tasks.py", line 385, in __wakeup
    future.result()
  File "/usr/lib/python3.12/asyncio/futures.py", line 202, in result
    raise self._exception.with_traceback(self._exception_tb)
  File "/usr/lib/python3.12/concurrent/futures/thread.py", line 59, in run
    result = self.fn(*self.args, **self.kwargs)

>>> 25 stack lines skipped <<<

  File "/opt/venv/lib/python3.12/site-packages/litellm/main.py", line 4242, in embedding
    raise exception_type(
  File "/opt/venv/lib/python3.12/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 2293, in exception_type
    raise e
  File "/opt/venv/lib/python3.12/site-packages/litellm/litellm_core_utils/exception_mapping_utils.py", line 2269, in exception_type
    raise APIConnectionError(
litellm.exceptions.APIConnectionError: litellm.APIConnectionError: 'Logging' object has no attribute 'debug'

Traceback (most recent call last):
  File "/opt/venv/lib/python3.12/site-packages/litellm/main.py", line 4012, in embedding
    response = ollama_embeddings_fn(  # type: ignore
  File "/opt/venv/lib/python3.12/site-packages/litellm/llms/ollama/completion/handler.py", line 117, in ollama_embeddings
    return _process_ollama_embedding_response(
  File "/opt/venv/lib/python3.12/site-packages/litellm/llms/ollama/completion/handler.py", line 50, in _process_ollama_embedding_response
    logging_obj.debug(
AttributeError: 'Logging' object has no attribute 'debug'
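Worth noting: this second traceback never reaches the network. It fails inside litellm's Ollama embedding handler because its `Logging` object has no `debug` method, and the generic `APIConnectionError` mapping then masks that internal bug. Upgrading or pinning litellm is the proper fix; as a stopgap, the missing method can be shimmed. Below is a sketch of the shim pattern on a stand-in class — applying it to litellm's real `Logging` class is an assumption, since that class's import path varies across litellm versions.

```python
# Stopgap sketch for the AttributeError above, shown on a stand-in class.
# litellm's handler calls logging_obj.debug(...); if the installed litellm's
# Logging class lacks `debug`, a no-op shim avoids the crash until an upgrade.

class Logging:  # stand-in for litellm's Logging object (hypothetical here)
    pass

if not hasattr(Logging, "debug"):
    # Attach a no-op debug method so logging_obj.debug(...) calls succeed.
    Logging.debug = lambda self, *args, **kwargs: None

Logging().debug("embedding response received")  # no longer raises
```

The same `hasattr` guard would be applied to litellm's class at startup, before any embedding call is made.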

Theblackcat98 avatar Jul 20 '25 08:07 Theblackcat98

Just to clarify are you still having trouble or were you able to get it working?

qwerty108109 avatar Jul 20 '25 21:07 qwerty108109

To be honest, it's hit and miss. Did you glance over the logs? (I didn't change anything between them.) The two logs show two different errors, and the more baffling one is the first. Using http://host.containers.internal:11434 it connects to Ollama just fine, and I can see the model correctly generating text. But at some point during the request flow it throws those exceptions, including the API connection error, which makes no sense since it had just generated text properly. I tried asking another simple question, and I confirmed that the embedding model is also properly recognized and being used.
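To rule out a plain connectivity problem, Ollama's `/api/tags` endpoint can be probed from inside the container. A minimal sketch, assuming the Podman host alias (Docker exposes `host.docker.internal` instead):

```python
# Minimal probe of Ollama's documented /api/tags endpoint, distinguishing
# "server reachable" from "connection refused" (the failure in the first log).
import json
import urllib.error
import urllib.request

def probe_ollama(base_url: str, timeout: float = 5.0) -> str:
    """Return a short diagnosis of whether Ollama is reachable at base_url."""
    try:
        with urllib.request.urlopen(base_url + "/api/tags", timeout=timeout) as r:
            models = [m["name"] for m in json.loads(r.read()).get("models", [])]
            return f"ok, models: {models}"
    except urllib.error.HTTPError as e:
        # The server answered, so connectivity is fine; only the path/verb is off.
        return f"reached server but got HTTP {e.code}"
    except (urllib.error.URLError, OSError) as e:
        # Same class of failure as "Connect call failed" in the traceback.
        return f"cannot connect: {e}"

if __name__ == "__main__":
    # host.containers.internal is the Podman alias; adjust for your runtime.
    print(probe_ollama("http://host.containers.internal:11434"))
```

If this prints a model list but embedding calls still fail, the problem is past the network layer rather than the container-to-host route.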

Theblackcat98 avatar Jul 20 '25 21:07 Theblackcat98

Let me know if you need more information or how I can help trace the issue back. It's confusing to see the embedding fail at first (my first comment reopening the issue) and then, on a second try, get a different error.

Theblackcat98 avatar Jul 22 '25 06:07 Theblackcat98

yo, just dropping in cause this sounds painfully familiar — you might be hitting one of those weird semi-deployment bugs where things half-load and then explode randomly.

what you're describing (works on first query, then API connection error during follow-up) smells like a classic deployment deadlock: the retriever/embedding/model layers don’t fully sync or race during cold boot. sometimes they handshake, sometimes they silently betray you.

we ran into something almost identical. ended up mapping it as part of a failure pattern called “pre-deploy collapse” (when everything looks fine until you actually use it). happy to walk you through what we did to fix it, just let me know.

good luck out there. embedding bugs are a dark forest.

onestardao avatar Aug 07 '25 15:08 onestardao

I got the same problem. Is there any update? Thanks a lot.

maxmasetti avatar Aug 13 '25 14:08 maxmasetti