LLocalSearch
Containers network communication fails on Windows (Solution)
Describe the bug
Hello,
Starting from scratch on Windows 10, I installed Ollama, Docker Desktop, and LLocalSearch. After starting it with docker compose up, I got errors when using the web interface: the backend could not communicate with searxng, and searxng could not communicate with redis.

Example errors:
```
backend-1  | Aug 26 14:01:01.960 WRN llm_tools/simple_websearch.go:50 Error making the request error="Get \"http://searxng:8080/?q=jeu+vid%C3%A9o+recommand%C3%A9&format=json\": dial tcp: lookup searxng on 127.0.0.11:53: no such host"
searxng-1  |   File "/usr/lib/python3.12/site-packages/redis/connection.py", line 1074, in get_connection
searxng-1  |     connection.connect()
searxng-1  |   File "/usr/lib/python3.12/site-packages/redis/connection.py", line 283, in connect
searxng-1  |     raise ConnectionError(self._error_message(e))
searxng-1  | redis.exceptions.ConnectionError: Error -2 connecting to redis:6379. Name does not resolve.
```
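Both tracebacks boil down to DNS: inside the compose network, the service names searxng and redis do not resolve. As a rough illustration (not part of LLocalSearch), the same "Name does not resolve" condition can be reproduced with plain Python name resolution:

```python
import socket

def can_resolve(hostname: str) -> bool:
    """Return True if the hostname resolves via the current DNS setup."""
    try:
        socket.getaddrinfo(hostname, None)
        return True
    except socket.gaierror:
        # The same class of failure as "Name does not resolve"
        # in the searxng logs above.
        return False

# In a healthy compose network, can_resolve("searxng") and
# can_resolve("redis") would return True from inside a container;
# on the broken Windows setup they return False.
```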
To Reproduce
Basic install on Windows
Expected behavior
Communication fails as soon as the tool tries to query the web to answer your question.
Solution
Here is what I did to solve the issue. I am not a docker compose expert at all, so is this the best solution?
```diff
index 3067cb8..50595a1 100644
--- a/docker-compose.yaml
+++ b/docker-compose.yaml
@@ -3,7 +3,7 @@ services:
   backend:
     image: nilsherzig/llocalsearch-backend:latest
     environment:
-      - OLLAMA_HOST=${OLLAMA_HOST:-host.docker.internal:11434}
+      - OLLAMA_HOST=${OLLAMA_HOST:-http://host.docker.internal:11434}
       - CHROMA_DB_URL=${CHROMA_DB_URL:-http://chromadb:8000}
       - SEARXNG_DOMAIN=${SEARXNG_DOMAIN:-http://searxng:8080}
       - EMBEDDINGS_MODEL_NAME=${EMBEDDINGS_MODEL_NAME:-nomic-embed-text:v1.5}
@@ -39,6 +39,8 @@
       - SETGID
       - SETUID
       - DAC_OVERRIDE
+    ports:
+      - '6379:6379'

   searxng:
     image: docker.io/searxng/searxng:latest
@@ -60,6 +62,8 @@
     options:
       max-size: '1m'
       max-file: '1'
+    ports:
+      - '8080:8080'

 networks:
   llm_network:
```
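For reference, here is roughly what the patched redis and searxng entries look like once the diff is applied. Only the keys touched by the change are shown; the `# ...` comments stand in for the existing keys, which are unchanged:

```yaml
services:
  redis:
    # ...existing keys unchanged...
    ports:
      - '6379:6379'   # publish redis on the host

  searxng:
    image: docker.io/searxng/searxng:latest
    # ...existing keys unchanged...
    ports:
      - '8080:8080'   # publish searxng on the host
```

Note that published `ports:` make the services reachable from the Windows host; whether that is strictly required, or just works around the DNS failure inside the compose network, I am not sure.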
Do you want a pull request? :)