port 11434 is in use, execution failed
Describe the bug
After the first attempt to run it, I got this error message:

[+] Running 1/2
 ✔ Network agenticseek_agentic-seek-net  Created  0.1s
 ⠧ Container backend  Starting  0.7s
Error response from daemon: failed to set up container networking: driver failed programming external connectivity on endpoint backend (7958498fa18a74b9e98b73e17e6b760912e67a406b2537ea89c302596851c51f): failed to bind host port for 0.0.0.0:11434:172.18.0.2:11434/tcp: address already in use
Error: Failed to start backend container.
To Reproduce
Steps to reproduce the behavior:

- Added the following settings to config.ini:

~/agenticSeek$ cat config.ini
[MAIN]
is_local = False
provider_name = google
provider_model = gemini-2.0-flash
provider_server_address = 127.0.0.1:11434

- Started the services with this command:

~/agenticSeek$ ./start_services.sh full

- See error

Expected behavior
It should not start a Docker service on Ollama's port 11434.
Screenshots
Error message provided above.

LLM Model used
gemini-2.0-flash
Desktop (please complete the following information):
- OS: Ubuntu 24.04
- Browser: chrome
- Version: 136
Additional context
In config.ini, Ollama is configured on port 11434. Why does the system start Docker on port 11434 and then drop an error message when it finds the port occupied? is_local is false, but I would still like to use Ollama for other purposes. I had to stop the Ollama service in order to run this one. Now it has started, and docker-proxy is using port 11434:
~/agenticSeek$ sudo netstat -luntp| grep 11434
[sudo] password for laci:
tcp 0 0 0.0.0.0:11434 0.0.0.0:* LISTEN 24749/docker-proxy
tcp6 0 0 :::11434 :::* LISTEN 24756/docker-proxy
This happens because the docker-compose.yml file hard-codes the port forwarding:
...
  backend:
    container_name: backend
    profiles: ["backend", "full"]
    image: agenticseek-backend
    build:
      context: .
      dockerfile: Dockerfile.backend
    ports:
      - ${BACKEND_PORT:-7777}:${BACKEND_PORT:-7777}
      - ${OLLAMA_PORT:-11434}:${OLLAMA_PORT:-11434}
...
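The ${VAR:-default} syntax in the ports mapping uses shell-style default interpolation: Compose substitutes the value from .env when the variable is set, and falls back to the value after :- otherwise. A quick sketch of the same rule in plain shell:

```shell
# Shell-style default interpolation, the same rule Compose applies to the
# ${OLLAMA_PORT:-11434} entry in the ports mapping.
unset OLLAMA_PORT
echo "${OLLAMA_PORT:-11434}"   # variable unset: the default 11434 is used

OLLAMA_PORT=22334
echo "${OLLAMA_PORT:-11434}"   # variable set: the override 22334 wins
```

So with no OLLAMA_PORT entry in .env, Compose publishes 11434 on the host and collides with a locally running Ollama.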
You can specify a different port in your .env, e.g. OLLAMA_PORT="22334", and then there will be no conflict.
Hello everyone,
I'm trying to run agenticSeek with a local Ollama instance and I'm running into a port conflict issue. I hope you can point me in the right direction.
My Goal:
To use agenticSeek with a local LLM hosted by Ollama (gemma3:12b).
The Problem:
When I start the services with ./start_services.sh full, the docker-compose setup successfully starts. The netstat command shows that docker-proxy is listening on port 11434.
Bash
$ sudo netstat -luntp| grep 11434
tcp 0 0 0.0.0.0:11434 0.0.0.0:* LISTEN 211329/docker-proxy
tcp6 0 0 :::11434 :::* LISTEN 211338/docker-proxy
However, I believe I also need to run the Ollama server on my host machine. When I try to do that in a separate terminal with ollama serve, I get the expected error because the port is already taken by Docker:
Bash
$ ollama serve
Error: listen tcp 127.0.0.1:11434: bind: address already in use
The backend container log shows it's trying to connect to this address:
backend | Provider: openai initialized at 127.0.0.1:11434
It seems like the backend container in Docker is itself reserving the port 11434 on the host, which prevents me from running the actual Ollama server on that same port. I feel like I'm missing a step or misunderstanding the intended workflow for a local setup.
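One way to make the conflict visible before starting the stack is a small pre-flight check. This is a hypothetical helper, not part of agenticSeek, and it assumes ss from iproute2 is available on the host:

```shell
# Hypothetical pre-flight helper: check whether a TCP port on the host is
# already taken before starting the compose stack.
port_in_use() {
    # ss -ltn lists listening TCP sockets numerically; grep for the port
    ss -ltn 2>/dev/null | grep -q ":$1 "
}

if port_in_use 11434; then
    echo "port 11434 is already in use - stop ollama or change OLLAMA_PORT"
else
    echo "port 11434 looks free"
fi
```

Running this before ./start_services.sh full shows immediately whether Docker and a host Ollama are about to fight over the same port.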
My Configuration:
config.ini:
Ini, TOML
[MAIN]
is_local = True
provider_name = ollama
provider_model = gemma3:12b
provider_server_address = 127.0.0.1:11434
...
.env:
Code snippet
SEARXNG_BASE_URL="http://127.0.0.1:8080/"
OLLAMA_PORT="11434"
BACKEND_PORT="7777"
...
docker-compose.yml (backend service):
YAML
  backend:
    container_name: backend
    profiles: ["backend", "full"]
    build:
      context: .
      dockerfile: Dockerfile.backend
    ports:
      - ${BACKEND_PORT:-7777}:${BACKEND_PORT:-7777}
      - ${OLLAMA_PORT:-11434}:${OLLAMA_PORT:-11434} # This seems to be the source of the conflict
      # ... other ports
    # ... other settings
Could you clarify the correct procedure for using a local Ollama instance? Should Ollama be run in a separate container, or should the docker-compose.yml be modified?
Thank you for your help!
Sorry for replying and fixing the issues so late; I had a couple of personal things/motivation issues. It should now be fixed, along with a couple of other issues, so I recommend updating.
When starting Ollama, you must follow these steps, which I've added to the README, in order for it to work properly:
Unless you wish to run AgenticSeek on the host (CLI mode), you first need to export the provider's listen address:
export OLLAMA_HOST=0.0.0.0:11434
Then, start your provider:
ollama serve
If you are using a provider other than Ollama, you will need to read its docs to find out how to change the listen address.
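Putting the steps above together, a sketch of the startup sequence on the host (the address is the one given in the instructions; the ollama serve line is left commented so the sketch can be pasted safely):

```shell
# Make Ollama listen on all interfaces so the Docker backend can reach it,
# then start the provider.
export OLLAMA_HOST=0.0.0.0:11434
echo "Ollama will listen on: ${OLLAMA_HOST}"

# ollama serve
```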