[Errno 111] Connection refused
On a MacBook M1, running via Docker.
I made the adjustment to the Dockerfile as described here: https://github.com/browser-use/web-ui/issues/100
But when I run the agent, I receive [Errno 111] Connection refused.
How do I debug/resolve this issue?
Thanks!
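For context, errno 111 is Linux's ECONNREFUSED: the TCP handshake reached an address, but nothing was listening on that port (on macOS the same error is Errno 61). A minimal stdlib-only sketch that reproduces it against a port that is almost certainly closed:

```python
import errno
import socket

# Connecting to a port with no listener raises ECONNREFUSED
# (errno 111 on Linux, 61 on macOS) -- the same error the agent
# hits when nothing answers at its configured Ollama base URL.
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
try:
    s.connect(("127.0.0.1", 1))  # port 1 is normally closed
except OSError as e:
    print(e.errno == errno.ECONNREFUSED, e.strerror)
finally:
    s.close()
```

So the error means the webui process found no server at the address it was given, which is why the fixes in this thread all change either the address or the network the container uses.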
i have the same issue, but on Windows, only following the Docker installation
same issue using docker in windows
same issue on a MacBook M1 with 16 GB RAM
Same issue here on Windows:
ERROR [agent] ❌ Result failed 1/5 times:
browser-use-webui-1 | [Errno 111] Connection refused
browser-use-webui-1 | 2025-01-27 14:14:35,560 DEBG 'webui' stdout output
I got the exact same, M1 16 GB RAM
Same on M3
Same in WSL Win 11.
Same here on Macbook Pro M4
same here on linux fedora
what fixed it for me is using host network mode:

```yaml
services:
  browser-use-webui:
    # ...
    network_mode: host
```
I tried this and it didn't fix it for me - in fact I can't even get it to properly boot up, it seems to get stuck in some sort of error loop.
Just set the base URL to http://host.docker.internal:11434 for a Docker install, or http://localhost:11434 for a local install.
Make sure Ollama is refreshed after each server restart.
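One way to check which base URL is actually reachable before touching the UI — a small stdlib-only sketch (`/api/version` is Ollama's version endpoint; swap in whichever URLs you want to test):

```python
import json
import urllib.error
import urllib.request

def check_ollama(base_url: str) -> str:
    """Report whether an Ollama server answers at base_url."""
    try:
        with urllib.request.urlopen(f"{base_url}/api/version", timeout=5) as r:
            return f"OK: {json.load(r)}"
    except (urllib.error.URLError, OSError) as e:
        return f"FAILED: {e}"

# Inside the Docker container, the host's Ollama is at host.docker.internal;
# localhost only works when the webui runs directly on the host.
for url in ("http://host.docker.internal:11434", "http://localhost:11434"):
    print(url, "->", check_ollama(url))
```

If the URL that fails here is the one configured in the webui, that explains the [Errno 111].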
Having the same issue and tried this but no luck
Then make sure to increase the browser-use version in the requirements file from 1.29.0 to 1.30.0.
It's not perfect; sometimes the agent runs and sometimes it doesn't, lol. Also, in the browser options, disable recording and enable the use-browser option (the left-most one). If that doesn't work, decrease the number of steps by half in the agent settings.
here's what I tried, and still no luck:
- adding network_mode: host to docker-compose.yml
- adding the base URL to the configuration
- increasing the browser-use version
- disabling recording
- decreasing the number of steps to very few

Since this seems like a network issue, maybe modifying something in the firewall might work?
I'm running on a 2018 Intel Mac on macOS Sonoma, running the agent with Ollama.
But when I run the agent, I receive [Errno 61] Connection refused.
How do I resolve this issue? @warmshao Thanks!
Same thing here ...
Please ensure that the Ollama URL can be accessed. If you use the webui in Docker, I don't think http://localhost:11434/ is the correct base URL.
Ollama can be accessed from other Docker AI apps like Open WebUI, but not from browser web-ui using the same URL. So there is some issue. Edit: referring to Errno 111.
@vvincent1234 I'm running directly in the terminal without Docker. I just cloned the source code and ran it. I've tried many times, but I get "Connection refused" every time.
I had the same Errno 111 while running browser-use/web-ui via Docker on an M4 Mac.
Changing OLLAMA_ENDPOINT in the .env file as below worked for me:

```
# .env
#OLLAMA_ENDPOINT=http://localhost:11434
OLLAMA_ENDPOINT=http://host.docker.internal:11434  # changed to docker internal host
```
An additional point: I also changed the docker-compose.yml file, because it somehow didn't detect Apple silicon while building the Docker image, so I forced the arm64 platform configuration:

```yaml
# docker-compose.yml
browser-use-webui:
  platform: linux/arm64  # changed to arm64 from amd64
  build:
    context: .
    dockerfile: ${DOCKERFILE:-Dockerfile}
    args:
      TARGETPLATFORM: ${TARGETPLATFORM:-linux/arm64}  # changed to arm64 from amd64
```
As I understood it, the problem is that a localhost URL won't work because the container is looking inside itself. So one solution is to run Ollama as a Docker container:

```
docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
```

then run a model:

```
docker exec -it ollama ollama run qwen2.5:7b
```

and then use the base URL http://yourhostname:11434. I don't have the issue now, but it is very slow for a local model: the task (go to google.com and type 'OpenAI', click search) took around 15 minutes to complete with the local DeepSeek and Qwen models. Better to use Google's gemini-2.0-flash-exp model with the free API, which can complete it in 1 minute.
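An alternative to pointing at the host name is to define Ollama in the same compose file, so the webui can reach it by service name over the compose network. A sketch with assumed service names (the webui's existing build settings are elided):

```yaml
services:
  ollama:
    image: ollama/ollama
    volumes:
      - ollama:/root/.ollama
    ports:
      - "11434:11434"  # optional: also expose Ollama to the host
  browser-use-webui:
    # ... existing build/image settings ...
    environment:
      - OLLAMA_ENDPOINT=http://ollama:11434  # service name resolves on the compose network
    depends_on:
      - ollama

volumes:
  ollama:
```

Compose puts both services on a default network where each service's name is a DNS alias, so http://ollama:11434 resolves from inside the webui container without involving the host at all.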
I have ollama running in a docker container and I was getting this same error.
I was able to get this working by adding:
```yaml
extra_hosts:
  - "host.docker.internal:host-gateway"
```
To the docker-compose file.
Then, in my .env file, I set the Ollama endpoint like this:
OLLAMA_ENDPOINT=http://host.docker.internal:11434
For troubleshooting you can use:
docker exec -it web-ui-browser-use-webui-1 bash
Then try: curl http://host.docker.internal:11434
If you get a "curl: (6) Could not resolve host: host.docker.internal" message then the web-ui container is unable to talk to the ollama container so you'll have to dig a bit more.
If you're working with Ollama, you should update the OLLAMA_ENDPOINT to "http://host.docker.internal:11434" in your .env file. This allows the Docker container to communicate with the Ollama application running on your host machine, as "localhost" inside a container refers to the container itself, not your computer.
Also, I made a PR about this issue: https://github.com/browser-use/web-ui/pull/399