
[Errno 111] Connection refused

Open dpaluy opened this issue 11 months ago • 24 comments

On a MacBook M1, running via Docker.

I made the adjustment to the Dockerfile as described here: https://github.com/browser-use/web-ui/issues/100

But when I run the agent, I receive [Errno 111] Connection refused

How to debug/resolve this issue?

Thanks!

dpaluy avatar Jan 24 '25 20:01 dpaluy

I have the same issue, but on Windows, and only following the Docker installation.

chris-amaya avatar Jan 26 '25 05:01 chris-amaya

Same issue using Docker on Windows.

RepairYourTech avatar Jan 26 '25 11:01 RepairYourTech

Same issue on a MacBook M1 with 16 GB RAM.

barrettluke avatar Jan 27 '25 05:01 barrettluke

Same issue here on Windows:

    ERROR    [agent] ❌ Result failed 1/5 times:
    browser-use-webui-1  | [Errno 111] Connection refused
    browser-use-webui-1  | 2025-01-27 14:14:35,560 DEBG 'webui' stdout output


forero94 avatar Jan 27 '25 14:01 forero94

I got the exact same on an M1 with 16 GB RAM.

jasenwar avatar Jan 27 '25 18:01 jasenwar

Same on M3

TDMarko avatar Jan 28 '25 14:01 TDMarko

Same in WSL Win 11.

meteoro avatar Jan 28 '25 20:01 meteoro

Same here on a MacBook Pro M4.

telnemri avatar Jan 28 '25 23:01 telnemri

Same here on Fedora Linux.

madebytoilets avatar Jan 29 '25 16:01 madebytoilets

what fixed it for me is using host network mode:

services:
  browser-use-webui:
    # ...
    network_mode: host
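
For context, here is a fuller sketch of what that compose file might look like; everything other than network_mode: host is assumed and may differ from the repo's actual docker-compose.yml:

    services:
      browser-use-webui:
        build:
          context: .
          dockerfile: ${DOCKERFILE:-Dockerfile}
        env_file: .env
        network_mode: host

With host networking the container shares the host's network namespace on Linux, so http://localhost:11434 inside the container reaches Ollama on the host; Docker Desktop on Mac and Windows handles host networking differently, which may be why this fix doesn't work for everyone in this thread.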

eEQK avatar Jan 29 '25 18:01 eEQK

what fixed it for me is using host network mode:

    services:
      browser-use-webui:
        # ...
        network_mode: host

I tried this and it didn't fix it for me. In fact, I can't even get it to boot up properly; it seems to get stuck in some sort of error loop.

paulpenney avatar Jan 30 '25 01:01 paulpenney

Just add this link to the base URL: http://host.docker.internal:11434 for a Docker install, or http://localhost:11434 for a local install.


Make sure Ollama is refreshed after each server restart.
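
If you'd rather set this in the .env file, the equivalent, using the OLLAMA_ENDPOINT variable that appears later in this thread, is something like:

    # Docker install: the container has to reach Ollama on the host
    OLLAMA_ENDPOINT=http://host.docker.internal:11434
    # local (non-Docker) install
    #OLLAMA_ENDPOINT=http://localhost:11434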

TahaW863 avatar Jan 30 '25 03:01 TahaW863

Just add this link to the base URL: http://host.docker.internal:11434 for a Docker install, or http://localhost:11434 for a local install.


Make sure Ollama is refreshed after each server restart.

Having the same issue and tried this but no luck

chen8160 avatar Jan 30 '25 05:01 chen8160

Just add this link to the base URL: http://host.docker.internal:11434 for a Docker install, or http://localhost:11434 for a local install. Make sure Ollama is refreshed after each server restart.

Having the same issue and tried this but no luck

Then make sure to increase the browser-use version in the requirements file from 1.29.0 to 1.30.0.

It's not perfect: sometimes the agent runs and sometimes it doesn't. Also, in the browser options, disable recording and enable use of browser (the left-most option). If it still doesn't work, halve the number of steps in the agent settings.

TahaW863 avatar Jan 30 '25 15:01 TahaW863

Here's what I tried, and still no luck:

  • adding network_mode: host to docker-compose.yml
  • adding the base URL to the configuration
  • increasing the browser-use version
  • disabling recording
  • decreasing the number of steps to very few

Since this seems like a network thing, maybe modifying something in the firewall might work?
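
Before going that far, one thing worth ruling out, assuming a stock Ollama install on the host: by default Ollama only listens on 127.0.0.1:11434, so depending on how the container tries to reach the host, the connection may be refused no matter what the firewall says. A rough check and workaround (Unix shell syntax; on Windows, set OLLAMA_HOST as an environment variable before starting Ollama):

    # is anything listening on the default port?
    curl http://localhost:11434      # should print "Ollama is running"

    # if the container still gets refused, make Ollama listen on all interfaces
    OLLAMA_HOST=0.0.0.0 ollama serve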

chris-amaya avatar Jan 30 '25 17:01 chris-amaya

I'm running on a 2018 Intel Mac on macOS Sonoma, running the agent with Ollama.

But when I run the agent, I receive [Errno 61] Connection refused

How to resolve this issue? @warmshao Thanks!

ishrek avatar Jan 31 '25 17:01 ishrek

Same thing here ...

fxgardes avatar Jan 31 '25 23:01 fxgardes

I'm running on a 2018 Intel Mac on macOS Sonoma, running the agent with Ollama.

But when I run the agent, I receive [Errno 61] Connection refused

How to resolve this issue? @warmshao Thanks!

Please ensure that the Ollama URL can be accessed. If you run the webui in Docker, I don't think http://localhost:11434/ is the correct base URL.
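
For example, with Ollama running on the host with its defaults, a quick way to verify reachability from both sides is something like this (container name assumed; adjust to whatever docker ps shows):

    # on the host: is Ollama up at all?
    curl http://localhost:11434          # should print "Ollama is running"

    # from inside the webui container: can it reach the host?
    docker exec -it browser-use-webui-1 curl http://host.docker.internal:11434

If the second command fails but the first succeeds, the problem is the address the container is using (or, on Linux, a missing host.docker.internal mapping), not Ollama itself.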

vvincent1234 avatar Feb 01 '25 02:02 vvincent1234

Ollama can be accessed from other Docker AI apps like Open WebUI, but not from browser-use web-ui using the same URL, so there is some issue. Edit: referring to Errno 111.

iulko avatar Feb 01 '25 12:02 iulko

I'm running on a 2018 Intel Mac on macOS Sonoma, running the agent with Ollama. But when I run the agent, I receive [Errno 61] Connection refused. How to resolve this issue? @warmshao Thanks!

Please ensure that the Ollama URL can be accessed. If you run the webui in Docker, I don't think http://localhost:11434/ is the correct base URL.

@vvincent1234 I'm running directly in the terminal, without Docker. I just cloned the source code and ran it. I've tried many times, but I get "Connection refused" every time.

ishrek avatar Feb 08 '25 08:02 ishrek

I had the same Errno 111 while running browser-use/web-ui via Docker on an M4 Mac.

Changing the .env file's OLLAMA_ENDPOINT to the following worked for me:

    # .env
    #OLLAMA_ENDPOINT=http://localhost:11434
    OLLAMA_ENDPOINT=http://host.docker.internal:11434  # changed to the Docker internal host

An additional point: I also changed the docker-compose.yml file because it somehow didn't detect Apple Silicon while building the Docker image, so I forced the arm64 platform configuration:

    # docker-compose.yml
    browser-use-webui:
      platform: linux/arm64  # changed to arm64 from amd64
      build:
        context: .
        dockerfile: ${DOCKERFILE:-Dockerfile}
        args:
          TARGETPLATFORM: ${TARGETPLATFORM:-linux/arm64}  # changed to arm64 from amd64

th1nkd0g avatar Feb 13 '25 09:02 th1nkd0g

As I understood it, the problem is that a localhost URL will not work because it points into the container itself, so the only solution is to run Ollama as a Docker container:

    docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama

then run the model:

    docker exec -it ollama ollama run qwen2.5:7b

and then use the base URL http://yourhostname:11434. Now I don't have the issue, but it is very slow with a local model: the task "go to google.com and type 'OpenAI', click search" took around 15 minutes to complete with the locally deployed deepseek and qwen models. So it's better to use the Google Gemini gemini-2.0-flash-exp model with the free API, which can complete it in about 1 minute.

ATTO-RATHORE avatar Feb 15 '25 08:02 ATTO-RATHORE

I have Ollama running in a Docker container and I was getting this same error.

I was able to get this working by adding:

    extra_hosts:
      - "host.docker.internal:host-gateway"

To the docker-compose file.

Then, in my .env file, I set the Ollama endpoint like this:

OLLAMA_ENDPOINT=http://host.docker.internal:11434

For troubleshooting you can use:

    docker exec -it web-ui-browser-use-webui-1 bash

Then try:

    curl http://host.docker.internal:11434

If you get a "curl: (6) Could not resolve host: host.docker.internal" message then the web-ui container is unable to talk to the ollama container so you'll have to dig a bit more.
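
Putting those pieces together, the relevant part of the compose service would look roughly like this (service and file names assumed to match the defaults used elsewhere in this thread):

    services:
      browser-use-webui:
        # ... rest of the service definition ...
        env_file: .env
        extra_hosts:
          - "host.docker.internal:host-gateway"

with OLLAMA_ENDPOINT=http://host.docker.internal:11434 in .env. The extra_hosts entry matters mainly on Linux, where host.docker.internal is not defined automatically the way it is under Docker Desktop.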

q0r3y avatar Mar 08 '25 21:03 q0r3y

If you're working with Ollama, you should update the OLLAMA_ENDPOINT to "http://host.docker.internal:11434" in your .env file. This allows the Docker container to communicate with the Ollama application running on your host machine, as "localhost" inside a container refers to the container itself, not your computer.

I also made a PR about this issue: https://github.com/browser-use/web-ui/pull/399

AliYmn avatar Mar 14 '25 23:03 AliYmn