
Error: connect ECONNREFUSED 127.0.0.1:3000

Open lijom80 opened this issue 1 year ago • 13 comments

Describe the bug

On the frontend, I am getting this error: ERROR: Failed connection to server. Please ensure the server is reachable at ws://:3001/ws.

At the backend, I am getting the error below:

Running the app...

[email protected] start vite --port 3001 --host 0.0.0.0

VITE v5.2.7 ready in 578 ms

➜  Local:   http://localhost:3001/
➜  Network: http://:3001/
➜  press h + enter to show help
4:38:45 PM [vite] http proxy error: /litellm-agents
Error: connect ECONNREFUSED 127.0.0.1:3000
    at TCPConnectWrap.afterConnect [as oncomplete] (node:net:1481:16)
4:38:45 PM [vite] http proxy error: /litellm-models
Error: connect ECONNREFUSED 127.0.0.1:3000
    at TCPConnectWrap.afterConnect [as oncomplete] (node:net:1481:16)
4:38:45 PM [vite] http proxy error: /litellm-models
Error: connect ECONNREFUSED 127.0.0.1:3000
    at TCPConnectWrap.afterConnect [as oncomplete] (node:net:1481:16) (x2)
4:38:45 PM [vite] http proxy error: /litellm-agents
Error: connect ECONNREFUSED 127.0.0.1:3000
    at TCPConnectWrap.afterConnect [as oncomplete] (node:net:1481:16)
4:38:45 PM [vite] ws proxy error:
Error: connect ECONNREFUSED 127.0.0.1:3000
    at TCPConnectWrap.afterConnect [as oncomplete] (node:net:1481:16)

Setup and configuration

Current version:

commit c37124d74022fc73f32a576d88dd59db2ebfe2ac (HEAD -> main, origin/main, origin/HEAD)
Author: Xingyao Wang [email protected]
Date:   Tue Apr 2 14:58:28 2024 +0800

My config.toml and environment vars (be sure to redact API keys):

cat config.toml
LLM_API_KEY="sk-xxxxxxxxxxxxxxxxxxxxxxxxxx"
LLM_MODEL="open-devin-preview-gpt4"
WORKSPACE_DIR="./workspace"

My model and agent (you can see these settings in the UI): the UI shows "Initializing agent (may take up to 10 seconds)..." and then nothing after that.

Commands I ran to install and run OpenDevin:

Steps to Reproduce:

  1. The only change I made, apart from the instructions provided, is in the Makefile: @cd frontend && npm run start -- --port $(FRONTEND_PORT) --host "0.0.0.0" <- I added 0.0.0.0 so it is accessible over the network (sketched below).
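As a rough sketch, the change amounts to starting the frontend like this (port taken from the logs above). Note that with --host 0.0.0.0 the frontend itself becomes reachable over the network, but its Vite proxy still targets BACKEND_HOST, which defaults to 127.0.0.1:3000:

    # frontend listens on all interfaces; the backend is still expected on 127.0.0.1:3000
    cd frontend && npm run start -- --port 3001 --host "0.0.0.0"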

Logs, error messages, and screenshots: [screenshot]

Additional Context

lijom80 avatar Apr 02 '24 11:04 lijom80

Error: connect ECONNREFUSED 127.0.0.1:3000

Looks like you forgot to start the backend or mock server on port 3000.
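A quick way to confirm this (a generic probe, not an OpenDevin-specific command) is to hit port 3000 directly; "Connection refused" from curl means nothing is listening there yet:

    # any HTTP response (even a 404) means the backend is up;
    # "Connection refused" means it never started
    curl -v http://127.0.0.1:3000/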

xcodebuild avatar Apr 02 '24 11:04 xcodebuild

#576

huybery avatar Apr 02 '24 12:04 huybery

Open the file opendevin/frontend/vite.config and correct these lines of code:

    const BACKEND_HOST = process.env.BACKEND_HOST || "127.0.0.1:3000";

    // check BACKEND_HOST is something like "example.com"
    if (!BACKEND_HOST.match(/^([\w\d-]+(\.[\w\d-]+)+(:\d+)?)/)) {
      throw new Error(
        `Invalid BACKEND_HOST ${BACKEND_HOST}, example BACKEND_HOST 127.0.0.1:3000`,
      );
    }

replacing them with the corrected ones:

    const BACKEND_HOST = process.env.BACKEND_HOST || "127.0.0.1:3001";

    // check BACKEND_HOST is something like "example.com"
    if (!BACKEND_HOST.match(/^([\w\d-]+(\.[\w\d-]+)+(:\d+)?)/)) {
      throw new Error(
        `Invalid BACKEND_HOST ${BACKEND_HOST}, example BACKEND_HOST 127.0.0.1:3001`,
      );
    }

The error is that the frontend tries to communicate on 127.0.0.1:3000 while the backend communicates on 127.0.0.1:3001. After that, save and rerun. It will work now.
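Since the file already checks process.env.BACKEND_HOST before falling back to the default, a less invasive option (a sketch, assuming make run passes the environment through to the frontend, which it does by default) is to override the value at launch instead of editing vite.config:

    # point the Vite proxy at port 3001 without modifying the file
    BACKEND_HOST="127.0.0.1:3001" make run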

CatalinCiocea avatar Apr 02 '24 13:04 CatalinCiocea

the only change I made apart from the instructions provided is in the makefile @cd frontend && npm run start -- --port $(FRONTEND_PORT) --host "0.0.0.0" <- added 0.0.0.0 to be accessible over the network.

Why do you need the backend to be accessible over the network? Are you running the backend and frontend on different boxes? I suspect your setup will run fine if you drop --host "0.0.0.0".

If you need to keep it, you need to set the environment variable BACKEND_HOST to your backend server's IP address or DNS.
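For example, if the backend box were reachable at 192.168.1.50 (a hypothetical address; replace it with your backend server's IP or DNS name), a rough sketch would be:

    # make the Vite proxy target the remote backend instead of 127.0.0.1:3000
    export BACKEND_HOST="192.168.1.50:3000"
    make run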

foragerr avatar Apr 02 '24 14:04 foragerr

You should check the log. I guess it is also related to #573

JustinLin610 avatar Apr 02 '24 15:04 JustinLin610

#576

@JustinLin610 - Yes, that is the issue. When I tried wget without sudo, I got:

    Cannot write to ‘/tmp/llama_index/models--BAAI--bge-small-en-v1.5/snapshots/5c38ec7c405ec4b44b94cc5a9bb96e735b38267a/1_Pooling/config.json’ (Success)

The fact that it prints (Success) at the end could be misleading. With sudo:

    2024-04-02 21:46:29 (2.78 MB/s) - ‘/tmp/llama_index/models--BAAI--bge-small-en-v1.5/snapshots/5c38ec7c405ec4b44b94cc5a9bb96e735b38267a/1_Pooling/config.json’ saved [190/190]

    Success
    Starting backend...
    INFO:     Started server process [12999]
    INFO:     Waiting for application startup.
    INFO:     Application startup complete.
    INFO:     Uvicorn running on http://127.0.0.1:3000 (Press CTRL+C to quit)

I have a question though: do I have to start the backend separately? If so, where can I add a nohup?

lijom80 avatar Apr 02 '24 16:04 lijom80

The newest makefile has a run target that runs both backend and frontend for you.

foragerr avatar Apr 02 '24 16:04 foragerr

have a question though, do I have to start the back end separately? If so, where can I have a nohup added?

@lijom80 You don't need to; make run now starts both the frontend and backend at the same time and shows their outputs simultaneously.
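As a sketch of that workflow (make build and make run come from the project Makefile mentioned above; the nohup variant and the log filename are plain shell, shown here only as an assumption for keeping it alive after you log out):

    # one-time dependency install, then start backend and frontend together
    make build
    make run

    # optional: keep it running after logout; the log filename is arbitrary
    nohup make run > opendevin.log 2>&1 &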

xcodebuild avatar Apr 03 '24 13:04 xcodebuild

I've got the same problem with the new make run command; it gives me this error:

5:39:57 PM [vite] http proxy error: /litellm-models
Error: connect ECONNREFUSED 127.0.0.1:3000
    at TCPConnectWrap.afterConnect [as oncomplete] (node:net:1602:16)
5:39:57 PM [vite] http proxy error: /litellm-agents
Error: connect ECONNREFUSED 127.0.0.1:3000
    at TCPConnectWrap.afterConnect [as oncomplete] (node:net:1602:16)
5:39:57 PM [vite] http proxy error: /litellm-models
Error: connect ECONNREFUSED 127.0.0.1:3000
    at TCPConnectWrap.afterConnect [as oncomplete] (node:net:1602:16)
5:39:57 PM [vite] http proxy error: /litellm-agents
Error: connect ECONNREFUSED 127.0.0.1:3000
    at TCPConnectWrap.afterConnect [as oncomplete] (node:net:1602:16)
5:39:58 PM [vite] ws proxy error:
Error: connect ECONNREFUSED 127.0.0.1:3000
    at TCPConnectWrap.afterConnect [as oncomplete] (node:net:1602:16)
5:40:03 PM [vite] http proxy error: /litellm-models
Error: connect ECONNREFUSED 127.0.0.1:3000
    at TCPConnectWrap.afterConnect [as oncomplete] (node:net:1602:16)
5:40:03 PM [vite] http proxy error: /litellm-agents
Error: connect ECONNREFUSED 127.0.0.1:3000
    at TCPConnectWrap.afterConnect [as oncomplete] (node:net:1602:16)
5:40:03 PM [vite] http proxy error: /litellm-models
Error: connect ECONNREFUSED 127.0.0.1:3000
    at TCPConnectWrap.afterConnect [as oncomplete] (node:net:1602:16)
5:40:03 PM [vite] http proxy error: /litellm-agents
Error: connect ECONNREFUSED 127.0.0.1:3000
    at TCPConnectWrap.afterConnect [as oncomplete] (node:net:1602:16)
5:40:03 PM [vite] ws proxy error:
Error: connect ECONNREFUSED 127.0.0.1:3000
    at TCPConnectWrap.afterConnect [as oncomplete] (node:net:1602:16)

Francescodotta avatar Apr 03 '24 15:04 Francescodotta

The backend always takes some time to start. Is there any output in the backend logs?

xcodebuild avatar Apr 03 '24 15:04 xcodebuild

No thanks, for me the issue was a CUDA library that was not properly installed on WSL. I solved it.

Francescodotta avatar Apr 03 '24 16:04 Francescodotta

My services are running as well. I realised that after rebooting the node and running the backend again, it installed a few more components.

Also, the new make run does not run the backend automatically for some reason.

I have a question: what are the ideal infra requirements to make it run smoothly?

lijom80 avatar Apr 04 '24 08:04 lijom80

My services are running as well. I realised that after rebooting the node and running the backend again, it installed a few more components. Also, the new make run does not run the backend automatically for some reason. I have a question: what are the ideal infra requirements to make it run smoothly?

make build will install the dependencies needed to run smoothly.

xcodebuild avatar Apr 04 '24 08:04 xcodebuild

Seems the user can run OpenDevin now, but maybe we should still keep an eye on the slow backend launch problem.

yufansong avatar Apr 05 '24 21:04 yufansong