Error: The socket connection was closed unexpectedly. For more information, pass `verbose: true` in the second argument to fetch()
Getting this all the time; other tools work fine in the same environment with constant network calls and lookups.
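For what it's worth, the hint in the message refers to Bun's non-standard fetch option, which opencode's server would have to enable internally. A minimal sketch of what it does (the URL and payload are just examples, not a working repro):

```ts
// Bun extension: `verbose: true` makes fetch print curl-style logs of the
// request/response headers and connection lifecycle to stderr. It is not
// part of the standard fetch spec, so the cast keeps TypeScript happy.
const res = await fetch("https://api.anthropic.com/v1/messages", {
  method: "POST",
  body: JSON.stringify({ hello: "world" }),
  verbose: true,
} as RequestInit & { verbose: boolean });
console.log(res.status);
```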
Facing this issue continuously and can't use OpenCode at all.
Same here. macOS 14.8 (23J21), OpenCode 0.9.11.
It's nearly impossible to work with. At the beginning it's fine, but it quickly descends into this error, over and over: sometimes right after entering a new prompt, sometimes a few seconds later, sometimes minutes later…
Also seeing this on OpenCode 0.13.5. Was working with gpt-4o through OpenAI.
Debug Logs
INFO 2025-10-12T01:04:30 +1ms service=tui commands={"agent_cycle":{},"agent_cycle_reverse":{},"agent_list":{},"app_exit":{},"app_help":{},"editor_open":{},"input_clear":{},"input_newline":{},"input_paste":{},"input_submit":{},"messages_copy":{},"messages_first":{},"messages_half_page_down":{},"messages_half_page_up":{},"messages_last":{},"messages_page_down":{},"messages_page_up":{},"messages_redo":{},"messages_undo":{},"model_cycle_recent":{},"model_cycle_recent_reverse":{},"model_list":{},"project_init":{},"session_child_cycle":{},"session_child_cycle_reverse":{},"session_compact":{},"session_export":{},"session_interrupt":{},"session_list":{},"session_new":{},"session_timeline":{},"theme_list":{},"thinking_blocks":{},"tool_details":{}} Loaded commands
INFO 2025-10-12T01:04:30 +4ms service=server method=GET path=/config/providers request
INFO 2025-10-12T01:04:30 +0ms service=server duration=0 response
DEBUG 2025-10-12T01:04:30 +1ms service=tui model=claude-sonnet-4-20250514 provider=anthropic Selected model from config
DEBUG 2025-10-12T01:04:30 +11ms service=tui timeTakenMs=0 messages.renderView
DEBUG 2025-10-12T01:04:30 +5ms service=tui file=/home/.local/state/opencode/tui State saved to file
DEBUG 2025-10-12T01:04:30 +2ms service=tui color=#1e1e2e isDark=true Background color
DEBUG 2025-10-12T01:04:30 +6ms service=tui timeTakenMs=0 messages.renderView
DEBUG 2025-10-12T01:04:30 +0ms service=tui file=/home/.local/state/opencode/tui State saved to file
INFO 2025-10-12T01:04:30 +1ms service=server duration=33 response
INFO 2025-10-12T01:04:54 +24301ms service=server method=POST path=/session request
INFO 2025-10-12T01:04:54 +4ms service=session id=ses_62a0d0748ffeoY1Lanl3RIQTQ8 version=0.14.7 projectID=79a3f1767a8362769c13882681d279cff93dd902 directory=/home/personal/tonitum title=New session - 2025-10-12T01:04:54.456Z time={"created":1760231094456,"updated":1760231094456} created
INFO 2025-10-12T01:04:54 +1ms service=bus type=session.updated publishing
INFO 2025-10-12T01:04:54 +2ms service=server duration=7 response
INFO 2025-10-12T01:04:54 +7ms service=server method=POST path=/session/ses_62a0d0748ffeoY1Lanl3RIQTQ8/init request
INFO 2025-10-12T01:04:54 +5ms service=session.prompt session=ses_62a0d0748ffeoY1Lanl3RIQTQ8 prompt
DEBUG 2025-10-12T01:04:54 +4ms service=tui timeTakenMs=0 messages.renderView
INFO 2025-10-12T01:04:54 +0ms service=bus type=message.updated publishing
DEBUG 2025-10-12T01:04:54 +4ms service=tui timeTakenMs=0 messages.renderView
INFO 2025-10-12T01:04:54 +0ms service=bus type=message.part.updated publishing
INFO 2025-10-12T01:04:54 +1ms service=bus type=session.updated publishing
INFO 2025-10-12T01:04:54 +0ms service=provider providerID=anthropic modelID=claude-sonnet-4-20250514 getModel
INFO 2025-10-12T01:04:54 +1ms service=provider status=started providerID=anthropic getSDK
DEBUG 2025-10-12T01:04:54 +0ms service=tui message=msg_9d5f2f8bc001tZqE8Wq6xM4AOo part=prt_9d5f2f8c6001JR1wuCY5XNzLep message part updated
INFO 2025-10-12T01:04:54 +44ms service=provider status=completed duration=44 providerID=anthropic getSDK
INFO 2025-10-12T01:04:54 +0ms service=provider providerID=anthropic modelID=claude-sonnet-4-20250514 found
INFO 2025-10-12T01:04:54 +0ms service=session.prompt session=ses_62a0d0748ffeoY1Lanl3RIQTQ8 sessionID=ses_62a0d0748ffeoY1Lanl3RIQTQ8 locking
DEBUG 2025-10-12T01:04:54 +1ms service=tui pending render, skipping
DEBUG 2025-10-12T01:04:54 +1ms service=tui timeTakenMs=0 messages.renderView
DEBUG 2025-10-12T01:04:54 +0ms service=tui timeTakenMs=0 messages.renderView
INFO 2025-10-12T01:04:54 +11ms service=provider providerID=anthropic modelID=claude-3-5-haiku-20241022 getModel
INFO 2025-10-12T01:04:54 +0ms service=provider status=started providerID=anthropic getSDK
INFO 2025-10-12T01:04:54 +0ms service=provider status=completed duration=0 providerID=anthropic getSDK
INFO 2025-10-12T01:04:54 +0ms service=bus type=message.updated publishing
INFO 2025-10-12T01:04:54 +0ms service=provider providerID=anthropic modelID=claude-3-5-haiku-20241022 found
INFO 2025-10-12T01:04:54 +2ms service=session.prompt session=ses_62a0d0748ffeoY1Lanl3RIQTQ8 process
INFO 2025-10-12T01:04:54 +2ms service=session.prompt session=ses_62a0d0748ffeoY1Lanl3RIQTQ8 type=start part
DEBUG 2025-10-12T01:04:54 +5ms service=tui timeTakenMs=0 messages.renderView
DEBUG 2025-10-12T01:04:54 +93ms service=tui timeTakenMs=0 messages.renderView
ERROR 2025-10-12T01:04:54 +63ms service=session.prompt session=ses_62a0d0748ffeoY1Lanl3RIQTQ8 error={"error":{"code":"ECONNRESET","path":"https://api.anthropic.com/v1/messages","errno":0}} stream error
INFO 2025-10-12T01:04:54 +1ms service=session.prompt session=ses_62a0d0748ffeoY1Lanl3RIQTQ8 type=error part
ERROR 2025-10-12T01:04:54 +0ms service=session.prompt session=ses_62a0d0748ffeoY1Lanl3RIQTQ8 error=The socket connection was closed unexpectedly. For more information, pass `verbose: true` in the second argument to fetch() process
INFO 2025-10-12T01:04:54 +1ms service=bus type=session.error publishing
INFO 2025-10-12T01:04:54 +1ms service=bus type=message.updated publishing
INFO 2025-10-12T01:04:54 +0ms service=bus type=message.updated publishing
INFO 2025-10-12T01:04:54 +1ms service=session.compaction pruning
INFO 2025-10-12T01:04:54 +0ms service=session.prompt session=ses_62a0d0748ffeoY1Lanl3RIQTQ8 sessionID=ses_62a0d0748ffeoY1Lanl3RIQTQ8 unlocking
INFO 2025-10-12T01:04:54 +0ms service=bus type=session.idle publishing
INFO 2025-10-12T01:04:54 +0ms service=server duration=241 response
ERROR 2025-10-12T01:04:54 +0ms service=tui message=Error: The socket connection was closed unexpectedly. For more information, pass `verbose: true` in the second argument to fetch() name=UnknownError Server error
INFO 2025-10-12T01:04:54 +1ms service=session.compaction pruned=0 total=0 found
DEBUG 2025-10-12T01:04:54 +4ms service=tui timeTakenMs=0 messages.renderView
DEBUG 2025-10-12T01:04:54 +2ms service=tui pending render, skipping
DEBUG 2025-10-12T01:04:54 +3ms service=tui timeTakenMs=0 messages.renderView
INFO 2025-10-12T01:04:55 +990ms service=bus type=session.updated publishing
DEBUG 2025-10-12T01:04:55 +4ms service=tui timeTakenMs=0 messages.renderView
This was in a fresh session, running the /init command. Happened immediately.
Currently on opencode v0.14.7
@Tonitum looks like Anthropic had an issue and closed the connection on you prematurely. We should try to handle it more gracefully for better UX, but just wanted to clarify, in case it is helpful.
I can believe that. I have been seeing this with more than one provider though (such as gpt-4o via OpenAI, grok-code-fast via Opencode Zen). Do you think my local network could be the source of the issue, or is it primarily on the provider side? I have a hard time believing that all of these providers would be having the same issues.
Hi!
To get it back to work I have to compact the context (/compact), and then it returns to normal.
It usually reaches 60% or so, sometimes 80%, but I never get further into the context length.
I still feel there is an issue somewhere.
@Tonitum What's the context size when you get the error? Top right of the screen.
I can check that next time, but it has happened on a fresh launch and fresh session multiple times.
Context from the most recent issue. I looked at some other sessions and it ranges from 12.7k/4% to 20.1k/7%. I feel like that's too small a context to really need compaction.
Is there any way to retry the fetch?
Well, right now this isn't retried, but we could attempt to retry. I'm rewriting some logic so that we can have more granular retry controls.
For now, if I get this I just send a follow-up like “continue”.
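Roughly the shape I have in mind for those retry controls, as a hypothetical sketch; none of these names exist in the codebase:

```ts
// Hypothetical sketch of a per-error retry policy; the names are made up
// and this is not opencode's actual implementation.
type RetryPolicy = {
  maxAttempts: number;
  // which low-level failures are safe to retry automatically
  retryable: (err: unknown) => boolean;
  // delay before the next attempt
  backoffMs: (attempt: number) => number;
};

const defaultPolicy: RetryPolicy = {
  maxAttempts: 3,
  retryable: (err) =>
    err instanceof Error &&
    /ECONNRESET|socket connection was closed/i.test(err.message),
  backoffMs: (attempt) => 500 * 2 ** attempt,
};

async function withRetry<T>(fn: () => Promise<T>, policy = defaultPolicy): Promise<T> {
  let lastErr: unknown;
  for (let attempt = 0; attempt < policy.maxAttempts; attempt++) {
    try {
      return await fn();
    } catch (err) {
      lastErr = err;
      if (!policy.retryable(err)) throw err;
      await new Promise((r) => setTimeout(r, policy.backoffMs(attempt)));
    }
  }
  throw lastErr;
}
```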
I've had this error happen like 5 times today.
It happens on the latest version, 0.15.14, using a github-copilot/claude- model. I use the "continue" magic too.
The release notes for 0.15.14 say:
- Added retry functionality for failed parts of conversations, improving reliability
But it didn't retry the failed parts.
that was a bad release note lol, I should update it; it was AI-generated
we already do retries. I think this error is when the server closes the connection on us, which we don't automatically retry, but perhaps we should. It's a little weird to retry because it could happen mid-stream, I think.
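One conservative option would be to auto-retry only when the connection drops before any content has streamed back; a rough sketch, where `startStream` is a made-up stand-in for the provider call:

```ts
// Sketch: only auto-retry a dropped connection if nothing has streamed
// back yet; once tokens have arrived, surface the error instead, since
// blindly restarting the request would duplicate output.
async function promptWithRetry(
  startStream: () => Promise<AsyncIterable<string>>,
  maxAttempts = 2,
): Promise<string> {
  for (let attempt = 1; ; attempt++) {
    let received = "";
    try {
      for await (const chunk of await startStream()) {
        received += chunk;
      }
      return received;
    } catch (err) {
      // mid-stream failure or attempts exhausted: give up and surface it
      if (received.length > 0 || attempt >= maxAttempts) throw err;
      // closed before any content arrived: safe to retry from scratch
    }
  }
}
```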
Hello team. This still happens to me as you can see from the screenshot.
@mantzas what kind of project did that happen in? I wonder if the LSP server crapped out.
@rekram1-node Go project. Are you referring to the gofmt LSP?
@mantzas gopls, but yeah. Though now that I think about it, that may not be the issue; I'd have to check.
~/.dotfiles [HEAD] direnv 🔽/😤
➜ ocode 12:03:10
8597 | }
8598 | onClose() {
8599 | if (this.isClosed)
8600 | return;
8601 | this.isClosed = true;
8602 | const error = new MCPClientError({
^
MCPClientError: Connection closed
cause: undefined,
vercel.ai.error: true,
vercel.ai.error.AI_MCPClientError: true,
at onClose (../../node_modules/.bun/[email protected]+d6123d32214422cb/node_modules/ai/dist/index.mjs:8602:19)
at <anonymous> (../../node_modules/.bun/@[email protected]/node_modules/@modelcontextprotocol/sdk/dist/esm/client/stdio.js:83:81)
at emitError (node:events:43:23)
at abortChildProcess (node:child_process:935:17)
at onAbortListener2 (node:child_process:35:24)
at close (../../node_modules/.bun/@[email protected]/node_modules/@modelcontextprotocol/sdk/dist/esm/client/stdio.js:153:31)
at close (../../node_modules/.bun/@[email protected]/node_modules/@modelcontextprotocol/sdk/dist/esm/client/stdio.js:152:19)
at close (../../node_modules/.bun/[email protected]+d6123d32214422cb/node_modules/ai/dist/index.mjs:8444:60)
at close (../../node_modules/.bun/[email protected]+d6123d32214422cb/node_modules/ai/dist/index.mjs:8440:17)
➜ ocode --version
1.0.15
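That trace is the MCP stdio client's close handler firing when its server child process exits, not the provider fetch path: the ai package wraps the MCP SDK's transport close event in an MCPClientError("Connection closed"). A rough illustration of where that hook lives, with the server command as a placeholder:

```ts
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// "my-mcp-server" is a placeholder; point it at whichever MCP server
// your opencode config launches over stdio.
const transport = new StdioClientTransport({ command: "my-mcp-server" });
const client = new Client({ name: "repro", version: "0.0.1" }, { capabilities: {} });

// Invoked when the server process exits or its pipes close; this is the
// same event the stack trace above converts into MCPClientError.
client.onclose = () => {
  console.error("MCP server process exited; its tools are unavailable");
};

await client.connect(transport);
```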
Quite consistently occurring with GLM 4.6; it seems rarer on Anthropic models.
@27Bslash6 What provider are you using? Is it Z.ai? They may be dropping your connection due to load.
Yep, z.ai GLM.
I am observing the same error as of today. It happens on the first prompt with both the official OpenAI API and an OpenAI-compatible custom endpoint. The Google API works fine.
Is this maybe a problem in @ai-sdk?