
Opencode became stuck all of a sudden. Does not launch anymore.

raymelon opened this issue 3 weeks ago • 34 comments

Description

Curious if uninstalling is safe. If not, how can I make sure that my config and conversations are kept after the process?

This is what is happening right now:

[screenshot]

So I figured I'd just uninstall.

I am running opencode on Windows 10, WSL 1.0 with Ubuntu 20.14.

OpenCode version

1.0.67

Steps to reproduce

No response

Screenshot and/or share link

No response

Operating System

Windows 10

Terminal

Warp

raymelon · Dec 13 '25 15:12

This issue might be a duplicate of existing issues. Please check:

  • #5338: OpenCode becomes stuck on start if internet connection is not stable
  • #4682: OpenCode gets stuck when bun add gets stuck (can freeze indefinitely during initialization)
  • #5361: TUI freezes for ~10 seconds periodically on WSL2 (regression in v1.0.129)
  • #3798: OpenTUI causes crashes and heavy memory usage
  • #2689: OpenCode crashes when launched without explicit port

Feel free to ignore if none of these address your specific case.

github-actions[bot] · Dec 13 '25 15:12

Well, apparently my opencode was updated to 1.0.142

opencode --print-logs --log-level DEBUG
INFO  2025-12-13T15:17:08 +220ms service=default version=1.0.142 args=["--print-logs","--log-level","DEBUG"] opencode

It looks like the update caused the issue?

raymelon · Dec 13 '25 15:12

I decided to proceed with updating it to 1.0.125, but it also looks like it failed to detect the installed version.

[screenshot]

raymelon · Dec 13 '25 15:12

This is bad. Despite updating to 1.0.152, it's still stuck.

[screenshot]

raymelon · Dec 13 '25 15:12

hm that’s very strange

are you able to do:

opencode run hello

and pass --print-logs to that?

rekram1-node · Dec 13 '25 16:12

I have been having a similar issue since this morning. It looks like an issue with the TUI - opencode is just stuck and doesn't open:

$ opencode --print-logs
INFO  2025-12-14T09:24:59 +83ms service=default version=1.0.152 args=["--print-logs"] opencode
ERROR 2025-12-14T09:24:59 +142ms service=acp-command promise={} reason=Failed to initialize OpenTUI render library: Failed to open library "/$bunfs/root/libopentui-5kxbhhq3.so": /$bunfs/root/libopentui-5kxbhhq3.so: cannot open shared object file: No such file or directory Unhandled rejection
ERROR 2025-12-14T09:24:59 +0ms service=default e=Failed to initialize OpenTUI render library: Failed to open library "/$bunfs/root/libopentui-5kxbhhq3.so": /$bunfs/root/libopentui-5kxbhhq3.so: cannot open shared object file: No such file or directory rejection
ERROR 2025-12-14T09:24:59 +0ms service=default Error: Failed to initialize OpenTUI render library: Failed to open library "/$bunfs/root/libopentui-5kxbhhq3.so": /$bunfs/root/libopentui-5kxbhhq3.so: cannot open shared object file: No such file or directory

This works: opencode run hello --print-logs

radomirml · Dec 14 '25 09:12

Ignore my comment - in my case the problem was a full /tmp partition.
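
For anyone else seeing that "cannot open shared object file" error, checking free space there is a quick way to rule this out:

df -h /tmp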

radomirml · Dec 14 '25 09:12

This issue usually happens because of opencode autoupdating, which is why you see "No such file or directory". You may add this to your opencode.json to prevent it from happening:

"autoupdate": false,

CountElqyd · Dec 15 '25 02:12

I have been having the exact same issue today (everything worked on Friday 12/12), with a very similar setup: Windows 11, Ubuntu 22.04 on WSL 1. opencode opens on projects without actual code (e.g. markdown files); however, coding projects hang as described in this thread.

Disabling LSP support and autoupdate globally yielded no change for me, unfortunately.
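
Roughly, what I had in the global ~/.config/opencode/opencode.json was:

{
  "$schema": "https://opencode.ai/config.json",
  "lsp": false,
  "autoupdate": false
}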

However, I was able to get opencode working again by downgrading substantially (to 1.0.0).

acbk2b · Dec 15 '25 20:12

Can anyone having this issue show me:

opencode run hello --print-logs --model opencode/grok-code

and we can see if stuff is hanging and where

rekram1-node · Dec 15 '25 20:12

Output from my end. It looks like in my case the corporate proxy my laptop sits behind is stepping on opencode.ai (I can't access it in the browser). I was using the GitHub Copilot backend before, which was working fine.

Edit: Ran with the wrong version, but it shows the same on the latest.

$ opencode run hello --print-logs --model opencode/grok-code
INFO 2025-12-15T20:39:53 +194ms service=default version=1.0.158 args=["run","hello","--print-logs","--model","opencode/grok-code"] opencode
INFO 2025-12-15T20:39:53 +1ms service=default directory=/home/user/repos/testproject creating instance
INFO 2025-12-15T20:39:53 +0ms service=project directory=/home/user/repos/testproject fromDirectory
INFO 2025-12-15T20:39:53 +24ms service=default directory=/home/user/repos/testproject bootstrapping
INFO 2025-12-15T20:39:53 +4ms service=config path=/home/user/.config/opencode/config.json loading
INFO 2025-12-15T20:39:53 +0ms service=config path=/home/user/.config/opencode/opencode.json loading
INFO 2025-12-15T20:39:53 +1ms service=config path=/home/user/.config/opencode/opencode.jsonc loading
INFO 2025-12-15T20:39:53 +2ms service=bun cmd=["/home/user/.config/nvm/versions/node/v22.21.1/lib/node_modules/opencode-ai/node_modules/opencode-linux-x64/bin/opencode","add","@opencode-ai/[email protected]","--exact"] cwd=/home/user/.config/opencode running
INFO 2025-12-15T20:39:53 +36ms service=bun code=0 stdout=bun add v1.3.4 (5eb2145b)

installed @opencode-ai/[email protected]

[6.00ms] done
stderr=Saved lockfile
done
INFO 2025-12-15T20:39:53 +3ms service=plugin [email protected] loading plugin
INFO 2025-12-15T20:39:53 +3ms service=plugin [email protected] loading plugin
INFO 2025-12-15T20:39:53 +31ms service=bus type=_ subscribing
INFO 2025-12-15T20:39:53 +1ms service=bus type=session.updated subscribing
INFO 2025-12-15T20:39:53 +0ms service=bus type=message.updated subscribing
INFO 2025-12-15T20:39:53 +0ms service=bus type=message.part.updated subscribing
INFO 2025-12-15T20:39:53 +0ms service=bus type=session.updated subscribing
INFO 2025-12-15T20:39:53 +0ms service=bus type=message.updated subscribing
INFO 2025-12-15T20:39:53 +0ms service=bus type=message.part.updated subscribing
INFO 2025-12-15T20:39:53 +0ms service=bus type=session.diff subscribing
INFO 2025-12-15T20:39:53 +0ms service=format init
INFO 2025-12-15T20:39:53 +0ms service=bus type=file.edited subscribing
INFO 2025-12-15T20:39:53 +0ms service=lsp serverIds=deno, typescript, vue, eslint, biome, gopls, ruby-lsp, pyright, elixir-ls, zls, csharp, fsharp, sourcekit-lsp, rust, clangd, svelte, astro, jdtls, yaml-ls, lua-ls, php intelephense, dart, ocaml-lsp, bash, terraform, texlab, dockerfile, gleam enabled LSP servers
INFO 2025-12-15T20:39:53 +0ms service=file.watcher init
INFO 2025-12-15T20:39:53 +3ms service=bus type=command.executed subscribing
INFO 2025-12-15T20:39:53 +5ms service=file.watcher platform=linux backend=inotify watcher backend
INFO 2025-12-15T20:39:53 +98ms service=vcs branch=master initialized
INFO 2025-12-15T20:39:53 +1ms service=bus type=file.watcher.updated subscribing
INFO 2025-12-15T20:39:53 +6ms service=server method=POST path=/session request
INFO 2025-12-15T20:39:53 +0ms service=server status=started method=POST path=/session request
INFO 2025-12-15T20:39:53 +3ms service=session id=session-id version=1.0.158 projectID=projectId directory=/home/user/repos/testproject title=New session - 2025-12-15T20:39:53.958Z time={"created":1765831193958,"updated":1765831193958} created
INFO 2025-12-15T20:39:53 +2ms service=bus type=session.created publishing
INFO 2025-12-15T20:39:53 +0ms service=bus type=session.updated publishing
INFO 2025-12-15T20:39:53 +2ms service=server status=completed duration=7 method=POST path=/session request
INFO 2025-12-15T20:39:53 +2ms service=server method=GET path=/config request
INFO 2025-12-15T20:39:53 +0ms service=server status=started method=GET path=/config request
INFO 2025-12-15T20:39:53 +0ms service=server status=completed duration=0 method=GET path=/config request
ERROR 2025-12-15T20:39:53 +0ms service=acp-command promise={} reason=Bad file descriptor Unhandled rejection
ERROR 2025-12-15T20:39:53 +0ms service=default e=Bad file descriptor rejection
INFO 2025-12-15T20:39:53 +5ms service=server method=GET path=/event request
INFO 2025-12-15T20:39:53 +0ms service=server status=started method=GET path=/event request
INFO 2025-12-15T20:39:53 +0ms service=server event connected
INFO 2025-12-15T20:39:53 +1ms service=bus type=_ subscribing
INFO 2025-12-15T20:39:53 +1ms service=server status=completed duration=2 method=GET path=/event request
INFO 2025-12-15T20:39:53 +0ms service=server method=POST path=/session/session-id/message request
INFO 2025-12-15T20:39:53 +1ms service=server status=started method=POST path=/session/session-id/message request
INFO 2025-12-15T20:39:53 +2ms service=server status=completed duration=3 method=POST path=/session/session-id/message request
INFO 2025-12-15T20:39:53 +24ms service=bus type=message.updated publishing
INFO 2025-12-15T20:39:54 +2ms service=provider status=started state
INFO 2025-12-15T20:39:54 +2ms service=models.dev file={} refreshing
INFO 2025-12-15T20:39:54 +11ms service=provider init
INFO 2025-12-15T20:39:54 +0ms service=bus type=message.part.updated publishing
INFO 2025-12-15T20:39:54 +2ms service=bus type=session.updated publishing
INFO 2025-12-15T20:39:54 +2ms service=bus type=session.status publishing
INFO 2025-12-15T20:39:54 +0ms service=session.prompt step=0 sessionID=session-id loop
INFO 2025-12-15T20:39:54 +2ms service=provider providerID=opencode found
INFO 2025-12-15T20:39:54 +0ms service=provider status=completed duration=19 state
INFO 2025-12-15T20:39:54 +30ms service=llm providerID=opencode modelID=gpt-5-nano sessionID=session-id small=true agent=title stream
INFO 2025-12-15T20:39:54 +1ms service=provider status=started providerID=opencode getSDK
INFO 2025-12-15T20:39:54 +0ms service=provider providerID=opencode pkg=@ai-sdk/openai using bundled provider
INFO 2025-12-15T20:39:54 +0ms service=provider status=completed duration=0 providerID=opencode getSDK
INFO 2025-12-15T20:39:54 +0ms service=bus type=message.updated publishing
INFO 2025-12-15T20:39:54 +1ms service=session.prompt status=started resolveTools
INFO 2025-12-15T20:39:54 +2ms service=llm providerID=opencode modelID=gpt-5-nano sessionID=session-id small=true agent=title params={"options":{"reasoningEffort":"minimal","promptCacheKey":"session-id","include":["reasoning.encrypted_content"],"reasoningSummary":"auto"}} params
INFO 2025-12-15T20:39:54 +10ms service=tool.registry status=started invalid
INFO 2025-12-15T20:39:54 +0ms service=tool.registry status=started bash
INFO 2025-12-15T20:39:54 +0ms service=bash-tool shell=/bin/zsh bash tool using shell
INFO 2025-12-15T20:39:54 +1ms service=tool.registry status=started read
INFO 2025-12-15T20:39:54 +0ms service=tool.registry status=started glob
INFO 2025-12-15T20:39:54 +0ms service=tool.registry status=started grep
INFO 2025-12-15T20:39:54 +0ms service=tool.registry status=started list
INFO 2025-12-15T20:39:54 +0ms service=tool.registry status=started edit
INFO 2025-12-15T20:39:54 +0ms service=tool.registry status=started write
INFO 2025-12-15T20:39:54 +0ms service=tool.registry status=started task
INFO 2025-12-15T20:39:54 +0ms service=tool.registry status=started webfetch
INFO 2025-12-15T20:39:54 +0ms service=tool.registry status=started todowrite
INFO 2025-12-15T20:39:54 +0ms service=tool.registry status=started todoread
INFO 2025-12-15T20:39:54 +0ms service=tool.registry status=started websearch
INFO 2025-12-15T20:39:54 +0ms service=tool.registry status=started codesearch
INFO 2025-12-15T20:39:54 +0ms service=tool.registry status=completed duration=1 invalid
INFO 2025-12-15T20:39:54 +0ms service=tool.registry status=completed duration=0 read
INFO 2025-12-15T20:39:54 +0ms service=tool.registry status=completed duration=0 glob
INFO 2025-12-15T20:39:54 +0ms service=tool.registry status=completed duration=0 grep
INFO 2025-12-15T20:39:54 +0ms service=tool.registry status=completed duration=0 list
INFO 2025-12-15T20:39:54 +0ms service=tool.registry status=completed duration=0 edit
INFO 2025-12-15T20:39:54 +0ms service=tool.registry status=completed duration=0 write
INFO 2025-12-15T20:39:54 +0ms service=tool.registry status=completed duration=0 webfetch
INFO 2025-12-15T20:39:54 +0ms service=tool.registry status=completed duration=0 todowrite
INFO 2025-12-15T20:39:54 +0ms service=tool.registry status=completed duration=0 todoread
INFO 2025-12-15T20:39:54 +0ms service=tool.registry status=completed duration=0 websearch
INFO 2025-12-15T20:39:54 +0ms service=tool.registry status=completed duration=0 codesearch
INFO 2025-12-15T20:39:54 +0ms service=tool.registry status=completed duration=1 bash
INFO 2025-12-15T20:39:54 +0ms service=tool.registry status=completed duration=0 task
INFO 2025-12-15T20:39:54 +4ms service=session.prompt status=completed duration=17 resolveTools
INFO 2025-12-15T20:39:54 +1ms service=ripgrep cwd=/home/user/repos/testproject limit=200 tree
INFO 2025-12-15T20:39:54 +15ms service=bus type=message.updated publishing
INFO 2025-12-15T20:39:54 +1ms service=llm providerID=opencode modelID=gpt-5-nano sessionID=session-id small=true agent=title stream
INFO 2025-12-15T20:39:54 +0ms service=llm providerID=opencode modelID=gpt-5-nano sessionID=session-id small=true agent=title params={"options":{"reasoningEffort":"minimal","promptCacheKey":"session-id","include":["reasoning.encrypted_content"],"reasoningSummary":"auto"}} params
INFO 2025-12-15T20:39:54 +2ms service=bus type=session.updated publishing
INFO 2025-12-15T20:39:54 +1ms service=bus type=session.diff publishing
INFO 2025-12-15T20:39:54 +9ms service=session.processor process
INFO 2025-12-15T20:39:54 +0ms service=llm providerID=opencode modelID=grok-code sessionID=session-id small=false agent=build stream
INFO 2025-12-15T20:39:54 +0ms service=provider status=started providerID=opencode getSDK
INFO 2025-12-15T20:39:54 +0ms service=provider providerID=opencode pkg=@ai-sdk/openai-compatible using bundled provider
INFO 2025-12-15T20:39:54 +0ms service=provider status=completed duration=0 providerID=opencode getSDK
INFO 2025-12-15T20:39:54 +1ms service=llm providerID=opencode modelID=grok-code sessionID=session-id small=false agent=build params={"options":{}} params
INFO 2025-12-15T20:39:54 +0ms service=bus type=session.status publishing
ERROR 2025-12-15T20:39:54 +774ms service=llm providerID=opencode modelID=grok-code sessionID=session-id small=false agent=build error={"error":{"code":"ConnectionRefused","path":"https://opencode.ai/zen/v1/responses","errno":0}} stream error
ERROR 2025-12-15T20:39:54 +2ms service=acp-command promise={} reason=No output generated. Check the stream for errors. Unhandled rejection
ERROR 2025-12-15T20:39:54 +0ms service=default e=No output generated. Check the stream for errors. rejection
ERROR 2025-12-15T20:39:54 +0ms service=llm providerID=opencode modelID=grok-code sessionID=session-id small=false agent=build error={"error":{"code":"ConnectionRefused","path":"https://opencode.ai/zen/v1/responses","errno":0}} stream error
ERROR 2025-12-15T20:39:54 +0ms service=session.prompt error=No output generated. Check the stream for errors. failed to generate title
ERROR 2025-12-15T20:39:54 +1ms service=llm providerID=opencode modelID=grok-code sessionID=session-id small=false agent=build error={"error":{"code":"ConnectionRefused","path":"https://opencode.ai/zen/v1/chat/completions","errno":0}} stream error
ERROR 2025-12-15T20:39:54 +1ms service=session.processor error=Unable to connect. Is the computer able to access the url? process
INFO 2025-12-15T20:39:54 +0ms service=bus type=session.error publishing
INFO 2025-12-15T20:39:54 +2ms service=bus type=message.updated publishing
INFO 2025-12-15T20:39:54 +1ms service=session.compaction pruning
Error: Error: Unable to connect. Is the computer able to access the url?
INFO 2025-12-15T20:39:54 +37ms service=session.prompt sessionID=session-id cancel
INFO 2025-12-15T20:39:54 +0ms service=bus type=session.status publishing
INFO 2025-12-15T20:39:54 +0ms service=bus type=session.idle publishing

acbk2b · Dec 15 '25 20:12

@acbk2b can you try the same with a copilot model?

opencode run hello --print-logs --model github-copilot/gpt-4.1

rekram1-node · Dec 15 '25 20:12

I get a ProviderModelNotFoundError on that:

ProviderModelNotFoundError: ProviderModelNotFoundError
 data: {
  providerID: "github-copilot",
  modelID: "gpt-4.1",
  suggestions: [],
},

acbk2b · Dec 15 '25 20:12

Well, are you authenticated to GitHub Copilot? Or Copilot Enterprise?

maybe pick any model from opencode models that you know has worked in the past and won't get blocked by your proxy
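
e.g. list what's available first and then pass one of those ids (the provider/model below is just a placeholder):

opencode models
opencode run hello --print-logs --model <provider/model>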

rekram1-node · Dec 15 '25 20:12

Ah yeah, good point. I lost my Copilot auth after reinstalling a couple times. Here you are:

$ opencode run hello --print-logs --model github-copilot/gpt-4.1
INFO  2025-12-15T20:59:24 +210ms service=default version=1.0.158 args=["run","hello","--print-logs","--model","github-copilot/gpt-4.1"] opencode
INFO  2025-12-15T20:59:24 +0ms service=default directory=/home/USER/repos/testproject creating instance
INFO  2025-12-15T20:59:24 +1ms service=project directory=/home/USER/repos/testproject fromDirectory
INFO  2025-12-15T20:59:24 +22ms service=default directory=/home/USER/repos/testproject bootstrapping
INFO  2025-12-15T20:59:24 +3ms service=config path=/home/USER/.config/opencode/config.json loading
INFO  2025-12-15T20:59:24 +1ms service=config path=/home/USER/.config/opencode/opencode.json loading
INFO  2025-12-15T20:59:24 +0ms service=config path=/home/USER/.config/opencode/opencode.jsonc loading
INFO  2025-12-15T20:59:24 +2ms service=bun cmd=["/home/USER/.config/nvm/versions/node/v22.21.1/lib/node_modules/opencode-ai/node_modules/opencode-linux-x64/bin/opencode","add","@opencode-ai/[email protected]","--exact"] cwd=/home/USER/.config/opencode running
INFO  2025-12-15T20:59:24 +31ms service=bun code=0 stdout=bun add v1.3.4 (5eb2145b)

installed @opencode-ai/[email protected]

[5.00ms] done
 stderr=Saved lockfile
 done
INFO  2025-12-15T20:59:24 +3ms service=plugin [email protected] loading plugin
INFO  2025-12-15T20:59:24 +3ms service=plugin [email protected] loading plugin
INFO  2025-12-15T20:59:24 +31ms service=bus type=* subscribing
INFO  2025-12-15T20:59:24 +0ms service=bus type=session.updated subscribing
INFO  2025-12-15T20:59:24 +0ms service=bus type=message.updated subscribing
INFO  2025-12-15T20:59:24 +0ms service=bus type=message.part.updated subscribing
INFO  2025-12-15T20:59:24 +1ms service=bus type=session.updated subscribing
INFO  2025-12-15T20:59:24 +0ms service=bus type=message.updated subscribing
INFO  2025-12-15T20:59:24 +0ms service=bus type=message.part.updated subscribing
INFO  2025-12-15T20:59:24 +0ms service=bus type=session.diff subscribing
INFO  2025-12-15T20:59:24 +0ms service=format init
INFO  2025-12-15T20:59:24 +0ms service=bus type=file.edited subscribing
INFO  2025-12-15T20:59:24 +0ms service=lsp serverIds=deno, typescript, vue, eslint, biome, gopls, ruby-lsp, pyright, elixir-ls, zls, csharp, fsharp, sourcekit-lsp, rust, clangd, svelte, astro, jdtls, yaml-ls, lua-ls, php intelephense, dart, ocaml-lsp, bash, terraform, texlab, dockerfile, gleam enabled LSP servers
INFO  2025-12-15T20:59:24 +0ms service=file.watcher init
INFO  2025-12-15T20:59:24 +2ms service=bus type=command.executed subscribing
INFO  2025-12-15T20:59:24 +8ms service=file.watcher platform=linux backend=inotify watcher backend
INFO  2025-12-15T20:59:24 +103ms service=vcs branch=master initialized
INFO  2025-12-15T20:59:24 +0ms service=bus type=file.watcher.updated subscribing
INFO  2025-12-15T20:59:24 +8ms service=server method=POST path=/session request
INFO  2025-12-15T20:59:24 +0ms service=server status=started method=POST path=/session request
INFO  2025-12-15T20:59:24 +2ms service=session id=ses_REDACTED version=1.0.158 projectID=PROJECT_ID_REDACTED directory=/home/USER/repos/testproject title=New session - 2025-12-15T20:59:24.574Z time={"created":1765832364574,"updated":1765832364574} created
INFO  2025-12-15T20:59:24 +1ms service=bus type=session.created publishing
INFO  2025-12-15T20:59:24 +1ms service=bus type=session.updated publishing
INFO  2025-12-15T20:59:24 +1ms service=server status=completed duration=5 method=POST path=/session request
INFO  2025-12-15T20:59:24 +1ms service=server method=GET path=/config request
INFO  2025-12-15T20:59:24 +0ms service=server status=started method=GET path=/config request
INFO  2025-12-15T20:59:24 +0ms service=server status=completed duration=0 method=GET path=/config request
ERROR 2025-12-15T20:59:24 +0ms service=acp-command promise={} reason=Bad file descriptor Unhandled rejection
ERROR 2025-12-15T20:59:24 +0ms service=default e=Bad file descriptor rejection
INFO  2025-12-15T20:59:24 +2ms service=server method=GET path=/event request
INFO  2025-12-15T20:59:24 +0ms service=server status=started method=GET path=/event request
INFO  2025-12-15T20:59:24 +1ms service=server event connected
INFO  2025-12-15T20:59:24 +1ms service=bus type=* subscribing
INFO  2025-12-15T20:59:24 +1ms service=server status=completed duration=3 method=GET path=/event request
INFO  2025-12-15T20:59:24 +3ms service=server method=POST path=/session/ses_REDACTED/message request
INFO  2025-12-15T20:59:24 +0ms service=server status=started method=POST path=/session/ses_REDACTED/message request
INFO  2025-12-15T20:59:24 +2ms service=server status=completed duration=2 method=POST path=/session/ses_REDACTED/message request
INFO  2025-12-15T20:59:24 +24ms service=bus type=message.updated publishing
INFO  2025-12-15T20:59:24 +2ms service=provider status=started state
INFO  2025-12-15T20:59:24 +2ms service=models.dev file={} refreshing
INFO  2025-12-15T20:59:24 +9ms service=provider init
INFO  2025-12-15T20:59:24 +0ms service=bus type=message.part.updated publishing
INFO  2025-12-15T20:59:24 +2ms service=bus type=session.updated publishing
INFO  2025-12-15T20:59:24 +2ms service=bus type=session.status publishing
INFO  2025-12-15T20:59:24 +0ms service=session.prompt step=0 sessionID=ses_REDACTED loop
INFO  2025-12-15T20:59:24 +3ms service=provider providerID=github-copilot found
INFO  2025-12-15T20:59:24 +1ms service=provider providerID=opencode found
INFO  2025-12-15T20:59:24 +0ms service=provider status=completed duration=19 state
INFO  2025-12-15T20:59:24 +25ms service=llm providerID=opencode modelID=gpt-5-nano sessionID=ses_REDACTED small=true agent=title stream
INFO  2025-12-15T20:59:24 +1ms service=provider status=started providerID=opencode getSDK
INFO  2025-12-15T20:59:24 +0ms service=bus type=message.updated publishing
INFO  2025-12-15T20:59:24 +0ms service=provider providerID=opencode pkg=@ai-sdk/openai using bundled provider
INFO  2025-12-15T20:59:24 +0ms service=provider status=completed duration=0 providerID=opencode getSDK
INFO  2025-12-15T20:59:24 +0ms service=session.prompt status=started resolveTools
INFO  2025-12-15T20:59:24 +2ms service=llm providerID=opencode modelID=gpt-5-nano sessionID=ses_REDACTED small=true agent=title params={"options":{"reasoningEffort":"minimal","promptCacheKey":"ses_REDACTED","include":["reasoning.encrypted_content"],"reasoningSummary":"auto"}} params
INFO  2025-12-15T20:59:24 +11ms service=tool.registry status=started invalid
INFO  2025-12-15T20:59:24 +0ms service=tool.registry status=started bash
INFO  2025-12-15T20:59:24 +0ms service=bash-tool shell=/bin/zsh bash tool using shell
INFO  2025-12-15T20:59:24 +0ms service=tool.registry status=started read
INFO  2025-12-15T20:59:24 +0ms service=tool.registry status=started glob
INFO  2025-12-15T20:59:24 +0ms service=tool.registry status=started grep
INFO  2025-12-15T20:59:24 +0ms service=tool.registry status=started list
INFO  2025-12-15T20:59:24 +0ms service=tool.registry status=started edit
INFO  2025-12-15T20:59:24 +0ms service=tool.registry status=started write
INFO  2025-12-15T20:59:24 +0ms service=tool.registry status=started task
INFO  2025-12-15T20:59:24 +0ms service=tool.registry status=started webfetch
INFO  2025-12-15T20:59:24 +0ms service=tool.registry status=started todowrite
INFO  2025-12-15T20:59:24 +0ms service=tool.registry status=started todoread
INFO  2025-12-15T20:59:24 +1ms service=tool.registry status=completed duration=1 invalid
INFO  2025-12-15T20:59:24 +0ms service=tool.registry status=completed duration=1 read
INFO  2025-12-15T20:59:24 +0ms service=tool.registry status=completed duration=1 glob
INFO  2025-12-15T20:59:24 +0ms service=tool.registry status=completed duration=1 grep
INFO  2025-12-15T20:59:24 +0ms service=tool.registry status=completed duration=1 list
INFO  2025-12-15T20:59:24 +0ms service=tool.registry status=completed duration=1 edit
INFO  2025-12-15T20:59:24 +0ms service=tool.registry status=completed duration=1 write
INFO  2025-12-15T20:59:24 +0ms service=tool.registry status=completed duration=1 webfetch
INFO  2025-12-15T20:59:24 +0ms service=tool.registry status=completed duration=1 todowrite
INFO  2025-12-15T20:59:24 +0ms service=tool.registry status=completed duration=1 todoread
INFO  2025-12-15T20:59:24 +0ms service=tool.registry status=completed duration=1 bash
INFO  2025-12-15T20:59:24 +0ms service=tool.registry status=completed duration=1 task
INFO  2025-12-15T20:59:24 +3ms service=session.prompt status=completed duration=17 resolveTools
INFO  2025-12-15T20:59:24 +1ms service=ripgrep cwd=/home/USER/repos/testproject limit=200 tree
INFO  2025-12-15T20:59:24 +16ms service=bus type=message.updated publishing
INFO  2025-12-15T20:59:24 +0ms service=llm providerID=opencode modelID=gpt-5-nano sessionID=ses_REDACTED small=true agent=title stream
INFO  2025-12-15T20:59:24 +1ms service=llm providerID=opencode modelID=gpt-5-nano sessionID=ses_REDACTED small=true agent=title params={"options":{"reasoningEffort":"minimal","promptCacheKey":"ses_REDACTED","include":["reasoning.encrypted_content"],"reasoningSummary":"auto"}} params
INFO  2025-12-15T20:59:24 +1ms service=bus type=session.updated publishing
INFO  2025-12-15T20:59:24 +2ms service=bus type=session.diff publishing
INFO  2025-12-15T20:59:24 +7ms service=session.processor process
INFO  2025-12-15T20:59:24 +0ms service=llm providerID=github-copilot modelID=gpt-4.1 sessionID=ses_REDACTED small=false agent=build stream
INFO  2025-12-15T20:59:24 +0ms service=provider status=started providerID=github-copilot getSDK
INFO  2025-12-15T20:59:24 +0ms service=provider providerID=github-copilot pkg=@ai-sdk/github-copilot using bundled provider
INFO  2025-12-15T20:59:24 +0ms service=provider status=completed duration=0 providerID=github-copilot getSDK
INFO  2025-12-15T20:59:24 +1ms service=llm providerID=github-copilot modelID=gpt-4.1 sessionID=ses_REDACTED small=false agent=build params={"options":{}} params
INFO  2025-12-15T20:59:24 +1ms service=bus type=session.status publishing
INFO  2025-12-15T20:59:25 +602ms service=server method=PUT path=/auth/github-copilot request
INFO  2025-12-15T20:59:25 +1ms service=server status=started method=PUT path=/auth/github-copilot request
INFO  2025-12-15T20:59:25 +6ms service=server status=completed duration=6 method=PUT path=/auth/github-copilot request
ERROR 2025-12-15T20:59:25 +163ms service=llm providerID=github-copilot modelID=gpt-4.1 sessionID=ses_REDACTED small=false agent=build error={"error":{"code":"ConnectionRefused","path":"https://opencode.ai/zen/v1/responses","errno":0}} stream error
ERROR 2025-12-15T20:59:25 +1ms service=acp-command promise={} reason=No output generated. Check the stream for errors. Unhandled rejection
ERROR 2025-12-15T20:59:25 +1ms service=default e=No output generated. Check the stream for errors. rejection
ERROR 2025-12-15T20:59:25 +19ms service=llm providerID=github-copilot modelID=gpt-4.1 sessionID=ses_REDACTED small=false agent=build error={"error":{"code":"ConnectionRefused","path":"https://opencode.ai/zen/v1/responses","errno":0}} stream error
ERROR 2025-12-15T20:59:25 +0ms service=session.prompt error=No output generated. Check the stream for errors. failed to generate title
ERROR 2025-12-15T20:59:25 +78ms service=acp-command promise={} reason=NotFoundError Unhandled rejection
ERROR 2025-12-15T20:59:25 +0ms service=default e=NotFoundError rejection
INFO  2025-12-15T20:59:27 +1809ms service=snapshot initialized
INFO  2025-12-15T20:59:31 +4305ms service=snapshot hash=COMMIT_HASH_REDACTED
 cwd=/home/USER/repos/testproject git=/home/USER/.local/share/opencode/snapshot/PROJECT_ID_REDACTED tracking
INFO  2025-12-15T20:59:31 +2ms service=bus type=message.part.updated publishing
INFO  2025-12-15T20:59:31 +3ms service=bus type=message.part.updated publishing
INFO  2025-12-15T20:59:31 +1ms service=bus type=message.part.updated publishing
INFO  2025-12-15T20:59:31 +0ms service=bus type=message.part.updated publishing
INFO  2025-12-15T20:59:31 +1ms service=bus type=message.part.updated publishing
INFO  2025-12-15T20:59:31 +1ms service=bus type=message.part.updated publishing
INFO  2025-12-15T20:59:31 +1ms service=bus type=message.part.updated publishing
INFO  2025-12-15T20:59:31 +0ms service=bus type=message.part.updated publishing
INFO  2025-12-15T20:59:31 +1ms service=bus type=message.part.updated publishing
INFO  2025-12-15T20:59:31 +0ms service=bus type=message.part.updated publishing
INFO  2025-12-15T20:59:31 +1ms service=bus type=message.part.updated publishing
INFO  2025-12-15T20:59:31 +1ms service=bus type=message.part.updated publishing
INFO  2025-12-15T20:59:31 +0ms service=bus type=message.part.updated publishing
INFO  2025-12-15T20:59:31 +1ms service=bus type=message.part.updated publishing
INFO  2025-12-15T20:59:31 +1ms service=bus type=message.part.updated publishing
INFO  2025-12-15T20:59:31 +0ms service=bus type=message.part.updated publishing
INFO  2025-12-15T20:59:31 +1ms service=bus type=message.part.updated publishing
INFO  2025-12-15T20:59:31 +1ms service=bus type=message.part.updated publishing
INFO  2025-12-15T20:59:31 +0ms service=bus type=message.part.updated publishing
INFO  2025-12-15T20:59:31 +1ms service=bus type=message.part.updated publishing
INFO  2025-12-15T20:59:31 +1ms service=bus type=message.part.updated publishing
INFO  2025-12-15T20:59:31 +0ms service=bus type=message.part.updated publishing
INFO  2025-12-15T20:59:31 +1ms service=bus type=message.part.updated publishing
INFO  2025-12-15T20:59:31 +1ms service=bus type=message.part.updated publishing
INFO  2025-12-15T20:59:31 +1ms service=bus type=message.part.updated publishing
INFO  2025-12-15T20:59:31 +1ms service=bus type=message.part.updated publishing
INFO  2025-12-15T20:59:31 +1ms service=bus type=message.part.updated publishing

Hello! How can I help you today? If you have a task for me or any questions, just let me know.

INFO  2025-12-15T20:59:31 +62ms service=snapshot hash=COMMIT_HASH_REDACTED
 cwd=/home/USER/repos/testproject git=/home/USER/.local/share/opencode/snapshot/PROJECT_ID_REDACTED tracking
INFO  2025-12-15T20:59:31 +2ms service=bus type=message.part.updated publishing
INFO  2025-12-15T20:59:31 +1ms service=bus type=message.updated publishing
INFO  2025-12-15T20:59:31 +64ms service=bus type=message.updated publishing
INFO  2025-12-15T20:59:31 +0ms service=bus type=session.status publishing
INFO  2025-12-15T20:59:31 +1ms service=session.prompt step=1 sessionID=ses_REDACTED loop
INFO  2025-12-15T20:59:31 +17ms service=bus type=session.updated publishing
INFO  2025-12-15T20:59:31 +0ms service=bus type=session.diff publishing
INFO  2025-12-15T20:59:31 +4ms service=bus type=message.updated publishing
INFO  2025-12-15T20:59:31 +1ms service=llm providerID=opencode modelID=gpt-5-nano sessionID=ses_REDACTED small=true agent=title stream
INFO  2025-12-15T20:59:31 +0ms service=llm providerID=opencode modelID=gpt-5-nano sessionID=ses_REDACTED small=true agent=title params={"options":{"reasoningEffort":"minimal","promptCacheKey":"ses_REDACTED","include":["reasoning.encrypted_content"],"reasoningSummary":"auto"}} params
INFO  2025-12-15T20:59:31 +7ms service=session.prompt sessionID=ses_REDACTED exiting loop
INFO  2025-12-15T20:59:31 +0ms service=session.compaction pruning
INFO  2025-12-15T20:59:31 +3ms service=session.prompt sessionID=ses_REDACTED cancel
INFO  2025-12-15T20:59:31 +0ms service=bus type=session.status publishing
INFO  2025-12-15T20:59:31 +0ms service=bus type=session.idle publishing
INFO  2025-12-15T20:59:31 +1ms service=default directory=/home/USER/repos/testproject disposing instance
INFO  2025-12-15T20:59:31 +0ms service=state key=/home/USER/repos/testproject waiting for state disposal to complete
INFO  2025-12-15T20:59:31 +0ms service=bus type=file.watcher.updated unsubscribing
ERROR 2025-12-15T20:59:31 +1ms service=state error=Bad file descriptor key=/home/USER/repos/testproject Error while disposing state:
INFO  2025-12-15T20:59:31 +0ms service=state key=/home/USER/repos/testproject state disposal completed

acbk2b · Dec 15 '25 21:12

Okay, so that works, but what happens if you try normal opencode (without the run command)?

Hangs forever? Blank Screen?

rekram1-node · Dec 15 '25 22:12

Exactly, yeah. It just hangs in the terminal after running opencode

acbk2b · Dec 15 '25 22:12

And you are on WSL on Windows, right? Is that true for everyone in here, or is it some on WSL and some not?

Hmmm

rekram1-node · Dec 15 '25 23:12

Yep, I am on WSL (stuck on version 1, unfortunately)

acbk2b · Dec 16 '25 00:12

@acbk2b this is unlikely to fix it, but can you try doing: OPENCODE_DISABLE_AUTOUPDATE=1 opencode

Spawning it w/ that env var?

rekram1-node · Dec 16 '25 01:12

No dice there either, unfortunately. I did also change from an npm install to a bun install, but that did not appear to make any noticeable difference.

acbk2b · Dec 16 '25 14:12

What versions are you trying? I heard some people say that 1.0.152 -> 1.0.153 had issues; maybe try 1.0.152 or lower?
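
If it's the npm install, pinning a version should just be something like this (assuming the opencode-ai package name that shows up in your log paths):

npm install -g [email protected]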

rekram1-node · Dec 16 '25 15:12

I was on 1.0.153, and I upgraded to 1.0.163 this morning and still have the same issue. I have tried a number of older ones; 1.0.150 did not work. IIRC 1.0.0 did work, though I didn't go through and incrementally figure out where the issue arose.

acbk2b · Dec 16 '25 15:12

> though I didn't go through and incrementally figure out where the issue arose

Yeah no need to do that.. hm okay

And does this happen no matter where you spawn it?

rekram1-node · Dec 16 '25 15:12

Interestingly, no; it seems to be most, but not all, coding projects.

From my testing:

Works

  • bash
  • ruby
  • plain old markdown files

Does not work

  • python
  • go
  • java
  • node

Which makes a guy think it is an LSP issue, but disabling that did not work for me:

{
  "lsp": false,
  "$schema": "https://opencode.ai/config.json"
}

acbk2b · Dec 16 '25 15:12

Wait, so if you open it in a dir with only markdown files it works, but in one that has actual code, like Python files, the TUI never shows?

rekram1-node · Dec 16 '25 16:12

Exactly, yeah. Something to do with coding projects/LSPs causes it to choke, it seems

acbk2b · Dec 16 '25 16:12

hm that's weird.. does the size of the project matter at all? like if you made a dir with a single python file, what happens?

Is it a git repo? or no?

Sorry, lotta questions, but I can't replicate.

rekram1-node · Dec 16 '25 17:12

All good, I appreciate the help! git repo or not does not appear to make a difference, but I was able to get opencode to open with a single Python/Java/Go file

I did a deeper dive on some of my various repos, and the behavior is a little more inconsistent than I thought... some go/java/python projects work and some do not. The plot thickens! I will check when I get some time whether there are any common themes across them; size, ironically, does not appear to be the distinguishing factor.

acbk2b · Dec 16 '25 18:12

I have one suspect in mind thus far. Note that the LSPs should only become an issue after file edits (if ever), but if the TUI isn't even spawning, it has to be something else.

rekram1-node · Dec 16 '25 20:12