
Opencode stuck in "generating"

Open goszczynskip opened this issue 3 months ago • 48 comments

When submitting a prompt, opencode doesn't return any response and is unresponsive to a double-Esc key press (interrupt). It shows "generating". Pressing Ctrl-C twice exits the TUI but doesn't return to the terminal; a third Ctrl-C gets back to the terminal.

I've tried changing the model and the prompt; the result is the same.

- kitty 0.37.0
- opencode 0.6.8
- model: tested on Anthropic Sonnet 4 and GPT OSS 120B (Groq)

goszczynskip avatar Sep 09 '25 08:09 goszczynskip

This issue might be a duplicate of existing issues. Please check:

  • #2137: Very similar behavior - getting stuck at 'Generating' state with interruption issues across different models
  • #1418: Getting stuck on 'working' state with double ESC not exiting, also in terminal environment
  • #2494: Interrupt functionality (ESC twice) not working as expected
  • #1179: ESC interruption often doesn't work, leading to unusable sessions
  • #888: ESC interrupt oscillating between states without actually interrupting (closed but may be related)

Feel free to ignore if none of these address your specific case.

github-actions[bot] avatar Sep 09 '25 08:09 github-actions[bot]

Running without the TUI also hangs:

 redacted-user@redacted-host  ~/redacted   main ±  opencode run "Hello" --print-logs --model="anthropic/claude-sonnet-4-20250514"
INFO  2025-09-09T09:06:20 +52ms service=default version=0.6.8 args=["run","Hello","--print-logs","--model=anthropic/claude-sonnet-4-20250514"] opencode
INFO  2025-09-09T09:06:20 +0ms service=project directory=/Users/redacted-user/redacted-path fromDirectory
INFO  2025-09-09T09:06:20 +24ms service=config path=/Users/redacted-user/.config/opencode/config.json loading
INFO  2025-09-09T09:06:20 +0ms service=config path=/Users/redacted-user/.config/opencode/opencode.json loading
INFO  2025-09-09T09:06:20 +2ms service=config path=/Users/redacted-user/.config/opencode/opencode.jsonc loading
INFO  2025-09-09T09:06:20 +1ms service=plugin [email protected] loading plugin
INFO  2025-09-09T09:06:20 +1ms service=plugin [email protected] loading plugin
INFO  2025-09-09T09:06:20 +10ms service=bus type=* subscribing
INFO  2025-09-09T09:06:20 +0ms service=bus type=session.updated subscribing
INFO  2025-09-09T09:06:20 +0ms service=bus type=message.updated subscribing
INFO  2025-09-09T09:06:20 +0ms service=bus type=message.part.updated subscribing
INFO  2025-09-09T09:06:20 +0ms service=format init
INFO  2025-09-09T09:06:20 +1ms service=bus type=file.edited subscribing
INFO  2025-09-09T09:06:20 +0ms service=session id=ses_6d2461e5fffep9BBGQA63SmHWV version=0.6.8 projectID=eab9d385a6d2211e0051f4dea79ee790bd1e6df1 directory=/Users/redacted-user/redacted-path title=New session - 2025-09-09T09:06:20.704Z time={"created":1757408780704,"updated":1757408780704} created
INFO  2025-09-09T09:06:20 +0ms service=lsp serverIds=typescript, vue, eslint, gopls, ruby-lsp, pyright, elixir-ls, zls, csharp, rust, clangd enabled LSP servers
INFO  2025-09-09T09:06:20 +1ms service=bus type=session.updated publishing
INFO  2025-09-09T09:06:20 +1ms service=bus type=message.part.updated subscribing
INFO  2025-09-09T09:06:20 +0ms service=bus type=session.error subscribing
INFO  2025-09-09T09:06:20 +1ms service=session session=ses_6d2461e5fffep9BBGQA63SmHWV chatting
INFO  2025-09-09T09:06:20 +2ms service=bus type=message.updated publishing
INFO  2025-09-09T09:06:20 +0ms service=bus type=message.part.updated publishing
INFO  2025-09-09T09:06:20 +1ms service=bus type=session.updated publishing
INFO  2025-09-09T09:06:20 +2ms service=models.dev file={} refreshing
INFO  2025-09-09T09:06:20 +1ms service=provider init
INFO  2025-09-09T09:06:20 +0ms service=provider providerID=openrouter found
INFO  2025-09-09T09:06:20 +1ms service=provider providerID=anthropic found
INFO  2025-09-09T09:06:20 +0ms service=provider providerID=groq found
INFO  2025-09-09T09:06:20 +0ms service=provider providerID=opencode found
INFO  2025-09-09T09:06:20 +0ms service=provider providerID=anthropic modelID=claude-sonnet-4-20250514 getModel
INFO  2025-09-09T09:06:20 +0ms service=provider status=started providerID=anthropic getSDK
INFO  2025-09-09T09:06:20 +55ms service=provider status=completed duration=55 providerID=anthropic getSDK
INFO  2025-09-09T09:06:20 +0ms service=provider providerID=anthropic modelID=claude-sonnet-4-20250514 found
INFO  2025-09-09T09:06:20 +2ms service=session session=ses_6d2461e5fffep9BBGQA63SmHWV sessionID=ses_6d2461e5fffep9BBGQA63SmHWV locking
INFO  2025-09-09T09:06:20 +1ms service=provider providerID=anthropic modelID=claude-3-5-haiku-20241022 getModel
INFO  2025-09-09T09:06:20 +0ms service=provider status=started providerID=anthropic getSDK
INFO  2025-09-09T09:06:20 +0ms service=provider status=completed duration=0 providerID=anthropic getSDK
INFO  2025-09-09T09:06:20 +0ms service=provider providerID=anthropic modelID=claude-3-5-haiku-20241022 found
INFO  2025-09-09T09:06:20 +22ms service=bus type=message.updated publishing
INFO  2025-09-09T09:06:20 +1ms service=mcp key=figma type=remote found
INFO  2025-09-09T09:06:21 +1001ms service=bus type=session.updated publishing

[stuck here]

goszczynskip avatar Sep 09 '25 09:09 goszczynskip

I've tried deleting ~/.cache/opencode, but it didn't resolve the issue. It's still stuck on session.updated publishing.

goszczynskip avatar Sep 09 '25 09:09 goszczynskip

Claude Code also gets stuck like this sometimes; sending just a . can kick it and make it carry on.

BrianLeishman avatar Sep 09 '25 14:09 BrianLeishman

Does this hang happen regardless of directory?

rekram1-node avatar Sep 09 '25 15:09 rekram1-node

Yes, I've just tested it in a different directory — same behavior.

goszczynskip avatar Sep 09 '25 16:09 goszczynskip

Not sure if it helps, but I ran into this with Groq through my LiteLLM proxy. Because it's through LiteLLM, I saw this error in the logs section of the LiteLLM UI:

litellm.RateLimitError: RateLimitError: GroqException - {"error":{"message":"Request too large for model `moonshotai/kimi-k2-instruct-0905` in organization `xxxxxxxxxxxx` service tier `on_demand` on tokens per minute (TPM): Limit 10000, Requested 12874, please reduce your message size and try again. Need more tokens? Upgrade to Dev Tier today at https://console.groq.com/settings/billing","type":"tokens","code":"rate_limit_exceeded"}}

...it turns out my credit card expired, so they downgraded me to the free tier (with the silly low limit). Since you're also on Groq, check your limits... that might be it.
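For anyone else hitting this through a proxy: the Groq error body above is structured JSON, so you can pull the numbers out of it programmatically. A quick sketch (the helper name is mine, and Python is just for illustration):

```python
import json
import re

# The error body Groq returned in the LiteLLM logs above.
raw = '''{"error":{"message":"Request too large for model `moonshotai/kimi-k2-instruct-0905` in organization `xxxxxxxxxxxx` service tier `on_demand` on tokens per minute (TPM): Limit 10000, Requested 12874, please reduce your message size and try again. Need more tokens? Upgrade to Dev Tier today at https://console.groq.com/settings/billing","type":"tokens","code":"rate_limit_exceeded"}}'''

def parse_groq_rate_limit(body: str) -> dict:
    """Extract the limit/requested token counts from a Groq rate-limit error."""
    err = json.loads(body)["error"]
    if err.get("code") != "rate_limit_exceeded":
        return {}
    m = re.search(r"Limit (\d+), Requested (\d+)", err["message"])
    if not m:
        return {"limit": None, "requested": None, "over_by": None}
    limit, requested = int(m.group(1)), int(m.group(2))
    return {"limit": limit, "requested": requested, "over_by": requested - limit}

print(parse_groq_rate_limit(raw))  # {'limit': 10000, 'requested': 12874, 'over_by': 2874}
```

In this case the request overshot the free-tier TPM limit by 2874 tokens, which is why every retry of the same message fails the same way.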

colout avatar Sep 09 '25 20:09 colout

I'm having the same problem. I installed opencode today. I hit an AI_APICallError: "The messages parameter is illegal. Please check the documentation." I told it to continue; it read the file and responded, but after that it stopped working no matter how many times I told it to continue. After several attempts it gave this error:

DEBUG 2025-09-09T21:19:50 +1ms service=tui timeTakenMs=30 messages. renderView
ERROR 2025-09-09T21:19:50 +24ms service=session session=ses_6cfa6a9a3ffee6MUfm4v6n9461 error=Expected 'id' to be a string.
INFO 2025-09-09T21:19:50 +1ms service=bus type=session.error publishing
INFO 2025-09-09T21:19:50 +3ms service=bus type=message.updated publishing
INFO 2025-09-09T21:19:50 +0ms service=session session=ses_6cfa6a9a3ffee6MUfm4v6n9461 sessionID=ses_6cfa6a9a3ffee6MUfm4v6n9461 unlocking
INFO 2025-09-09T21:19:50 +1ms service=bus type=session.idle publishing
INFO 2025-09-09T21:19:50 +0ms service=server duration=4923 response
INFO 2025-09-09T21:19:50 +1ms service=project directory=/home/hack/Dev/fastdl-source fromDirectory
ERROR 2025-09-09T21:19:50 +0ms service=tui message=AI_InvalidResponseDataError: Expected 'id' to be a string. name=UnknownError Server error

After that it worked again.

Provider: z.ai Model: GLM-4.5-Flash

After updating to 0.6.9 it worked again for a while, then got stuck again.

ViniciusRed avatar Sep 09 '25 21:09 ViniciusRed

I was playing around and resolved this by downgrading versions sequentially to 0.6.5. After running again, I got it working, and it kept working after upgrading back to 0.6.10. I'm not sure if the downgrades fixed anything, or if the application was in a bad state and some LLM response got it unstuck.

The issue of ignored interrupt signals (esc+esc) during the session.updated publishing event is still present in the code, though, so I'm leaving this issue open.
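For context on why esc+esc can appear to do nothing: if the hanging step is simply awaited, the interrupt can only take effect after the await returns, which is never if the step hangs. The usual fix is to race the step against the interrupt signal. A minimal sketch of that pattern (Python asyncio purely for illustration; opencode is TypeScript, and every name here is hypothetical):

```python
import asyncio

async def publish_event(delay: float) -> str:
    # Stand-in for a bus publish / provider call that may never return.
    await asyncio.sleep(delay)
    return "published"

async def run_step(interrupt: asyncio.Event, delay: float) -> str:
    """Race the long-running step against the user's interrupt (esc+esc)."""
    publish = asyncio.create_task(publish_event(delay))
    stop = asyncio.create_task(interrupt.wait())
    done, pending = await asyncio.wait(
        {publish, stop}, return_when=asyncio.FIRST_COMPLETED
    )
    for task in pending:
        task.cancel()
        try:
            await task
        except asyncio.CancelledError:
            pass
    return "interrupted" if stop in done else publish.result()

async def main() -> str:
    interrupt = asyncio.Event()
    # Simulate the user pressing esc+esc shortly after the step starts hanging.
    asyncio.get_running_loop().call_later(0.01, interrupt.set)
    return await run_step(interrupt, delay=10.0)

print(asyncio.run(main()))  # interrupted
```

The key point is that the cancellation path exists regardless of whether the publish ever completes; without the race, the interrupt is just queued behind a wait that never ends.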

goszczynskip avatar Sep 10 '25 08:09 goszczynskip

Sometimes I hit the same problem.

NaikSoftware avatar Sep 10 '25 12:09 NaikSoftware

Yeah happens for me too. Unfortunately too often

maciejk-code avatar Sep 11 '25 18:09 maciejk-code

After 3 or 4 prompts, it just gets stuck.

mauriciojuniorsympla avatar Sep 11 '25 19:09 mauriciojuniorsympla

This seems related to MCP servers. @goszczynskip, can you disable the Figma MCP and try again?

The remote MCPs seem to be hanging.

Same goes for others in this thread: try disabling your remote MCPs.
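If a hanging remote MCP is the culprit, the usual defensive fix is to bound the connection with a timeout so one dead endpoint can't wedge the whole session. A hedged sketch of that idea (Python for illustration; `connect_remote_mcp` is a hypothetical stand-in, not opencode's actual API):

```python
import asyncio

async def connect_remote_mcp(url: str) -> str:
    # Stand-in for a remote MCP handshake that never answers.
    await asyncio.sleep(3600)
    return "connected"

async def connect_with_timeout(url: str, timeout: float = 0.05):
    """Skip a remote MCP that doesn't answer in time instead of hanging."""
    try:
        return await asyncio.wait_for(connect_remote_mcp(url), timeout=timeout)
    except asyncio.TimeoutError:
        return None  # disable this MCP for the session and keep going

print(asyncio.run(connect_with_timeout("https://example.invalid/mcp")))  # None
```

With a bound like this, a dead Figma or context7 endpoint would degrade to "MCP unavailable" instead of freezing the prompt at "Generating...".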

rekram1-node avatar Sep 11 '25 19:09 rekram1-node

Stuck again on the latest 0.9.0, after completing a plan in planning mode. All results render successfully, but it still shows "Generating..."

[screenshot: plan results rendered, status still shows "Generating..."]

NaikSoftware avatar Sep 15 '25 14:09 NaikSoftware

I am also getting stuck here sometimes. Even hitting esc to cancel does nothing; I have to Ctrl-C out. If I restart opencode with --continue, it is stuck back in that loop again. If I start a new instance of opencode, it seems OK.

I've let it sit for 3 hours and it was still stuck at this step.

I don't know where the context is stored for --continue, but it may make sense to check whether something in that stored context is putting opencode into this funky state.

bcardarella avatar Sep 23 '25 14:09 bcardarella

@bcardarella if you use the --continue flag you need to reprompt the LLM with at least a "continue" or a "."; are you trying that?

opencode --continue just opens up the last session (just want to make that clear)

rekram1-node avatar Sep 23 '25 15:09 rekram1-node

@rekram1-node next time this happens I'll do a screen recording to show the behavior, but what I was seeing was that opencode was unresponsive. It was stuck at "Generating..." and when I reprompted it, it just stalled as well. I didn't try . in the prompt though, so if I run into this again I'll give that a shot to see what happens.

bcardarella avatar Sep 23 '25 15:09 bcardarella

@bcardarella yeah, there is a UI bug, but if you exit opencode and reopen it, it is no longer in a Generating state; you would need to reprompt it in that case

rekram1-node avatar Sep 23 '25 15:09 rekram1-node

This happens when you hit rate limits on Claude. opencode does not show the rate-limit error and gets stuck in generating instead. If you open Claude Code you'll see the error.

imekinox avatar Sep 25 '25 04:09 imekinox

@imekinox I'm on the Max plan, signed in with the Claude Code login in opencode. Furthermore, this has happened when using it for the first time in over 24 hours.

bcardarella avatar Sep 25 '25 09:09 bcardarella

I'm on the Max plan too, and it just happened to me when I posted here. It might be something else in your case, but the client is definitely not handling rate limits properly.

imekinox avatar Sep 25 '25 13:09 imekinox

Perhaps? Is there a debug mode for opencode so we could see behind the curtain, or is Claude's API just a black box for this stuff?

bcardarella avatar Sep 25 '25 13:09 bcardarella

If you run with opencode run "<prompt>" --print-logs you can see the logs.

You could also just read the latest log file. Regardless, we need to figure out what their API returns in this case and surface it more clearly.

rekram1-node avatar Sep 25 '25 14:09 rekram1-node

Thank you, if I run into it again I'll try to capture the logs for you all.

bcardarella avatar Sep 25 '25 14:09 bcardarella

I've been getting this consistently on the Anthropic plans when I hit my usage limits. I'm not sure when it started occurring, but now it always just stays stuck on generating. opencode used to show the API-limit error in the top-right toast.

My workaround at the moment is opening Claude Code to check whether it's my usage limit when opencode is stuck on generating, and what time it resets (it would also be nice to see the reset time in opencode).

seaweeduk avatar Sep 25 '25 16:09 seaweeduk

Here's a test from the CLI:

opencode run "hi" --print-logs --model anthropic/claude-sonnet-4-20250514
INFO 2025-09-25T17:00:32 +63ms service=default version=0.11.3 args=["run","hi","--print-logs","--model","anthropic/claude-sonnet-4-20250514"] opencode
INFO 2025-09-25T17:00:32 +0ms service=project directory=/home/swd/dev fromDirectory
INFO 2025-09-25T17:00:32 +5ms service=config path=/home/swd/.config/opencode/config.json loading
INFO 2025-09-25T17:00:32 +1ms service=config path=/home/swd/.config/opencode/opencode.json loading
INFO 2025-09-25T17:00:32 +3ms service=config path=/home/swd/.config/opencode/opencode.jsonc loading
INFO 2025-09-25T17:00:32 +5ms service=plugin [email protected] loading plugin
INFO 2025-09-25T17:00:32 +1ms service=plugin [email protected] loading plugin
INFO 2025-09-25T17:00:32 +9ms service=bus type=* subscribing
INFO 2025-09-25T17:00:32 +0ms service=bus type=session.updated subscribing
INFO 2025-09-25T17:00:32 +0ms service=bus type=message.updated subscribing
INFO 2025-09-25T17:00:32 +0ms service=bus type=message.part.updated subscribing
INFO 2025-09-25T17:00:32 +0ms service=format init
INFO 2025-09-25T17:00:32 +0ms service=bus type=file.edited subscribing
INFO 2025-09-25T17:00:32 +1ms service=session id=ses_67e2e391bffe2WYWXUKLCXEozO version=0.11.3 projectID=global directory=/home/swd/dev title=New session - 2025-09-25T17:00:32.868Z time={"created":1758819632868,"updated":1758819632868} created
INFO 2025-09-25T17:00:32 +0ms service=lsp serverIds=typescript, vue, eslint, gopls, ruby-lsp, pyright, elixir-ls, zls, csharp, rust, clangd, svelte, jdtls enabled LSP servers
INFO 2025-09-25T17:00:32 +0ms service=bus type=session.updated publishing
INFO 2025-09-25T17:00:32 +1ms service=bus type=message.part.updated subscribing
INFO 2025-09-25T17:00:32 +0ms service=bus type=session.error subscribing
INFO 2025-09-25T17:00:32 +0ms service=session.prompt session=ses_67e2e391bffe2WYWXUKLCXEozO prompt
INFO 2025-09-25T17:00:32 +3ms service=bus type=message.updated publishing
INFO 2025-09-25T17:00:32 +0ms service=bus type=message.part.updated publishing
INFO 2025-09-25T17:00:32 +1ms service=bus type=session.updated publishing
INFO 2025-09-25T17:00:32 +1ms service=models.dev file={} refreshing
INFO 2025-09-25T17:00:32 +1ms service=provider init
INFO 2025-09-25T17:00:32 +1ms service=provider providerID=openrouter found
INFO 2025-09-25T17:00:32 +0ms service=provider providerID=openai found
INFO 2025-09-25T17:00:32 +0ms service=provider providerID=opencode found
INFO 2025-09-25T17:00:32 +0ms service=provider providerID=github-copilot found
INFO 2025-09-25T17:00:32 +0ms service=provider providerID=anthropic found
INFO 2025-09-25T17:00:32 +0ms service=provider providerID=anthropic modelID=claude-sonnet-4-20250514 getModel
INFO 2025-09-25T17:00:32 +1ms service=provider status=started providerID=anthropic getSDK
INFO 2025-09-25T17:00:32 +41ms service=provider status=completed duration=41 providerID=anthropic getSDK
INFO 2025-09-25T17:00:32 +0ms service=provider providerID=anthropic modelID=claude-sonnet-4-20250514 found
INFO 2025-09-25T17:00:32 +0ms service=session.prompt session=ses_67e2e391bffe2WYWXUKLCXEozO sessionID=ses_67e2e391bffe2WYWXUKLCXEozO locking
INFO 2025-09-25T17:00:32 +8ms service=mcp key=playwright type=local found
INFO 2025-09-25T17:00:33 +632ms service=mcp key=context7 type=remote found
INFO 2025-09-25T17:00:34 +773ms service=provider providerID=anthropic modelID=claude-3-5-haiku-20241022 getModel
INFO 2025-09-25T17:00:34 +0ms service=provider status=started providerID=anthropic getSDK
INFO 2025-09-25T17:00:34 +0ms service=provider status=completed duration=0 providerID=anthropic getSDK
INFO 2025-09-25T17:00:34 +0ms service=bus type=message.updated publishing
INFO 2025-09-25T17:00:34 +0ms service=provider providerID=anthropic modelID=claude-3-5-haiku-20241022 found
INFO 2025-09-25T17:00:34 +10ms service=session.prompt session=ses_67e2e391bffe2WYWXUKLCXEozO process
INFO 2025-09-25T17:00:34 +3ms service=session.prompt session=ses_67e2e391bffe2WYWXUKLCXEozO type=start part
ERROR 2025-09-25T17:00:41 +6666ms service=session.prompt session=ses_67e2e391bffe2WYWXUKLCXEozO error=Failed after 3 attempts. Last error: This request would exceed your account's rate limit. Please try again later. model=claude-3-5-haiku-20241022 failed to generate title

opencode just stays frozen at Generating...; no toast is shown in the top right.
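The log shows the failure does reach the logger; only the surfacing is missing. A tiny sketch of scanning --print-logs output for swallowed errors so they could be shown as a toast (field layout assumed from the log lines in this thread, not from opencode's code):

```python
LOG = (
    "INFO 2025-09-25T17:00:34 +3ms service=session.prompt "
    "session=ses_67e2e391bffe2WYWXUKLCXEozO type=start part\n"
    "ERROR 2025-09-25T17:00:41 +6666ms service=session.prompt "
    "session=ses_67e2e391bffe2WYWXUKLCXEozO error=Failed after 3 attempts. "
    "Last error: This request would exceed your account's rate limit. "
    "Please try again later. model=claude-3-5-haiku-20241022 failed to generate title"
)

def swallowed_errors(log: str) -> list[str]:
    """Pull the error= payload out of ERROR lines so it can be surfaced to the user."""
    hits = []
    for line in log.splitlines():
        if line.startswith("ERROR ") and "error=" in line:
            hits.append(line.split("error=", 1)[1])
    return hits

for err in swallowed_errors(LOG):
    print(err)
```

Anything that pops out of a scan like this while the TUI still says "Generating..." is exactly the error the toast used to show.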

seaweeduk avatar Sep 25 '25 17:09 seaweeduk

I'll try replicating, thx for the details

rekram1-node avatar Sep 25 '25 17:09 rekram1-node

Same issue. Using Grok Code Fast 1. The only way to bring it back to its senses is by turning the internet off, forcing an error.

After that, switch to Code Supernova 1M to finally make Build mode edit some code once again.

If you are not comfortable with that model, do a revert, switch back to Grok Code Fast 1, and redo the task while praying that it'll never get stuck this time.

EDIT: I also suspect that as long as a session has a Build mode stuck in "Generating..." for a particular model (e.g. Grok Code Fast 1), subsequent Build modes for the same model and session will also get stuck.

I will refer to these Build modes stuck in "Generating..." as dangling Build modes.

I noticed this when I was able to run Build mode with Code Supernova 1M despite a dangling Build mode in Grok Code Fast 1.

That might explain why cutting the internet off works in resetting everything, because it practically kills all the dangling Build modes.

raymelon avatar Sep 28 '25 07:09 raymelon

Same here. Happening with Grok 4 Fast on OpenRouter. I also noticed that I cannot stop openai/gpt-5-mini: no matter how much I press ESC, the model will finish its output, unimpressed by my attempts to stop its lengthy explanations of what I have just witnessed it doing.

I'm using the latest opencode in tmux running inside wezterm.

lukidoescode avatar Sep 28 '25 11:09 lukidoescode

It's happening for me as well, but only with GitHub Copilot models. I'm getting a "too many requests" error.

fredrikaverpil avatar Sep 28 '25 11:09 fredrikaverpil