[MiniMax M2] Agent stops in the middle of the work
Description
~~1. Incorrect Side Panel Title (Context Header)~~
The title/header of the right-hand side panel, which appears to contain session context, metadata, and task lists, is displaying an outdated or generic title.
- Problem: The header currently reads "New session - 2025-11-07T07:33:05.124Z".
- Context: The title should be updated to reflect the actual status or the user's current task within the agent's environment.
2. Agent Stops in the Middle of the Work (Incomplete Execution)
The interaction flow suggests the AI agent (MiniMax) failed to complete its intended action after a code modification step and is now waiting for user input, indicating a premature halt or failure in the execution pipeline.
I have to manually keep responding "Please continue".
OpenCode version
1.0.49
Steps to reproduce
No response
Screenshot and/or share link
Operating System
macOS 26.2 beta
Terminal
Ghostty
This issue might be a duplicate of existing issues. Please check:
- #4073: Both issues describe agents stopping/hanging during execution and not completing their intended work. Issue #4073 specifically mentions agents hanging during tasks with ESC becoming unresponsive, which seems very similar to your MiniMax M2 agent stopping in the middle of work.
Feel free to ignore if none of these address your specific case.
@lst97 do you not have any OpenRouter credits? When you select a model, we use a tiny, cheap model that your provider offers for title generation; if you are using OpenRouter, that would be Haiku 4.5.
It seems like it is failing for you, though.
@rekram1-node Ah i see, thank you for your response :)
@lst97 for now I'd recommend setting "small_model" in your global opencode.json.
Choose any model from the output of "opencode models".
Of course, choose a model you have access to; it can even be a free one like:
opencode/big-pickle
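For example, a minimal global opencode.json with that setting might look like the sketch below (assuming the "small_model" key and the free model named above; substitute any model you have access to from "opencode models"):

```json
{
  "small_model": "opencode/big-pickle"
}
```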
Regarding Issue 2, it may not be an issue with opencode.
According to the official announcement: after 24:00 UTC on November 7, the RPM for free users will be reduced. If you are running high-concurrency tasks, they recommend switching to MiniMax-M2-Stable in advance or adjusting your task concurrency to ensure uninterrupted execution.
However, it would be better for opencode to display a message if that is the cause.
ref: https://x.com/MiniMax__AI/status/1986815058249408541
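A hypothetical sketch of that suggestion (not opencode's actual error handling; the function name and messages are made up): map a provider rate-limit response to a user-visible message instead of stalling silently.

```python
def describe_provider_error(status: int) -> str:
    """Map a provider HTTP status to a user-visible message.
    Hypothetical helper: opencode's real error handling differs."""
    if status == 429:
        # HTTP 429 Too Many Requests: the provider is rate-limiting us.
        return ("Provider rate limit hit (free-tier RPM was reduced); "
                "retry later or switch to MiniMax-M2-Stable.")
    if status >= 500:
        return "Provider server error; retrying may help."
    return f"Provider returned HTTP {status}."

print(describe_provider_error(429))
```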
I'm currently on the new MiniMax Coding Plan and it keeps happening all the time. Execution stops and I need to say something like "resume" to get it going again.
Needless to say, in its current state (v1.0.55) opencode is almost unusable with MiniMax M2; it halts after nearly every turn.
@bennyzen if the model stops working, that's typically a model/prompting thing, unless you are seeing an error? Otherwise the model itself is saying it is done.
https://github.com/MiniMax-AI/MiniMax-M2/issues/37
@rekram1-node I'm not so sure about that, as other coding agents don't show this behavior with MiniMax M2.
I can confirm the same issue and it only happens in Opencode for me. Other harnesses work just fine.
@bennyzen @lst97 what version of opencode are you running? OpenRouter fixed some issues, and we updated the package we use to pick up the newest fixes.
@rekram1-node Thank you for the response. I have updated to 1.0.216 and done some simple testing. The results are positive; it looks like the issue has been fixed. I will conduct more testing during this week and will close the issue if it stays fixed. Thanks 💯
UPDATE:
I am not using OpenRouter for the M2 model; instead I configured the provider by modifying opencode.json:
"provider": {
  "minimax": {
    "npm": "@ai-sdk/anthropic",
    "options": {
      "baseURL": "https://api.minimax.io/anthropic/v1",
      "apiKey": "*"
    },
    "models": {
      "MiniMax-M2": {
        "name": "MiniMax-M2"
      }
    }
  }
}
I just realized that an official MiniMax provider was added to the auth login in opencode; I will test that provider instead in the coming week.
The same issue still appears on opencode 1.0.216 with auth login set to the MiniMax provider.
+1, same issue. Version: 1.0.126, Model: minimax/MiniMax-M2
I have tried all three methods:
- opencode auth login -> MiniMax provider
- opencode auth login -> Others
- adding the MiniMax provider in the opencode.json config

With all three methods, the MiniMax agent stopped in the middle of the work.
I wonder if it's due to a maxed-out context window. From my share link (https://opencode.ai/s/adJ4a7bx), the input token count is 232117, which is larger than this model's context window.
Same here. Hope this could be fixed soon.
Thanks!
I'm encountering the same issue with Gemini 3 Pro (from Google APIs directly, see #4953). I think it's more likely an issue within opencode and not much to do with the specific model.
This issue has more traction, so I figured we'd consolidate here. Is there anything in particular I can do to debug?
@0dragosh it has to do with interleaved-thinking models, but anything regarding OpenRouter is an issue with their API or their SDK. I have been talking with them about several issues and they are still addressing some.
As for MiniMax outside of OpenRouter, that's separate and I'll have to make an account to check it out.
I appreciate the insight! For me it happens with Gemini 3 Pro from Google APIs directly (not OpenRouter).
I think PR https://github.com/sst/opencode/pull/4933 fixes this issue, but I'm not sure why it was closed.
Can anyone send me example sessions of this happening? I have a MiniMax Coding Plan set up and am trying to reproduce the stop behavior mentioned here, but MiniMax through the minimax provider seems to work great for me.
If anyone can replicate the issue, please send me your session:
opencode export > session.json
or send me steps to replicate the issue.
Also please tell me:
- are you on MiniMax or MiniMax China?
- did you set the provider in your opencode.json, or are you using the default? (The default works for me; you shouldn't need custom config here.)
Okay, I managed to replicate it. I set up a proxy to validate, and in fact the MiniMax provider IS telling us the response is done. I still don't think this is an opencode bug; we are doing exactly what their API/model is telling us.
Artifacts for evidence:
Share link of the conversation (note the model says it's about to do something, then stops after a read): https://docs.dev.opencode.ai/s/djgvEIGD
Here is a gist containing a conversation where the model stops (both request and response), as well as a basic conversation where I just send "hello".
In both cases their API responds with the same stop reason; it seems like they aren't correctly emitting a tool-call stop reason.
Stopped randomly excerpt:
event: message_delta
data: {"type":"message_delta","delta":{"stop_reason":"end_turn"},"usage":{"input_tokens":42131,"output_tokens":355}}
event: message_stop
data: {"type":"message_stop"}
Hello excerpt:
event: message_delta
data: {"type":"message_delta","delta":{"stop_reason":"end_turn"},"usage":{"input_tokens":13613,"output_tokens":104}}
event: message_stop
data: {"type":"message_stop"}
Here is my gist with the full payloads for each: https://gist.github.com/rekram1-node/96907df65c6385d534151f6d35ae8fcd
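For reference, in Anthropic's Messages API format a response that intends to call a tool ends with stop_reason "tool_use", while "end_turn" means the model considers itself finished; a harness keeps the agent loop going only on "tool_use". A minimal sketch of that branching (the helper name is hypothetical; the payload is the captured excerpt above):

```python
import json

def should_continue(message_delta_json: str) -> bool:
    """Return True only when an Anthropic-style `message_delta` payload
    carries stop_reason "tool_use", i.e. the model wants the harness to
    run a tool and send the results back for another turn."""
    delta = json.loads(message_delta_json).get("delta", {})
    return delta.get("stop_reason") == "tool_use"

# The captured MiniMax payload reports "end_turn" even mid-task,
# so a compliant harness stops the loop:
payload = ('{"type":"message_delta","delta":{"stop_reason":"end_turn"},'
           '"usage":{"input_tokens":42131,"output_tokens":355}}')
print(should_continue(payload))  # False
```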
Okay, I presented it to their team, and the issue is in fact on their end; they said they will be addressing it soon.
So it's not an opencode bug.
Okay, they said they fixed it; I'm doing some testing myself to see if the issue persists. If I cannot replicate it, I will close this, but please comment if you continue having issues and we will reopen and continue the discussion with their team.
I tested quite a bit and couldn't get it to stop randomly!
For anyone using MiniMax through other providers, let me know if you hit snags, but it should now function properly through the OpenRouter and MiniMax providers.
You rock!! Thanks