
Devstral via LiteLLM OpenAI-compatible fails with invalid_request_message_order on first file edit

Open · MaximilianHess opened this issue 3 weeks ago • 2 comments

Description

When OpenCode is configured to use Devstral (devstral-2512) through the OpenAI-compatible provider path (a LiteLLM OpenAI-compatible endpoint), the run fails as soon as the agent performs a file edit: the failure occurs on the request OpenCode sends to continue after the edit.

Error message:

litellm.BadRequestError: MistralException - {"object":"error","message":"Expected last role User or Tool (or Assistant with prefix True) for serving but got assistant","type":"invalid_request_message_order","param":null,"code":"3230"}. Received Model Group=devstral-2512 Available Model Group Fallbacks=None

The same workflow does not fail when Devstral is used through the native Mistral provider path (mistral_v1).
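
I have not captured the raw request, but going by the error text my assumption is that after the edit tool call, the conversation forwarded to Mistral ends with an assistant message that does not carry Mistral's prefix flag. Roughly like this (the contents are invented; only the role order matters):

[
  { "role": "user", "content": "please edit foo.py ..." },
  { "role": "assistant", "tool_calls": [ { "id": "call_1", "type": "function",
      "function": { "name": "edit", "arguments": "{ ... }" } } ] },
  { "role": "tool", "tool_call_id": "call_1", "content": "edit applied" },
  { "role": "assistant", "content": "..." }
]

Mistral rejects this because, as the error says, the last role has to be user or tool unless the trailing assistant message is marked with prefix: true. The OpenAI chat format has no such field, so presumably @ai-sdk/mistral sets it while @ai-sdk/openai-compatible cannot, which would explain why only the OpenAI-compatible path fails.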

Our config:

{
  "$schema": "https://opencode.ai/config.json",

  "tools": {
    "bash": true,
    "edit": true,
    "write": true,
    "read": true,
    "grep": true,
    "glob": true,
    "list": true,
    "lsp": true,
    "patch": true,
    "skill": true,
    "todowrite": true,
    "todoread": true,
    "webfetch": true

  },

  "provider": {
    "ascii-mistral": {
      "npm": "@ai-sdk/mistral",
      "name": "ASCII LiteLLM (Mistral native)",
      "options": {
        "baseURL": "https://llm.ascii.ac.at/mistral/v1",
        "apiKey": "{env:ASCII_AGENTIC_CODING_KEY}"
      },
      "models": {
        "devstral-2512": {
          "name": "devstral-2512 (via LiteLLM)",
          "limit": { "context": 256000, "output": 256000 },
          "tool_call": true,
        },
        "codestral-2508": {
          "name": "codestral-2508 (via LiteLLM)",
          "limit": { "context": 128000, "output": 128000 },
          "tool_call": true,
        },
      }
    },
    "ascii-oai-compatible": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "ASCII LiteLLM",
      "options": {
        "baseURL": "https://llm.ascii.ac.at/v1",
        "apiKey": "{env:ASCII_AGENTIC_CODING_KEY}"
      },
      "models": {
        "devstral-2512": {
          "name": "devstral-2512 (via LiteLLM)",
          "limit": { "context": 256000, "output": 256000 },
          "tool_call": true,
        },
        "codestral-2508": {
          "name": "codestral-2508 (via LiteLLM)",
          "limit": { "context": 128000, "output": 128000 },
          "tool_call": true,
        },
      }
    }
  },

  "model": "ascii-oai-compatible/devstral-2512"
}


Is this likely an opencode or a litellm issue?
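
One way I could try to narrow it down is to replay a conversation that ends with a bare assistant message directly against the LiteLLM endpoint, bypassing OpenCode entirely. A minimal sketch with the openai Python client (base URL, key variable and model name are taken from the config above; the messages themselves are made up):

# Sketch only: try to reproduce the rejection without OpenCode in the loop.
# Assumes the `openai` Python package; endpoint, key env var and model name
# are copied from the config above.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://llm.ascii.ac.at/v1",
    api_key=os.environ["ASCII_AGENTIC_CODING_KEY"],
)

# The conversation deliberately ends with an assistant message, which Mistral
# only accepts when the assistant-prefix flag is set.
resp = client.chat.completions.create(
    model="devstral-2512",
    messages=[
        {"role": "user", "content": "Say hello."},
        {"role": "assistant", "content": "Hel"},
    ],
)
print(resp.choices[0].message.content)

If this already fails with the same invalid_request_message_order error, then LiteLLM/Mistral reject this message shape regardless of OpenCode, and the question becomes whether the openai-compatible provider path should be sending a trailing assistant message (or somehow setting Mistral's prefix flag) at all.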

OpenCode version

1.0.207

Steps to reproduce

No response

Screenshot and/or share link

No response

Operating System

No response

Terminal

No response

MaximilianHess · Dec 29 '25, 10:12