
TUI doesn't render messages from prompt_async endpoint

Open xnoto opened this issue 1 day ago • 1 comment

Problem

Messages sent via POST /session/:id/prompt_async are processed by the LLM but don't appear in the TUI. The AI responds, but users see nothing.

Steps to Reproduce

  1. Start opencode (which connects to a running server on port 4096)
  2. Note the session ID
  3. From another process, call POST localhost:4096/session/{id}/prompt_async with a message
  4. LLM processes and responds (confirmed via API response)
  5. The TUI remains blank: neither the prompt nor the response is shown
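Step 3 can be scripted for reproduction. A minimal sketch in TypeScript, assuming the default port 4096 from step 1; the request body shape (`{ parts: [{ type: "text", text }] }`) is an assumption for illustration, not confirmed from API docs:

```typescript
// Sketch of injecting a prompt into a running opencode session over HTTP.
// The body schema below is an assumption; check the server's API for the
// exact shape it expects.

const BASE_URL = "http://localhost:4096"; // default opencode server port

// Build the request separately so it can be inspected without a server.
function buildPromptAsyncRequest(sessionId: string, text: string) {
  return {
    url: `${BASE_URL}/session/${sessionId}/prompt_async`,
    init: {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ parts: [{ type: "text", text }] }),
    },
  };
}

// Fire-and-forget: prompt_async returns before the LLM finishes responding.
async function sendPrompt(sessionId: string, text: string): Promise<number> {
  const { url, init } = buildPromptAsyncRequest(sessionId, text);
  const res = await fetch(url, init);
  return res.status; // a 2xx here is what "confirmed via API response" means
}
```

Running `sendPrompt` against a live session reproduces the bug: the call succeeds and the LLM answers, but the TUI attached to that session shows nothing.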

Expected

Messages injected via prompt_async should render in TUI like user-typed messages.

Why This Matters

Any external tool or automation that uses the API to send prompts works correctly at the LLM level, but users watching the TUI can't see what's happening. The conversation is invisible.

Use Case

We run multiple opencode instances (agents) against a single server, each in different project directories. A daemon monitors a shared message queue and delivers messages between agents using prompt_async.

The agents collaborate - they receive messages, process them, and respond. It all works at the API level. But when we watch an agent's TUI, we see nothing. The agent handled 4 messages autonomously, but only 2 user-typed exchanges were visible.

We'd prefer LLM activity to be visible in the TUI session, rather than having API-driven conversations happen completely invisibly in the background.
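The daemon described above amounts to a queue-poll-and-deliver loop. A hypothetical sketch, where `QueueMessage`, the `Queue` interface, and the request body shape are all illustrative; only the `prompt_async` endpoint itself comes from this report:

```typescript
// Hypothetical shape of the message-routing daemon from the use case.
// Each queued message is addressed to a receiving agent's session and
// delivered via prompt_async.

interface QueueMessage {
  toSessionId: string; // opencode session of the receiving agent
  text: string;
}

// Minimal stand-in for the shared message queue (illustrative).
type Queue = { shift(): QueueMessage | undefined };

async function deliver(baseUrl: string, msg: QueueMessage): Promise<void> {
  // The LLM processes this prompt, but (per this issue) the receiving
  // agent's TUI never renders it.
  await fetch(`${baseUrl}/session/${msg.toSessionId}/prompt_async`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ parts: [{ type: "text", text: msg.text }] }),
  });
}

async function runDaemon(baseUrl: string, queue: Queue): Promise<void> {
  for (;;) {
    const msg = queue.shift();
    if (msg) await deliver(baseUrl, msg);
    else await new Promise((r) => setTimeout(r, 250)); // idle poll interval
  }
}
```

At this level everything works; the gap is purely in what the human watching each agent's TUI can see.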

Possible Fixes

  1. Render prompt_async prompts in the TUI - treat them like user input
  2. Add a new endpoint that explicitly renders in the TUI
  3. Have the TUI subscribe to all session messages and render them regardless of source
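Fix 3 reduces to dropping a provenance filter in the TUI's message handler, if one plausible cause (the TUI only rendering messages it originated) is correct. A sketch of the predicate change; the event shape and `source` field are assumptions for illustration, not actual opencode internals:

```typescript
// Sketch of fix 3: render every message event addressed to the active
// session, regardless of whether it came from the TUI or the HTTP API.
// `SessionMessageEvent` and `source` are hypothetical names.

interface SessionMessageEvent {
  sessionId: string;
  source: "tui" | "api"; // hypothetical provenance tag
  text: string;
}

// Hypothesized current behavior: only TUI-originated messages render,
// so prompt_async traffic is silently dropped.
function shouldRenderCurrent(e: SessionMessageEvent, active: string): boolean {
  return e.sessionId === active && e.source === "tui";
}

// Proposed behavior: render anything addressed to the active session.
function shouldRenderProposed(e: SessionMessageEvent, active: string): boolean {
  return e.sessionId === active;
}
```

Under the proposed predicate, the daemon-delivered exchanges in the use case above would appear in the agent's TUI alongside user-typed ones.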

Environment

  • OpenCode: latest
  • OS: macOS

xnoto · Jan 15 '26 00:01