
Support streaming input via pipes

Open lanmeibuxie opened this issue 7 months ago • 5 comments

When piping streaming AI output to Glow, it waits until all content is completely transmitted before rendering. This prevents users from seeing progressively generated content in real time, forcing them to wait until the entire output finishes before seeing any rendered result.

lanmeibuxie avatar May 30 '25 13:05 lanmeibuxie

Second this.

I love using glow together with llm like so:

llm "Say something in markdownish" | glow

Modern LLMs are tuned to output Markdown by default. Since they are slow, streaming would be the ideal use case here.

ARKAD97 avatar Jun 25 '25 15:06 ARKAD97

This is a very tricky one to solve for in a terminal.

  • In order to render markdown properly, you need to look at the entire document and re-render the entire output each time you receive input; it's not just a matter of rendering progressively to stdout (or stderr).
  • This means that you need to manage redrawing, so it requires a TUI to manage output.
  • To further complicate things, when the output exceeds the height of the terminal you must avoid printing anything taller than the terminal, to prevent text from jumping around.
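A minimal sketch of that last constraint, using only the standard library (the names here are illustrative, not Glow's actual API): on each full re-render, only the trailing terminal-height lines are drawn, so the redraw never paints past the bottom of the screen.

```go
package main

import (
	"fmt"
	"strings"
)

// tailLines returns at most height trailing lines of the rendered output,
// so a full re-render never exceeds the terminal height.
func tailLines(rendered string, height int) string {
	lines := strings.Split(rendered, "\n")
	if len(lines) <= height {
		return rendered
	}
	return strings.Join(lines[len(lines)-height:], "\n")
}

func main() {
	doc := "# Title\none\ntwo\nthree"
	// With a 2-line "terminal", only the last two lines are drawn.
	fmt.Println(tailLines(doc, 2))
}
```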

That said, this is how we solved for it in mods:

  1. We spin up a mini viewport with Bubble Tea that is the height of the terminal and re-render into it. The viewport tails the output, but the user can "scroll up" in the TUI while output is being streamed.
  2. When output is complete we exit the TUI and flush the rendered markdown to the scrollback buffer.

We could potentially add this to Glow, though it's unlikely we'll be able to support such an effort in the short term.

meowgorithm avatar Jun 25 '25 16:06 meowgorithm

Rendering content from the llm tool would be the main reason for us to use Glow, so streaming would greatly increase the usability and reach of this tool. (When the entire content is already present, it is much easier to render it with any non-TUI tool, so Glow offers less of an advantage there.)

aadrian avatar Jul 08 '25 13:07 aadrian

> When piping streaming AI output to Glow, it waits until all content is completely transmitted before rendering. This prevents users from seeing progressively generated content in real time, forcing them to wait until the entire output finishes before seeing any rendered result.

This is my little toy; feel free to try aimd.

n-WN avatar Jul 10 '25 20:07 n-WN

@lanmeibuxie (and others) See https://github.com/charmbracelet/glow/pull/823 and LMK if it works for you; I'm hoping we can rally the various streaming-related issues around this one, because it looks like at least 2-3+ issues will close if this patch lands (possibly more because it adds some process-level quality-of-life things like signal handling) ☺️

anthonyrisinger avatar Sep 12 '25 07:09 anthonyrisinger