
automatic body compression for text/event-stream

Open mfulton26 opened this issue 1 year ago • 2 comments

I figured out that I can compress an event stream myself using CompressionStream, but I wonder if event streams can be candidates for automatic body compression?

Currently Deno supports gzip and brotli compression. A body is automatically compressed if the following conditions are true:

  • The request has an Accept-Encoding header which indicates the requester supports br for Brotli or gzip. Deno will respect the preference of the quality value in the header.
  • The response includes a Content-Type which is considered compressible. (The list is derived from jshttp/mime-db with the actual list in the code.)
  • The response body is greater than 64 bytes.
Considering those conditions for event streams (a quick non-streaming example follows this list for contrast):

  1. gzip and deflate could be supported for text/event-stream
  2. plain text generally seems to be considered compressible
    • text/event-stream is not currently listed in jshttp/mime-db
    • there was once a pull request to mark it explicitly as not compressible; it cites buffering & flushing concerns with the npm compression middleware, and I don't know whether those same concerns apply to CompressionStream (https://github.com/jshttp/mime-db/pull/138)
  3. a streamed response body's size is unknown up front, but it is unlikely to total 64 bytes or less
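
For comparison, here is a minimal sketch of the ordinary, non-streaming case where all three conditions are met and Deno compresses the body with no extra code (the port and payload are made up):

// Compressible Content-Type and a body larger than 64 bytes, so Deno's
// automatic body compression applies when the client sends Accept-Encoding.
Deno.serve({ port: 8000 }, () =>
  new Response(JSON.stringify({ items: Array.from({ length: 50 }, (_, i) => i) }), {
    headers: { "content-type": "application/json" },
  }));

// curl -s -H "Accept-Encoding: gzip" -D - -o /dev/null http://localhost:8000/
// should show a `content-encoding: gzip` response header added by Deno.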

I think it is fine to require stream authors to compress their streams themselves if automatic body compression isn't appropriate here. If that's the case, it might be helpful to add a note to the manual's section on automatic body compression calling out that streams are not eligible for automatic compression but can easily be compressed by hand:

// `acceptsEncodings` is the content-negotiation helper from the std library
// (the version pin below is only an example).
import { acceptsEncodings } from "https://deno.land/std@0.216.0/http/negotiation.ts";

// Pick the client's preferred encoding among those CompressionStream supports.
const encoding = acceptsEncodings(request, "gzip", "deflate");
// Compress the SSE stream only if the client accepts gzip or deflate.
const body = encoding
  ? stream.pipeThrough(new CompressionStream(encoding))
  : stream;
const headers = new Headers({ "content-type": "text/event-stream" });
if (encoding) headers.append("content-encoding", encoding);
const response = new Response(body, { headers });
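
For completeness, a sketch of how that snippet might sit inside a request handler; the per-second "tick" events and the std version pin are illustrative assumptions:

import { acceptsEncodings } from "https://deno.land/std@0.216.0/http/negotiation.ts";

Deno.serve((request) => {
  // A trivial SSE source: one "tick" event per second (purely illustrative).
  let timer: number | undefined;
  const stream = new ReadableStream<Uint8Array>({
    start(controller) {
      const encoder = new TextEncoder();
      let n = 0;
      timer = setInterval(() => {
        controller.enqueue(encoder.encode(`data: tick ${n++}\n\n`));
      }, 1000);
    },
    cancel() {
      clearInterval(timer);
    },
  });

  // Same negotiation and compression logic as in the snippet above.
  const encoding = acceptsEncodings(request, "gzip", "deflate");
  const body = encoding
    ? stream.pipeThrough(new CompressionStream(encoding))
    : stream;
  const headers = new Headers({ "content-type": "text/event-stream" });
  if (encoding) headers.append("content-encoding", encoding);
  return new Response(body, { headers });
});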

mfulton26, Feb 23 '24

That would definitely make sense, but we currently don't have a good flushing strategy for streaming bodies. Ideally we'd like to ensure that buffered compressed data gets flushed after a short delay (or in the case of SSE, per-frame), but we cannot guarantee that compressed data will be flushed.
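
To make the buffering concern concrete, here is a small sketch; the 500 ms timeout is arbitrary and the exact behaviour depends on the underlying deflate implementation, but a single small SSE frame written into a CompressionStream may not surface on the readable side until more input arrives or the stream is closed:

const { readable, writable } = new CompressionStream("gzip");
const writer = writable.getWriter();
const reader = readable.getReader();

// Start waiting for the first compressed chunk, racing it against a timeout.
const firstChunk = Promise.race([
  reader.read().then(() => "got compressed bytes"),
  new Promise((resolve) => setTimeout(() => resolve("nothing after 500 ms"), 500)),
]);

// Write one small event frame, but keep the stream open, as an SSE server would.
writer.write(new TextEncoder().encode("data: hello\n\n"));

// Depending on how the compressor buffers, this can report "nothing after 500 ms";
// even if a header chunk appears, the frame's payload may stay in the compressor's
// buffer until more input arrives or the stream is closed.
console.log(await firstChunk);

// Closing the writer flushes everything, but for SSE that defeats the purpose.
await writer.close();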

mmastrac, Feb 23 '24

Is the flushing issue that the server has written an event terminated by multiple newlines, but the CompressionStream, due to the way the gzip and/or deflate algorithms work, is waiting for more bytes so it can compress the outgoing data more effectively?

That makes more sense to me now that I type it out. 🤔

So, as it stands, compressing an event stream may not be a good idea for servers because it could delay events being delivered to clients… I guess events should be kept small, then, and link to larger resources where necessary rather than inlining them.
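
A hedged illustration of that pattern (the event name and URL are made up): the frame stays tiny, and the client fetches the heavy payload through an ordinary, compressible response:

// Hypothetical pattern: send only a reference in the event, not the payload itself.
const frame = `event: report-ready\ndata: ${JSON.stringify({ href: "/reports/latest" })}\n\n`;
// The client reacts by fetching /reports/latest, a response Deno can compress automatically.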

mfulton26, Feb 23 '24

For whatever it's worth, I fiddled around with Compression Streams for SSE messages in my Deno app before realizing that I could just let Caddy handle it automatically, as it would for any response. You can even natively use zstd (and brotli, if you add an extra module for it). It compressed considerably better and faster than my DIY method.
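
For context, the Caddy side of that setup is roughly the following sketch (the site address and upstream port are assumptions):

# Caddyfile sketch: Caddy compresses responses from the Deno app behind it.
example.com {
    encode zstd gzip
    reverse_proxy localhost:8000
}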

It would be nice to not even need Caddy for this, though, so hopefully SSE/event-stream can be added to Deno's automatic compression.

nickchomey, Jan 23 '25