
Add CompressionStream

Open Symbitic opened this issue 2 years ago • 27 comments

What is the problem this feature would solve?

The compress middleware in Hono depends on CompressionStream, which isn't implemented by WebKit (and therefore isn't available in Bun). Adding it would improve the performance of transferring static assets such as CSS and SVG files.

What is the feature you are proposing to solve the problem?

Implement CompressionStream in Bun. That would automatically allow the compress middleware in Hono to work, and it would enable more Bun-optimized frameworks like Elysia to design their own middleware.
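For reference, this is the kind of round trip the standard API enables once it exists. A minimal sketch using only identifiers from the WHATWG Compression Streams spec (nothing Bun-specific); it already runs in browsers, Deno, and Node 18+:

```javascript
// Round-trip some text through the standard CompressionStream /
// DecompressionStream globals using 'gzip' as the format.
async function gzipRoundTrip(text) {
  // Compress: pipe the input bytes through a gzip CompressionStream.
  const compressed = new Blob([new TextEncoder().encode(text)])
    .stream()
    .pipeThrough(new CompressionStream('gzip'))
  const bytes = await new Response(compressed).arrayBuffer()

  // Decompress the result again through a DecompressionStream.
  const restored = new Blob([bytes])
    .stream()
    .pipeThrough(new DecompressionStream('gzip'))
  return new Response(restored).text()
}
```

The same pattern is what middleware like Hono's compress uses internally: it pipes the response body through `new CompressionStream('gzip')` before sending it.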

What alternatives have you considered?

It might be possible to implement CompressionStream in WebKit and use that.

Symbitic avatar Jan 04 '23 23:01 Symbitic

Yeah we need to do this

WebKit has a CompressionStream/DecompressionStream we could use and uWebSockets (our HTTP server lib) also has builtin support for compression. Neither are wired up yet.

Jarred-Sumner avatar Jan 05 '23 00:01 Jarred-Sumner

Prerequisite 3b259211df10e06ddf5844898a6ffa3c9679e4d8

Jarred-Sumner avatar Jan 05 '23 02:01 Jarred-Sumner

Out of curiosity, what is the state of this issue? (My app really depends on Hono's compression middleware 😅)

vesamet avatar Feb 26 '23 09:02 vesamet

My Node app uses DecompressionStream to unpack npm packages "manually", i.e. without official npm tooling. I would very much like to switch to Bun, but this is blocking me ATM.

cloudspeech avatar Jul 09 '23 13:07 cloudspeech

We just tried to port our proxy from Deno to Bun, but sadly Bun doesn't support compression on its own. So we would have to use https://hono.dev/middleware/builtin/compress with Bun. Is there another way to handle HTTP compression in Bun?

marc-barry avatar Jul 16 '23 03:07 marc-barry

Any news on this issue? I also use Hono and want to use its compress middleware.

niklasgrewe avatar Jul 31 '23 10:07 niklasgrewe

Our backend is unable to run without DecompressionStream, putting in my vote for this!

venkatd avatar Aug 28 '23 14:08 venkatd

Any news about this issue?

tomek-f avatar Sep 10 '23 05:09 tomek-f

Out of curiosity, what is the state of this issue? (My app really depends on Hono's compression middleware 😅)

+1 for the same reason. Though I can disable it for now, it would be great to have this soon.

ewrogers avatar Sep 10 '23 22:09 ewrogers

One could try https://github.com/101arrowz/compression-streams-polyfill in the meantime.

cloudspeech avatar Sep 15 '23 11:09 cloudspeech

I wired up my own CompressionStream polyfill that is based on zlib instead of fflate:

// @bun

/*! MIT License. Jimmy Wärting <https://jimmy.warting.se/opensource> */
import zlib from 'node:zlib'

// FYI: byte streams aren't really implemented anywhere yet.
// They only exist as an issue: https://github.com/WICG/compression/issues/31

const make = (ctx, handle) => Object.assign(ctx, {
  writable: new WritableStream({
    write: chunk => handle.write(chunk),
    close: () => handle.end()
  }),
  readable: new ReadableStream({
    type: 'bytes',
    start (ctrl) {
      handle.on('data', chunk => ctrl.enqueue(chunk))
      handle.once('end', () => ctrl.close())
    }
  })
})

globalThis.CompressionStream ??= class CompressionStream {
  constructor(format) {
    make(this, format === 'deflate' ? zlib.createDeflate() :
    format === 'gzip' ? zlib.createGzip() : zlib.createDeflateRaw())
  }
}

globalThis.DecompressionStream ??= class DecompressionStream {
  constructor(format) {
    make(this, format === 'deflate' ? zlib.createInflate() :
    format === 'gzip' ? zlib.createGunzip() :
    zlib.createInflateRaw())
  }
}

(This is only needed in Bun; all other environments have CompressionStream at this point.)

jimmywarting avatar Oct 22 '23 19:10 jimmywarting

I think this might be relevant here: I was doing stuff with gzip streams in Bun and found a weird bug with the following code:

import { createGzip } from 'node:zlib'
import { Readable, pipeline } from 'node:stream'

const gzip = createGzip()

pipeline(
    Readable.toWeb(process.stdin),
    gzip,
    process.stdout,
    () => {},
)

Now, you should be able to run the program with echo <some data> | bun/node <path> > /dev/null, and both runtimes should behave the same way. But while testing with random data and large streams generated by dd at bs=1K, I found that from count=16 upward the program hangs forever:

  • dd if=/dev/urandom bs=1K count=15 | bun std-gzip-pipe.js > /dev/null: works in under 0.002 secs on my machine
  • dd if=/dev/urandom bs=1K count=16 | bun std-gzip-pipe.js > /dev/null: hangs forever on bun, node works however

It turns out to be a stdio issue.
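Given that the hang involves stdio, one way to isolate the Readable.toWeb() wrapper is a variant where every stage is a classic Node stream. A sketch under that assumption; `source`, `sink`, and `done` are placeholders for process.stdin, process.stdout, and an error callback, not part of any real API:

```javascript
import { createGzip } from 'node:zlib'
import { pipeline } from 'node:stream'

// Same shape as the snippet above, but with no Readable.toWeb() wrapper:
// source -> gzip transform -> sink, all classic Node streams.
function gzipPipe(source, sink, done) {
  pipeline(source, createGzip(), sink, done)
}

// e.g. gzipPipe(process.stdin, process.stdout, (err) => { if (err) throw err })
```

If this version survives the dd repro while the toWeb() version hangs, that points at the web-stream adapter rather than zlib itself.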

javalsai avatar Jan 04 '24 22:01 javalsai

Any news about this issue?

sbenzemame avatar Jan 08 '24 21:01 sbenzemame

This is a bit of a blocker for a use case I have for Bun. In the meantime I can use third-party packages, but I hope this is supported soon.

danthegoodman1 avatar Jan 20 '24 22:01 danthegoodman1

Before this issue is resolved, you can use https://www.npmjs.com/package/bun-compression. I just ported the compression middleware that Elysia was using over to Hono.

sunneydev avatar Feb 03 '24 04:02 sunneydev

Does this bug also cover the DecompressionStream implementation? If so, can someone update the bug title?

codedread avatar Feb 05 '24 05:02 codedread

My deduplication backup [ddb] software uses streams with compression; specifically, it .pipe()s to and from a compressor (gzip/gunzip). It operates over HTTP as well as over the file system. Loading an entire file into memory is of course not an option, so it has to use streams.
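The pipe-through-a-compressor pattern described above can be sketched with plain Node streams. The helper names (`compressTo`, `decompressTo`) are made up for illustration; `input` and `output` can be any Node streams (file, HTTP request/response, ...):

```javascript
import { createGzip, createGunzip } from 'node:zlib'

// Data flows chunk by chunk through the zlib transform, so the whole
// file never has to sit in memory.
const compressTo = (input, output) => input.pipe(createGzip()).pipe(output)
const decompressTo = (input, output) => input.pipe(createGunzip()).pipe(output)
```

For example, `compressTo(fs.createReadStream('backup.dat'), res)` would stream a gzip-compressed file into an HTTP response without buffering it.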

Mehuge avatar Mar 08 '24 17:03 Mehuge

It's on the roadmap https://github.com/oven-sh/bun/issues/159

I, too, hope for an implementation soon. 🙏☺️

danielniccoli avatar Mar 29 '24 21:03 danielniccoli