CompressionStream hangs
Version
v21.6.1
Platform
Darwin ****** 23.0.0 Darwin Kernel Version 23.0.0: Fri Sep 15 14:42:42 PDT 2023; root:xnu-10002.1.13~1/RELEASE_X86_64 x86_64
Subsystem
globals, streams/web
What steps will reproduce the bug?
function from(src) {
  return new ReadableStream({
    start(controller) {
      controller.enqueue(src);
      controller.close();
    },
  });
}

async function read(stream) {
  const reader = stream.getReader();
  const chunks = [];
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    chunks.push(value);
  }
  return chunks;
}

await read(from(new ArrayBuffer([0, 0, 0, 0])).pipeThrough(new CompressionStream('gzip')));
How often does it reproduce? Is there a required condition?
Every time
What is the expected behavior? Why is that the expected behavior?
Should resolve to [Uint8Array(10), Uint8Array(10)].
What do you see instead?
Nothing - reader.read() hangs and never resolves.
Additional information
Running the code snippet in the Chrome DevTools console produces the correct outcome. Using 'deflate' rather than 'gzip' has no effect; it still hangs.
I explored the issue, and it appears that Node.js expects the "chunk" argument of the _write function to be a string, Buffer, or Uint8Array, not an ArrayBuffer.
A potential workaround involves converting the ArrayBuffer to a Buffer before processing it:
const buffer = Buffer.from(new ArrayBuffer([0, 0, 0, 0]));
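As a rough sketch (my own illustration, assuming the from() and read() helpers from the reproduction above), the workaround applied to the repro looks like this:

// Wrap the ArrayBuffer in a Buffer (a Uint8Array subclass) before it enters the
// pipeline, so the underlying Writable receives a chunk type it already accepts.
const input = Buffer.from(new ArrayBuffer([0, 0, 0, 0]));
const chunks = await read(from(input).pipeThrough(new CompressionStream('gzip')));
console.log(chunks); // completes instead of hanging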
@nodejs/streams I experimented with converting ArrayBuffer to Buffer directly within the _write function like this:
if (Stream._isAnyArrayBuffer(chunk)) {
  encoding = 'buffer';
  chunk = Buffer.from(chunk, encoding);
}
It worked, but I'm not convinced it's the best way to add support for ArrayBuffer.
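For context, a small sketch (my own illustration, not from the proposed patch): Buffer.from, when given an ArrayBuffer, creates a Buffer view over the same memory rather than copying it, so the conversion above should be cheap.

// Buffer.from(arrayBuffer) wraps the existing memory without copying.
const ab = new ArrayBuffer(4);
new Uint8Array(ab).set([1, 2, 3, 4]);
const view = Buffer.from(ab);
console.log(view);               // <Buffer 01 02 03 04>
console.log(view.buffer === ab); // true: same underlying ArrayBuffer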
Hi @IlyasShabi, would you like to open a PR with your solution? I think it's worth giving it a try.