No brotli in zlib
The available zlib implementation doesn't support brotli. It's also structured significantly differently from the current node.js version of the library.
Bun doesn't currently include a brotli implementation, but it really should.
Node's zlib API isn't very well structured or documented. It would be great if Bun included a better API, keeping the Node version just for compatibility.
The helper function definition here shows the options, but nested inside is a params object whose keys are constants from zlib.constants. And there are a lot of constants defined there.
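For reference, a minimal sketch of what that API shape looks like in Node today (this is the standard Node zlib API, shown only to illustrate the nesting being criticized):

const zlib = require('node:zlib');

// Brotli options nest a `params` object whose keys are zlib.constants values
const compressor = zlib.createBrotliCompress({
  params: {
    [zlib.constants.BROTLI_PARAM_QUALITY]: 5,
    [zlib.constants.BROTLI_PARAM_SIZE_HINT]: 1024,
  },
});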
Any progress for Brotli support?
This is the only thing that prevents us from using Bun :/
We're in the same boat, this is the last thing lacking on the compatibility list.
@bmansfie, do you work for Bun? If so, this would be good news for me. I can't even try to make the binding myself, because even the @axios team hasn't said anything that would help me solve this issue. I'm disappointed by how both teams have handled it over the past two years.
If you're not a Bun staff member, could someone tell me what requirements need to be met? Or, if a library has already been chosen to bind to Bun, it would be great to write that up in an issue; otherwise I'll have to deal with it manually.
This line determines whether the current environment supports brotli:
const isBrotliSupported = utils.isFunction(zlib.createBrotliDecompress);
The current (imperfect) workaround is to manually edit this line in node_modules/axios/dist/node/axios.cjs to tell axios not to use brotli.
Not a clean solution, but it works for me.
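In other words, a sketch of the hand-edit (the exact line and its location can shift between axios versions):

// node_modules/axios/dist/node/axios.cjs
// before:
const isBrotliSupported = utils.isFunction(zlib.createBrotliDecompress);
// after, hard-coding brotli off:
const isBrotliSupported = false;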
Hey @colinhacks @paperdave @Jarred-Sumner @antongolub, I read that Brotli is implemented but unoptimized. What do you know about this? I need to finish this task.
Is this optimization issue related to https://github.com/oven-sh/bun/issues/6299?
You can disable axios's default decompression and set the responseType option to arraybuffer. Also request only gzip compression from the server (header 'Accept-Encoding': 'gzip'). Then decompress the buffer with zlib.gunzip() and parse it as JSON, text, or HTML. A code sketch follows.
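A minimal sketch of that approach (the URL is a placeholder, and it assumes the server honors Accept-Encoding and responds with gzip):

import axios from 'axios';
import zlib from 'node:zlib';

const res = await axios.get('https://example.com/data.json', {
  decompress: false,                      // stop axios from decompressing (and from needing brotli)
  responseType: 'arraybuffer',            // hand back the raw compressed bytes
  headers: { 'Accept-Encoding': 'gzip' }, // only advertise gzip to the server
});

// gunzip works in Bun's zlib, unlike createBrotliDecompress
const body = zlib.gunzipSync(Buffer.from(res.data));
const json = JSON.parse(body.toString('utf8'));
console.log(json);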
I would rather see support for any of the other compression algorithms listed on caniuse.com.
import pickle
import requests
import timeit
import sys
import zlib
import zstandard as zstd
import brotli

url = 'https://your.url/here'

# Fetch the data to compress, and pickle it so every codec sees the same bytes
response = requests.get(url)
data = response.json()
serialized_data = pickle.dumps(data)

# Benchmarking helper: average, over 3 repeats, of the total time taken to
# call `function` `iterations` times
def benchmark(function, iterations=10):
    r = timeit.repeat(function, repeat=3, number=iterations)
    return sum(r) / len(r)

# Note: sys.getsizeof includes Python object overhead on top of the payload
# (len() would be exact), but the relative comparison is unaffected.
# Note: zlib.compress emits a zlib-wrapped DEFLATE stream, not a true gzip
# container (that would be gzip.compress or zlib.compressobj(wbits=31)), so
# the "Gzip" and "Deflate" sections below run identical code.

# Benchmark compression/decompression for Gzip across levels
gzip_results = {}
for level in range(1, 10):
    compressed_data = zlib.compress(serialized_data, level)
    compression_ratio = (sys.getsizeof(compressed_data) / sys.getsizeof(serialized_data)) * 100
    compressed_size = sys.getsizeof(compressed_data)
    gzip_results[level] = {
        'compression_time': benchmark(lambda: zlib.compress(serialized_data, level), iterations=5),
        'decompression_time': benchmark(lambda: zlib.decompress(compressed_data), iterations=5),
        'compression_ratio': compression_ratio,
        'compressed_size': compressed_size,
    }

# Benchmark compression/decompression for Deflate across levels
deflate_results = {}
for level in range(1, 10):
    compressed_data = zlib.compress(serialized_data, level)
    compression_ratio = (sys.getsizeof(compressed_data) / sys.getsizeof(serialized_data)) * 100
    compressed_size = sys.getsizeof(compressed_data)
    deflate_results[level] = {
        'compression_time': benchmark(lambda: zlib.compress(serialized_data, level), iterations=5),
        'decompression_time': benchmark(lambda: zlib.decompress(compressed_data), iterations=5),
        'compression_ratio': compression_ratio,
        'compressed_size': compressed_size,
    }

# Benchmark compression/decompression for Zstandard across levels
zstd_results = {}
for level in range(1, 23):
    cctx = zstd.ZstdCompressor(level=level)
    compressed_data = cctx.compress(serialized_data)
    compression_ratio = (sys.getsizeof(compressed_data) / sys.getsizeof(serialized_data)) * 100
    compressed_size = sys.getsizeof(compressed_data)
    zstd_results[level] = {
        'compression_time': benchmark(lambda: cctx.compress(serialized_data), iterations=5),
        'decompression_time': benchmark(lambda: zstd.ZstdDecompressor().decompress(compressed_data),
                                        iterations=5),
        'compression_ratio': compression_ratio,
        'compressed_size': compressed_size,
    }

# Benchmark compression/decompression for Brotli across quality levels
brotli_results = {}
for quality in range(1, 12):
    compressed_data = brotli.compress(serialized_data, quality=quality)
    compression_ratio = (sys.getsizeof(compressed_data) / sys.getsizeof(serialized_data)) * 100
    compressed_size = sys.getsizeof(compressed_data)
    brotli_results[quality] = {
        'compression_time': benchmark(lambda: brotli.compress(serialized_data, quality=quality),
                                      iterations=5),
        'decompression_time': benchmark(lambda: brotli.decompress(compressed_data), iterations=5),
        'compression_ratio': compression_ratio,
        'compressed_size': compressed_size,
    }

# Print benchmark results for Gzip
print("Gzip Serialization (compression/decompression) results:")
for level, result in gzip_results.items():
    print(f"Gzip (Level {level}):")
    print(f"  Compression Time: {result['compression_time']:.6f} seconds")
    print(f"  Decompression Time: {result['decompression_time']:.6f} seconds")
    print(f"  Compression Ratio: {result['compression_ratio']:.2f}%")
    print(f"  Compressed Size: {result['compressed_size']} bytes\n")

# Print benchmark results for Deflate
print("\nDeflate Serialization (compression/decompression) results:")
for level, result in deflate_results.items():
    print(f"Deflate (Level {level}):")
    print(f"  Compression Time: {result['compression_time']:.6f} seconds")
    print(f"  Decompression Time: {result['decompression_time']:.6f} seconds")
    print(f"  Compression Ratio: {result['compression_ratio']:.2f}%")
    print(f"  Compressed Size: {result['compressed_size']} bytes\n")

# Print benchmark results for Zstandard
print("\nZstandard Serialization (compression/decompression) results:")
for level, result in zstd_results.items():
    print(f"Zstandard (Level {level}):")
    print(f"  Compression Time: {result['compression_time']:.6f} seconds")
    print(f"  Decompression Time: {result['decompression_time']:.6f} seconds")
    print(f"  Compression Ratio: {result['compression_ratio']:.2f}%")
    print(f"  Compressed Size: {result['compressed_size']} bytes\n")

# Print benchmark results for Brotli
print("\nBrotli Serialization (compression/decompression) results:")
for quality, result in brotli_results.items():
    print(f"Brotli (Quality {quality}):")
    print(f"  Compression Time: {result['compression_time']:.6f} seconds")
    print(f"  Decompression Time: {result['decompression_time']:.6f} seconds")
    print(f"  Compression Ratio: {result['compression_ratio']:.2f}%")
    print(f"  Compressed Size: {result['compressed_size']} bytes\n")
Gzip Serialization (compression/decompression) results:

Level   Compression time (s)   Decompression time (s)   Ratio     Compressed size (bytes)
1       0.131746               0.046795                 36.89%    1167235
2       0.148521               0.044622                 35.26%    1115833
3       0.192732               0.046630                 33.97%    1074807
4       0.213282               0.047072                 32.40%    1025204
5       0.348723               0.046816                 31.36%    992237
6       0.493173               0.045413                 30.97%    980071
7       0.541998               0.043903                 30.92%    978442
8       0.609642               0.044365                 30.90%    977862
9       0.589004               0.042475                 30.90%    977862

Deflate Serialization (compression/decompression) results:

Level   Compression time (s)   Decompression time (s)   Ratio     Compressed size (bytes)
1       0.128636               0.046544                 36.89%    1167235
2       0.148510               0.043960                 35.26%    1115833
3       0.191041               0.043908                 33.97%    1074807
4       0.197751               0.044402                 32.40%    1025204
5       0.309715               0.043225                 31.36%    992237
6       0.478999               0.041243                 30.97%    980071
7       0.523563               0.041796                 30.92%    978442
8       0.591571               0.044189                 30.90%    977862
9       0.591415               0.043805                 30.90%    977862

Zstandard Serialization (compression/decompression) results:

Level   Compression time (s)   Decompression time (s)   Ratio     Compressed size (bytes)
1       0.014630               0.004104                 13.06%    413334
2       0.011026               0.002880                 7.55%     238832
3       0.010607               0.002330                 5.35%     169132
4       0.010927               0.002252                 5.30%     167705
5       0.022500               0.002075                 4.75%     150148
6       0.030029               0.002032                 4.44%     140424
7       0.035606               0.001970                 4.38%     138676
8       0.042404               0.001982                 4.30%     135926
9       0.041453               0.001819                 3.97%     125507
10      0.055232               0.001803                 3.91%     123756
11      0.067030               0.001763                 3.88%     122782
12      0.074708               0.001779                 3.88%     122773
13      0.320558               0.001800                 3.86%     122225
14      0.514192               0.001885                 3.82%     120726
15      0.776974               0.001669                 3.78%     119665
16      0.745292               0.001817                 3.72%     117706
17      0.882279               0.001811                 3.68%     116327
18      0.785868               0.001999                 3.65%     115567
19      1.235092               0.001685                 3.62%     114656
20      1.140070               0.001782                 3.62%     114656
21      1.689877               0.001728                 3.62%     114409
22      2.818275               0.001952                 3.61%     114149

Brotli Serialization (compression/decompression) results:

Quality   Compression time (s)   Decompression time (s)   Ratio     Compressed size (bytes)
1         0.031643               0.018773                 17.61%    557157
2         0.025418               0.007376                 5.78%     183002
3         0.028109               0.006453                 5.55%     175494
4         0.040346               0.005264                 4.38%     138659
5         0.071548               0.006076                 4.03%     127586
6         0.080487               0.005724                 3.89%     123166
7         0.090228               0.005666                 3.82%     120802
8         0.100304               0.005359                 3.77%     119300
9         0.158134               0.005358                 3.74%     118328
10        1.688359               0.005695                 3.51%     111109
11        5.849162               0.006769                 3.74%     118307
I'm surprised that Zstandard at level 22 comes out ahead of Brotli at quality 11. In other benchmarks I ran, I should in theory get a compression ratio around 3.xx%, but for some reason I can't replicate those results. Bun already supports gzip; with the steps above you could implement another algorithm, or leave this responsibility to Bun and Axios.
How is bun announcing v1.x.x given that its stated purpose is to be a nodejs drop-in replacement and there are still hundreds of breaking issues like this one? It feels misleading at best.
Actually, the only problem here is that they haven't documented what goals must be met before Brotli is re-enabled.
For anyone here having issues with axios, here is a quick workaround to make Axios use gzip instead of Brotli:
axios.defaults.headers.common["Accept-Encoding"] = "gzip";
Then all your axios instances will use gzip.
Not ideal, but that's the only way I found to make it work with Bun.
You are the best!!! Been struggling for a while.
So for this problem that has existed for almost 2 years, there is no solution?
You should expect this issue to be fixed in the next 2 months. We are really focused on Windows right now. I have a branch that mostly implements this already, but we cannot prioritize finishing it until Windows + more bugs are fixed.
got it. thanks
I'm using a library that uses axios internally. Is there a way to apply the Accept-Encoding workaround above so that all instances of axios use this setting?
that's exactly what it does
Shouldn't it be set by default then?
That's for the axios project to consider, I guess.
That's what I thought. Unfortunately I'm still getting the error.
Please fix this bun...
I think the two months are more than over now! :) Any progress?
@ZeldOcarina I think you must have missed the date on the comment:
@bgmort he said it would be fixed in two months, and the two months are over now.
I have a small project using bun that works absolutely fine under macOS. I wanted to build a binary for amd64, so I downloaded bun on my WSL machine, tried to build it, and ran into this issue. How is this possible?
We still haven’t shipped Bun for Windows, but that is happening on April 1. I’m sorry this has taken us so long, and I expect we will fix this shortly after Bun v1.1. The team is very focused on Windows support and it’s hard for us to work on much else until that ships.
Thanks for the update. I wasn't whining about the issue itself; I was more baffled by the fact that it worked on my other machine.
Turns out the machine where I initially developed my little project, where everything seemed to work fine, is running v1.0.11. I was able to reproduce that by downgrading my other machine.