
Question: what would be a use-case for this?

Open pkyeck opened this issue 4 years ago • 2 comments

First of all: thank you for your lib!
This is not meant as a rude question; I just want to better understand when it would make sense to use your lib.

When I have a pretty big JS object on the client and I want to send it to the server, is zipson useful there, or is gzip already doing the same thing so I don't really have to worry about it? Does zipson+gzip still save more than gzip alone?

Or does it "only" make sense when you are streaming big chunks of data to/from client <> server?

pkyeck avatar Apr 30 '20 16:04 pkyeck

As an example, in my case I want to save a JSON file to avoid generating it in the browser every time, so instead of taking the raw data and transforming it on every load, I now take the pre-made JSON object and start from there

That object is ~50 MB when stored in a file; my use case here is to use zipson to make it a <3 MB file

I'm reading it through a local fetch

*And I'm figuring out how to import it within React (it doesn't seem to be possible)
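
Roughly what that looks like, as a sketch (the path is made up; the file would have been written out with zipson's stringify() beforehand):

    import { parse } from 'zipson';

    // Fetch the pre-compressed file as plain text, then turn it back into
    // the original object with zipson. '/data.zipson' is a made-up path
    // for illustration.
    async function loadPrecompressedData() {
      const response = await fetch('/data.zipson');
      const compressedText = await response.text();
      return parse(compressedText); // the original ~50 MB object
    }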

kuworking avatar Feb 11 '21 10:02 kuworking

Usually, compression on the Web is best done with the platform features: content-encoding, CompressionStream, etc. And I wouldn’t use Zipson for compressing large <form> uploads, like blog posts or <input type="file" accept=".csv"> (seriously, how is <form accept-encoding> not a platform feature???)
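
(For reference, the platform route looks roughly like this; just a sketch of CompressionStream, not production code:)

    // Roughly how the built-in CompressionStream gzips a string.
    // Everything here is stream- and promise-based, which is exactly
    // what bites you with sendBeacon() below.
    async function gzipString(text) {
      const gzipped = new Blob([text])
        .stream()
        .pipeThrough(new CompressionStream('gzip'));
      return new Response(gzipped).arrayBuffer(); // Promise<ArrayBuffer>
    }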

But for my constraints, Zipson ended up perfect. I found it works great to compress uploaded RUM data (performance monitoring/analytics/error tracking/etc.) via navigator.sendBeacon():

  • navigator.sendBeacon() and its modern replacement fetch('…', { keepalive: true }) don’t accept streams, promises, or anything else that would require spinning the event loop again. (This is because sendBeacon is designed to fire at the last possible moment before the page unloads, so that JS execution context won’t exist once the current event loop turn is over.) The worst part? That means the new built-in CompressionStream API won’t work at all.

    Zipson can compress synchronously, so that’s no problem (see the sketch after this list).

  • Better yet, Zipson can compress incrementally, which is great for analytics: over the page lifetime, tiny bits of data can be compressed as they happen, instead of locking up the main thread right before the beacon. That way I can fit a great deal of highly-repetitive data into only a few network requests, respecting the user’s battery and data plan.

  • Zipson’s algorithm isn’t quite as “just do it, it always works” as general-purpose compression, but for repetitive strings and object shapes it’s a heck of a lot faster and simpler. RUM data has exactly those characteristics: the same Error happening multiple times, scroll events, PerformanceEntry objects, etc.

  • That simpler compression algorithm also means that import { stringifyTo, ZipsonStringWriter } from 'zipson' doesn’t hurt bundle size as much as other JS compression libraries do. I’m also going to experiment with having my bundler predefine configuration constants like fullPrecisionFloats to see if I can eliminate the branches I’m not using.
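
Here’s the sketch mentioned above: the simplest synchronous version, with a made-up /beacon endpoint and event shape. (The incremental stringifyTo/ZipsonStringWriter route spreads the same work out over the page lifetime instead of doing it all in pagehide.)

    import { stringify } from 'zipson';

    const events = []; // RUM events collected over the page lifetime

    addEventListener('error', (e) => {
      events.push({ type: 'error', message: e.message, ts: Date.now() });
    });

    // Compress synchronously right before the page goes away: no streams
    // or promises needed, so this still works inside pagehide.
    addEventListener('pagehide', () => {
      const payload = stringify(events, { fullPrecisionFloats: false });
      navigator.sendBeacon('/beacon', payload); // hypothetical endpoint
    });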

tigt avatar May 01 '21 15:05 tigt