js-binary
Addition to benchmarks
Hey, first of all, good work on this module here!
I was reviewing your module, as I was interested in using it to speed up communication within microservice networks. I also compared it to major solutions like msgpack. While looking at your benchmarks I quickly noticed that they don't take any strings into account yet. I slightly modified the benchmark to also process some strings, and these are the results:
Encode
- JSON: 1.6408ms, 249KiB
- js-binary: 1.9183ms (+17%), 92KiB (2.7x less)

Decode
- JSON: 2.2956ms
- js-binary: 0.6941ms (-70%)
Encoding got 17% slower than JSON.stringify, but decoding is still fast. Still pretty good overall.
Just wanted to share that; I will do one-to-one comparisons next.
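For context, the harness I used looks roughly like the sketch below. The payload shape, field names, and iteration count are my own assumptions and not part of js-binary's actual benchmark script; only the JSON baseline is shown, since the js-binary/msgpack variants just swap the encode/decode calls.

```javascript
// Sketch of the timing harness (assumed setup, not js-binary's own script).
// Builds a payload that includes string fields, then times JSON round-trips.
function makePayload(count) {
  const items = [];
  for (let i = 0; i < count; i++) {
    items.push({
      id: i,
      name: 'user-' + i,                        // string fields added on top
      bio: 'lorem ipsum dolor sit amet ' + i,   // of the numeric benchmark
      score: Math.random(),
    });
  }
  return { items };
}

// Average wall-clock time per call, in milliseconds.
function time(label, fn, iterations = 100) {
  const start = process.hrtime.bigint();
  for (let i = 0; i < iterations; i++) fn();
  const ns = process.hrtime.bigint() - start;
  const ms = Number(ns) / 1e6 / iterations;
  console.log(`${label}: ${ms.toFixed(4)}ms`);
  return ms;
}

const data = makePayload(1000);
const json = JSON.stringify(data);
console.log('JSON size:', Buffer.byteLength(json), 'bytes');

time('JSON encode', () => JSON.stringify(data));
time('JSON decode', () => JSON.parse(json));
```

The other libraries plug into the same `time()` wrapper, so all columns share one measurement method.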
Here are the timings:
Benchmark without strings:

Encode
- JSON: 1.482ms, 190KiB
- js-binary: 1.145ms (-23%), 51KiB (3.7x less)
- msgpack-lite: 5.344ms (+261%), 126KiB (1.5x less)
- msgpack: 8.449ms (+470%), 126KiB (1.5x less)

Decode
- JSON: 2.167ms
- js-binary: 0.408ms (-81%)
- msgpack-lite: 9.642ms (+345%)
- msgpack: 4.804ms (+122%)
Benchmark with strings:

Encode
- JSON: 1.6ms, 249KiB
- js-binary: 1.846ms (+15%), 92KiB (2.7x less)
- msgpack-lite: 5.855ms (+266%), 178KiB (1.4x less)
- msgpack: 8.567ms (+435%), 178KiB (1.4x less)

Decode
- JSON: 2.402ms
- js-binary: 0.79ms (-67%)
- msgpack-lite: 12.292ms (+412%)
- msgpack: 5.554ms (+131%)
Run on an Intel(R) Core(TM) i7-7820HK CPU @ 2.90GHz base clock, running at ~3.4GHz during the benchmark.
So msgpack, in both of the major libraries, is significantly slower. Your module is a bit slower than JSON.stringify on the encode side.
I also made another test with a bit more real-world data and smaller payloads, which results in:
Encode
- JSON: 0.004822ms, 250B
- js-binary: 0.002744ms (-43%), 76B (3.3x less)
- msgpack-lite: 0.013257ms (+175%), 163B (1.5x less)
- msgpack: 0.018754ms (+289%), 163B (1.5x less)

Decode
- JSON: 0.002693ms
- js-binary: 0.001217ms (-55%)
- msgpack-lite: 0.00991ms (+268%)
- msgpack: 0.005288ms (+96%)
It seems like msgpack is a bad choice in all cases, unless wire size is the only thing that matters. Your module still delivers good performance here, on the encoding side as well this time. Surprisingly enough, this payload mainly consists of strings, and now it performs better again.
All in all, it seems like this module could be a good choice to improve performance, on the wire as well as on the encoding/decoding side.
Facts on the median:
- Around 2x data reduction on the wire
- Around 34% overall performance improvement
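For anyone checking the percentages: they appear to be relative deltas against the JSON baseline (my reading of the tables, matching the reported figures within rounding):

```javascript
// Relative delta of a timing vs. the JSON baseline, as a rounded percent.
// Assumed formula, reconstructed from the tables above.
function delta(baselineMs, valueMs) {
  return Math.round((valueMs / baselineMs - 1) * 100);
}

console.log(delta(1.6408, 1.9183)); // js-binary encode vs JSON → 17
console.log(delta(2.2956, 0.6941)); // js-binary decode vs JSON → -70
```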
For me, dealing with 177 kB of JSON + Buffers, I saw that it was slower than plain JSON + blob for both encode and decode. It did reduce the payload size a bit, though.
@jdalton Seems to depend; I've had no problems yet in my simulations. Do you have an example of how your payloads were structured?
Arrays of objects and a buffer property.
Ok, I had those in my tests too, with documents as big as 4MiB, so it could be a construction issue or something. How deep do your objects go? How big are your buffers? It would be cool to know this to see where the limitations are, since msgpack seems to be a really bad option compared to V8's JSON encoding/decoding capabilities.
Just one level deep.
Do you have an example object? I would then write a generator function for it and run some tests against it.
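Something along these lines is what I have in mind for the generator. All field names and sizes are guesses based on "arrays of objects and a buffer property, one level deep", since the actual payload shape isn't known:

```javascript
// Hypothetical generator for the described shape: an array of
// one-level-deep objects, each with a Buffer property.
function generateDocs(count, bufferSize) {
  const docs = [];
  for (let i = 0; i < count; i++) {
    docs.push({
      id: i,
      label: 'doc-' + i,
      data: Buffer.alloc(bufferSize, i & 0xff), // the buffer property
    });
  }
  return docs;
}

// Sized to land roughly near the 177 kB case mentioned above.
const docs = generateDocs(100, 1600);
```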
And thanks for sharing your experience here, @jdalton :)