protobuf-ts
Show current benchmarking results
The manual currently provides a comparison of code size vs. speed, but it only shows the resulting size of the generated code, so we don't know the runtime difference. There is already a benchmarks package which provides code to perform benchmarks, but it leaves something to be desired. Namely: the current results. :)
It would be helpful to show this information (especially the perf.ts results) somewhere. Ideally it could be included in the code size vs speed section of the manual.
P.S. A comparison against protobuf.js would be very nice to compare the performance against since that library tends to be the fastest out there at the moment.
P.P.S. I've already taken a crack at adding protobuf.js to the perf.ts benchmarks locally and it seems like protobuf.js can decode/encode the binary about twice as fast as [email protected]. Any ideas how that gap could be closed? Is protobuf.js taking shortcuts that aren't conformant to the proto spec? Are there any techniques that could be copied from protobuf.js?
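For anyone else who wants to try this locally before results land in the manual, a minimal ops/s harness in the spirit of `perf.ts` could look like the sketch below. The function names and the stand-in workload are illustrative assumptions, not the actual `perf.ts` API; in the real benchmarks the measured functions would be the testees' binary/JSON encode and decode calls.

```typescript
// Hypothetical sketch of an ops/s benchmark loop (not the real perf.ts).
function bench(name: string, fn: () => void, durationMs = 200): number {
  // Warm up so the JIT has optimized the function before measuring.
  for (let i = 0; i < 10; i++) fn();
  let ops = 0;
  const start = Date.now();
  while (Date.now() - start < durationMs) {
    fn();
    ops++;
  }
  const opsPerSec = ops / ((Date.now() - start) / 1000);
  console.log(`${name.padEnd(30)}: ${opsPerSec.toFixed(3)} ops/s`);
  return opsPerSec;
}

// Stand-in payload; the real benchmarks use a decoded FileDescriptorSet
// and the generated encode/decode functions of each testee.
const payload = {
  items: Array.from({ length: 1000 }, (_, i) => ({ id: i, name: `item-${i}` })),
};
const json = JSON.stringify(payload);

bench("write json string (stand-in)", () => JSON.stringify(payload));
bench("read json string (stand-in)", () => JSON.parse(json));
```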
Thank you again for this wonderful project!
Coincidentally, I just updated the code size benchmarks this morning, cleaning up the code as well.
It's easy to measure code size, not so much the performance. For example, google-protobuf deserializes into an intermediate state. So a simple roundtrip might show different results, compared to when all fields are set with the getter / setter methods.
That being said, having performance benchmarks for protobuf.js would be great.
Thanks for the PR! The makefile bug from this comment is fixed in commit 326299f0335a0157d5c278369cf4e467809a1d53, making sure the performance benchmarks run with the same payload every time. Not saying benchmarks should only run with the large payload; this was just fixing the obvious makefile bug.
All testees are in the same ballpark with the large payload size:
### read binary
google-protobuf : 11.26 ops/s
ts-proto : 25.34 ops/s
protobuf-ts (speed) : 27.98 ops/s
protobuf-ts (speed, bigint) : 25.9 ops/s
protobuf-ts (size) : 24.43 ops/s
protobuf-ts (size, bigint) : 23.38 ops/s
### write binary
google-protobuf : 16.15 ops/s
ts-proto : 13.11 ops/s
protobuf-ts (speed) : 12.12 ops/s
protobuf-ts (speed, bigint) : 11.8 ops/s
protobuf-ts (size) : 10.1 ops/s
protobuf-ts (size, bigint) : 9.76 ops/s
### from partial
ts-proto : 25.39 ops/s
protobuf-ts (speed) : 22.25 ops/s
protobuf-ts (size) : 21.27 ops/s
### read json
ts-proto : 41.19 ops/s
protobuf-ts (speed) : 16.16 ops/s
protobuf-ts (size) : 16.55 ops/s
### write json
ts-proto : 138.19 ops/s
protobuf-ts (speed) : 23.38 ops/s
protobuf-ts (size) : 23.19 ops/s
### read json string
ts-proto : 11.74 ops/s
protobuf-ts (speed) : 7.78 ops/s
protobuf-ts (size) : 7.77 ops/s
### write json string
ts-proto : 16.13 ops/s
protobuf-ts (speed) : 16.15 ops/s
protobuf-ts (size) : 16.82 ops/s
I think the benchmarks should run on several payload sizes. There are some factor 10 gaps with the smaller payload you were measuring that are worth closer investigation.
For reference:
Large payload: 1.2MiB - FileDescriptorSet for packages/test-fixtures/**/*.proto
Small payload: 49KiB - FileDescriptorSet just for google/protobuf/descriptor.proto
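For reproducing those payloads, the `FileDescriptorSet` blobs can be generated with `protoc`. The proto paths come from the comment above; the output file names below are illustrative assumptions, and `**` glob expansion depends on your shell (e.g. bash with `globstar`).

```shell
# Large payload (~1.2MiB): descriptor set for all test fixtures.
protoc --descriptor_set_out=large.bin --include_imports \
  packages/test-fixtures/**/*.proto

# Small payload (~49KiB): descriptor set just for descriptor.proto.
protoc --descriptor_set_out=small.bin \
  google/protobuf/descriptor.proto
```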
Great results!
One thing that caught my eye is the massive difference in writing JSON between ts-proto and protobuf-ts. Do you know the reason why protobuf-ts is way slower?
ts-proto uses protobuf.js for JSON, which doesn't fully implement the official JSON format, see protobufjs/protobuf.js#1304. I am sure they are skipping a few things.
But look closely at the numbers. They are put into perspective when you realize that they measure turning the internal representation into a JSON object. What you need in practice is a JSON string:
### read json string
ts-proto : 11.74 ops/s
protobuf-ts (size) : 7.77 ops/s
### write json string
ts-proto : 16.13 ops/s
protobuf-ts (size) : 16.82 ops/s
I think the manual deserves a performance comparison table at the end of the section Code size vs speed. It should just show numbers for binary I/O and JSON (string) I/O. It should show generator version number and parameters, preferably in one simple table. It should be mentioned how and where this is measured, and with what payload size. The table should be generated by a script, similar to the code size report.
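Such a script could parse the raw `perf.ts` output and emit markdown. A hypothetical sketch, assuming the output keeps the `### section` / `name : N ops/s` line format shown in this thread (the function name and table layout are my own invention):

```typescript
// Turn raw benchmark output into a markdown table for the manual.
// Rows are testees, columns are benchmark sections.
function toMarkdownTable(raw: string): string {
  const rows = new Map<string, Map<string, string>>(); // testee -> section -> ops/s
  const sections: string[] = [];
  let current = "";
  for (const line of raw.split("\n")) {
    const heading = line.match(/^### (.+)$/);
    if (heading) {
      current = heading[1];
      sections.push(current);
      continue;
    }
    const m = line.match(/^(.+?)\s*:\s*([\d.]+) ops\/s$/);
    if (m) {
      const [, testee, ops] = m;
      if (!rows.has(testee)) rows.set(testee, new Map());
      rows.get(testee)!.set(current, ops);
    }
  }
  const header = `| testee | ${sections.join(" | ")} |`;
  const sep = `|---${"|---".repeat(sections.length)}|`;
  const body = [...rows.entries()].map(
    ([testee, bySection]) =>
      `| ${testee} | ${sections.map((s) => bySection.get(s) ?? "-").join(" | ")} |`
  );
  return [header, sep, ...body].join("\n");
}

// Example with a slice of the numbers from this thread:
const raw = `### read binary
google-protobuf : 11.26 ops/s
protobuf-ts (speed) : 27.98 ops/s
### write binary
google-protobuf : 16.15 ops/s
protobuf-ts (speed) : 12.12 ops/s`;

console.log(toMarkdownTable(raw));
```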
These are the results including protobuf.js:
### read binary
google-protobuf : 9.938 ops/s
ts-proto : 23.604 ops/s
protobuf-ts (speed) : 23.742 ops/s
protobuf-ts (speed, bigint) : 23.066 ops/s
protobuf-ts (size) : 24.891 ops/s
protobuf-ts (size, bigint) : 23.829 ops/s
protobufjs : 28.464 ops/s
### write binary
google-protobuf : 15.118 ops/s
ts-proto : 13.626 ops/s
protobuf-ts (speed) : 12.078 ops/s
protobuf-ts (speed, bigint) : 12.036 ops/s
protobuf-ts (size) : 10.554 ops/s
protobuf-ts (size, bigint) : 10.672 ops/s
protobufjs : 12.305 ops/s
### from partial
ts-proto : 40.744 ops/s
protobuf-ts (speed) : 26.53 ops/s
protobuf-ts (size) : 27.213 ops/s
### read json string
ts-proto : 14.237 ops/s
protobuf-ts (speed) : 8.307 ops/s
protobuf-ts (size) : 8.469 ops/s
protobufjs : 15.367 ops/s
### write json string
ts-proto : 18.328 ops/s
protobuf-ts (speed) : 18.403 ops/s
protobuf-ts (size) : 18.34 ops/s
protobufjs : 23.837 ops/s
### read json object
ts-proto : 34.747 ops/s
protobuf-ts (speed) : 17.509 ops/s
protobuf-ts (size) : 17.0 ops/s
protobufjs : 46.793 ops/s
### write json object
ts-proto : 182.47 ops/s
protobuf-ts (speed) : 30.375 ops/s
protobuf-ts (size) : 30.049 ops/s
protobufjs : 47.009 ops/s
Looks like there has been a regression in v2.0.0-alpha.9. We stopped generating create for speed-optimized code. Thanks to @odashevskii-plaid, this is fixed. It bumps up the performance of the read and create methods a bit:
### read binary
google-protobuf : 11.525 ops/s
ts-proto : 26.28 ops/s
protobuf-ts (speed) : 31.584 ops/s
protobuf-ts (speed, bigint) : 33.79 ops/s
protobuf-ts (size) : 24.935 ops/s
protobuf-ts (size, bigint) : 25.073 ops/s
protobufjs : 32.129 ops/s
### write binary
google-protobuf : 16.832 ops/s
ts-proto : 14.168 ops/s
protobuf-ts (speed) : 12.636 ops/s
protobuf-ts (speed, bigint) : 12.769 ops/s
protobuf-ts (size) : 10.969 ops/s
protobuf-ts (size, bigint) : 11.045 ops/s
protobufjs : 12.902 ops/s
### from partial
ts-proto : 40.707 ops/s
protobuf-ts (speed) : 29.767 ops/s
protobuf-ts (size) : 27.98 ops/s
### read json string
ts-proto : 14.963 ops/s
protobuf-ts (speed) : 8.485 ops/s
protobuf-ts (size) : 8.272 ops/s
protobufjs : 15.59 ops/s
### write json string
ts-proto : 18.633 ops/s
protobuf-ts (speed) : 19.347 ops/s
protobuf-ts (size) : 18.997 ops/s
protobufjs : 27.291 ops/s
### read json object
ts-proto : 37.691 ops/s
protobuf-ts (speed) : 18.774 ops/s
protobuf-ts (size) : 16.267 ops/s
protobufjs : 44.944 ops/s
### write json object
ts-proto : 200.92 ops/s
protobuf-ts (speed) : 31.027 ops/s
protobuf-ts (size) : 32.502 ops/s
protobufjs : 45.461 ops/s
Was this discovery made in some off-github discussion? I'm curious more than anything, since I can't find any issue or PR mentioning this. Glad it was spotted though!
Completely unrelated, but what on Earth is going on with ts-proto's "write json object" benchmark? Being nearly an order of magnitude faster than the underlying library it uses (protobufjs) seems odd.
Was this discovery made in some off-github discussion?
See https://github.com/timostamm/protobuf-ts/pull/147#discussion_r699233192 and https://github.com/timostamm/protobuf-ts/pull/147#issuecomment-909186392
Completely unrelated, but what on Earth is going on with ts-proto's "write json object" benchmark?
It's impressive, right? I don't think ts-proto is sharing any code with protobufjs for JSON.