js-libp2p-gossipsub
feat: switch back to protons
Motivation
- We should switch back to protons when its performance is comparable to or better than protobufjs
- Prepare a branch to track the performance of different versions of protons
Description
As of protons 5.1.0, performance is comparable to protobufjs; however, it is not stable enough (see the tables and the benchmark sketch below).
Protobufjs
| # | decode (ns/op) | encode (ns/op) |
|---|---|---|
| 0 | 8.14 | 10.92 |
| 1 | 8.65 | 11.77 |
| 2 | 8.58 | 12.38 |
| 3 | 8.29 | 13.73 |
| 4 | 8.94 | 12.74 |
Protons 5.1.0
| # | decode (ns/op) | encode (ns/op) |
|---|---|---|
| 0 | 7.45 | 10.92 |
| 1 | 7.48 | 12.81 |
| 2 | 12.37 | 17.94 |
| 3 | 19.92 | 38.98 |
| 4 | 9.56 | 11.24 |
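For reference, here is a minimal sketch of how this kind of decode/encode comparison can be reproduced. It is not the harness used for the numbers above; the `Codec` shape and the fixed run count are assumptions, and both generated codecs are wrapped behind the same interface so one loop can time either of them.

```ts
import { performance } from 'node:perf_hooks'

// Assumed shape: both protons- and protobufjs-generated code can be wrapped
// as a plain encode/decode pair over the same RPC message fixture.
interface Codec<T> {
  encode: (msg: T) => Uint8Array
  decode: (bytes: Uint8Array) => T
}

function benchEncode<T> (name: string, codec: Codec<T>, msg: T, runs = 1_000_000): void {
  const start = performance.now()
  for (let i = 0; i < runs; i++) {
    codec.encode(msg)
  }
  const nsPerOp = ((performance.now() - start) * 1e6) / runs
  console.log(`${name} encode: ${nsPerOp.toFixed(2)} ns/op`)
}

function benchDecode<T> (name: string, codec: Codec<T>, msg: T, runs = 1_000_000): void {
  const bytes = codec.encode(msg)
  const start = performance.now()
  for (let i = 0; i < runs; i++) {
    codec.decode(bytes)
  }
  const nsPerOp = ((performance.now() - start) * 1e6) / runs
  console.log(`${name} decode: ${nsPerOp.toFixed(2)} ns/op`)
}
```

Running both codecs through the same loops against an identical gossipsub RPC fixture keeps the comparison apples-to-apples.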
Codecov Report
Patch coverage: 74.78% and project coverage change: -2.31% :warning:
Comparison is base (23bf0ee) 82.30% compared to head (ba73f6a) 80.00%. Report is 2 commits behind head on master.
Additional details and impacted files
| Coverage Diff | master | #327 (head) | +/- |
|---|---|---|---|
| Coverage | 82.30% | 80.00% | -2.31% |
| Files | 48 | 48 | |
| Lines | 12039 | 11105 | -934 |
| Branches | 1286 | 1152 | -134 |
| Hits | 9909 | 8884 | -1025 |
| Misses | 2130 | 2221 | +91 |
| Files Changed | Coverage Δ | |
|---|---|---|
| src/metrics.ts | 16.43% <3.98%> (-2.57%) | :arrow_down: |
| src/index.ts | 70.11% <63.73%> (+0.14%) | :arrow_up: |
| src/message/rpc.ts | 88.05% <88.05%> (ø) | |
| src/message/decodeRpc.ts | 93.71% <94.24%> (+0.99%) | :arrow_up: |
| src/message-cache.ts | 78.06% <100.00%> (ø) | |
| src/types.ts | 94.57% <100.00%> (ø) | |
| src/utils/buildRawMessage.ts | 92.21% <100.00%> (ø) | |
| src/utils/create-gossip-rpc.ts | 100.00% <100.00%> (ø) | |
| test/decodeRpc.spec.ts | 100.00% <100.00%> (ø) | |
| test/message-cache.spec.ts | 100.00% <100.00%> (ø) | |
| ... and 2 more | | |
Not sure why the performance is 2x slower when using protons-runtime 5.1.0:
protobuf
✔ decode Attestation message using protons 5.1.0 1645793 ops/s 607.6100 ns/op - 983127 runs 60.1 s
✔ encode Attestation message using protons 5.1.0 1630497 ops/s 613.3100 ns/op - 972079 runs 60.0 s
Should be good to go with protons-runtime 3.1.0.
This is not ready; there is a memory leak issue as noted in https://github.com/ChainSafe/js-libp2p-gossipsub/issues/318#issuecomment-1666407160
RSS is 1GB-1.5GB higher than the protobufjs version. This is from a Lodestar test mainnet node.
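For context, the RSS figures can be sampled straight from the node process. A minimal standalone sketch (the one-minute interval is arbitrary, and Lodestar exposes the same data through its metrics rather than this kind of logging):

```ts
// Periodically log resident set size so a protons run and a protobufjs run
// of the same node can be compared over time.
setInterval(() => {
  const rssGb = process.memoryUsage().rss / 1024 ** 3
  console.log(`rss: ${rssGb.toFixed(2)} GB`)
}, 60_000)
```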
    message ControlPrune {
      optional string topicID = 1;
      repeated PeerInfo peers = 2;
      optional uint64 backoff = 3;
    }
while with protobufjs it's mostly <=6.2GB. I think the main change is the backoff field being interpreted as a bigint, while protobufjs used a number instead (interface below; a conversion sketch follows it):
    interface IControlPrune {
      /** ControlPrune topicID */
      topicID?: (string|null);
      /** ControlPrune peers */
      peers?: (RPC.IPeerInfo[]|null);
      /** ControlPrune backoff */
      backoff?: (number|null);
    }
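To make the difference concrete: protons maps uint64 fields to bigint, so every decoded ControlPrune carries a BigInt for backoff, while the protobufjs interface above uses number. A sketch of normalizing at the decode boundary (the types here are illustrative, not the generated code):

```ts
// Illustrative shape only: protons generates backoff as bigint for uint64.
interface ControlPruneFromProtons {
  topicID?: string
  peers?: unknown[]
  backoff?: bigint
}

// Convert once at the decode boundary so downstream code keeps using numbers.
// Safe for realistic backoff values, which are far below Number.MAX_SAFE_INTEGER.
function getBackoffSeconds (prune: ControlPruneFromProtons): number | undefined {
  return prune.backoff != null ? Number(prune.backoff) : undefined
}
```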
https://github.com/ipfs-shipyard/protons-gossipsub-benchmark
@tuyennhv might be worth testing again with the latest protons
We'll need to go with the latest protons 7 anyway in order to use the backoff field as a number; track this work in #468 instead.