undici
Add diagnostic on consume finish
On my way to closing https://github.com/nodejs/undici/issues/1173.
I didn't know how to pass the request context to the Readable object; @mcollina obviously suggested passing it in the constructor. :)
The "onConnect" context is used to carry a reference to the socket.
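Roughly, the idea looks like this (a minimal sketch, not the actual diff: the channel name, the `kContext` symbol, and the `BodyReadable` shape are placeholders I'm using for illustration):

```js
'use strict'

const { Readable } = require('node:stream')
const diagnosticsChannel = require('node:diagnostics_channel')

// Placeholder channel name, made up for this sketch.
const consumeFinished = diagnosticsChannel.channel('undici:request:bodyConsumed')
const kContext = Symbol('undici.context')

class BodyReadable extends Readable {
  constructor ({ context, ...opts } = {}) {
    super(opts)
    // The request context (including the socket reference captured in
    // "onConnect") is handed over at construction time.
    this[kContext] = context
    // 'end' fires once the consumer has read the entire body.
    this.on('end', () => {
      if (consumeFinished.hasSubscribers) {
        consumeFinished.publish(this[kContext])
      }
    })
  }

  _read () {
    // The real implementation pulls data off the socket; in this sketch
    // data is pushed in externally via readable.push().
  }
}

module.exports = { BodyReadable }
```

The `hasSubscribers` check keeps the happy path free: nothing is published unless someone has actually called `diagnosticsChannel.channel(...).subscribe(...)`.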
Benchmarks on `main`:
| Tests | Samples | Result | Tolerance | Difference with slowest |
|---|---|---|---|---|
| http - no keepalive | 10 | 7.76 req/sec | ± 0.74 % | - |
| http - keepalive | 10 | 7.90 req/sec | ± 0.38 % | + 1.79 % |
| undici - request | 10 | 51.38 req/sec | ± 1.45 % | + 561.73 % |
| undici - pipeline | 10 | 51.70 req/sec | ± 1.20 % | + 565.88 % |
| undici - stream | 10 | 52.96 req/sec | ± 1.81 % | + 582.15 % |
| undici - dispatch | 10 | 57.64 req/sec | ± 1.60 % | + 642.46 % |

| Tests | Samples | Result | Tolerance | Difference with slowest |
|---|---|---|---|---|
| http - no keepalive | 10 | 4134.68 req/sec | ± 2.23 % | - |
| http - keepalive | 10 | 5133.15 req/sec | ± 1.35 % | + 24.15 % |
| undici - request | 10 | 6337.97 req/sec | ± 2.84 % | + 53.29 % |
| undici - pipeline | 15 | 6388.91 req/sec | ± 2.45 % | + 54.52 % |
| undici - stream | 10 | 6862.55 req/sec | ± 2.03 % | + 65.98 % |
| undici - dispatch | 15 | 7207.77 req/sec | ± 2.17 % | + 74.32 % |
Benchmarks on this branch:
| Tests | Samples | Result | Tolerance | Difference with slowest |
|---|---|---|---|---|
| http - no keepalive | 10 | 7.71 req/sec | ± 0.80 % | - |
| http - keepalive | 10 | 7.84 req/sec | ± 0.52 % | + 1.71 % |
| undici - request | 10 | 52.34 req/sec | ± 1.87 % | + 578.72 % |
| undici - pipeline | 10 | 53.23 req/sec | ± 1.14 % | + 590.24 % |
| undici - stream | 10 | 53.98 req/sec | ± 1.92 % | + 600.06 % |
| undici - dispatch | 10 | 54.13 req/sec | ± 1.21 % | + 601.97 % |

| Tests | Samples | Result | Tolerance | Difference with slowest |
|---|---|---|---|---|
| http - no keepalive | 10 | 4318.55 req/sec | ± 2.06 % | - |
| http - keepalive | 15 | 5360.76 req/sec | ± 2.16 % | + 24.13 % |
| undici - pipeline | 15 | 6352.82 req/sec | ± 2.14 % | + 47.11 % |
| undici - request | 15 | 6540.60 req/sec | ± 2.59 % | + 51.45 % |
| undici - stream | 10 | 6858.88 req/sec | ± 2.61 % | + 58.82 % |
| undici - dispatch | 15 | 7286.48 req/sec | ± 2.39 % | + 68.73 % |
Codecov Report
Merging #1184 (dd0b500) into main (5d52423) will decrease coverage by 0.07%. The diff coverage is 83.33%.
```diff
@@            Coverage Diff             @@
##             main    #1184      +/-   ##
==========================================
- Coverage   93.73%   93.65%   -0.08%
==========================================
  Files          41       41
  Lines        4020     4037      +17
==========================================
+ Hits         3768     3781      +13
- Misses        252      256       +4
```
| Impacted Files | Coverage Δ | |
|---|---|---|
| lib/client.js | 97.91% <ø> (ø) | |
| lib/api/readable.js | 87.21% <80.95%> (-1.58%) | :arrow_down: |
| lib/api/api-request.js | 100.00% <100.00%> (ø) | |
| lib/core/request.js | 98.52% <100.00%> (ø) | |
| lib/handler/redirect.js | 94.93% <100.00%> (ø) | |
Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Can you add a couple of tests?
Sure, I just want to make sure you are OK with the current implementation.
I feel like there may be a better way to implement this: there is an "onMessageComplete" callback coming from the HTTP parser (llhttp), but neither llhttp nor its ancestor "http_parser" has clear documentation on when that callback is invoked.
Maybe emitting the event from the "resume" method, where the socket is detached from the request, is the right place for it.
There is a small cost to having functions called from llhttp. However, I think it might be worth a try!
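As a starting point for the tests, something like this might work (again a sketch: it reuses the placeholder channel name from above, not necessarily what the final implementation publishes):

```js
'use strict'

const { test } = require('tap')
const { createServer } = require('node:http')
const diagnosticsChannel = require('node:diagnostics_channel')
const { request } = require('undici')

test('publishes a diagnostics event once the body is fully consumed', (t) => {
  t.plan(1)

  // Placeholder channel name; adjust to whatever the PR actually publishes.
  diagnosticsChannel.channel('undici:request:bodyConsumed').subscribe((ctx) => {
    t.ok(ctx, 'received the request context on consume finish')
  })

  const server = createServer((req, res) => {
    res.end('hello world')
  })
  t.teardown(server.close.bind(server))

  server.listen(0, async () => {
    const { body } = await request(`http://localhost:${server.address().port}`)
    // Fully consume the response body; the channel should fire afterwards.
    await body.text()
  })
})
```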