[DO NOT MERGE] Simpler LRU-ish cache implementation
While running the Yoga benchmarks, I realized that the existing LRU cache implementations block the event loop during set/clear operations.
By using queueMicrotask, we can defer those operations to the end of the current event loop turn, after the request has finished.
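A minimal sketch of the idea (the name `createDeferredLruCache` is hypothetical and this is not the implementation in this PR): a Map-backed cache where `get` stays synchronous while `set`/`clear` and the eviction bookkeeping are deferred with `queueMicrotask`.

```ts
// Hypothetical sketch, not the code in this PR. Reads stay synchronous;
// writes and evictions are deferred so they don't block the request's work.
export function createDeferredLruCache<T>(max = 1000) {
  const store = new Map<string, T>();

  return {
    get(key: string): T | undefined {
      return store.get(key);
    },
    set(key: string, value: T): void {
      // Defer the write and the eviction to a microtask at the end of the current turn.
      queueMicrotask(() => {
        store.set(key, value);
        while (store.size > max) {
          // A Map iterates in insertion order, so the first key is the oldest entry ("LRU-ish").
          const oldest = store.keys().next().value;
          if (oldest === undefined) break;
          store.delete(oldest);
        }
      });
    },
    clear(): void {
      queueMicrotask(() => store.clear());
    },
  };
}
```

The trade-off is that a value written during one request only becomes visible once the microtask queue drains; for a parse/validate cache a miss just means redoing the work, so that should be acceptable.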
Why not setTimeout or something similar? Those push the task to the next event loop iteration (this needs more investigation), so they break serverless environments, as I have experienced.
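As a rough illustration of the difference (not code from this PR): a microtask still runs within the current event loop turn, while a `setTimeout` callback is queued as a macrotask for a later turn that a serverless runtime may never reach once the response has been sent.

```ts
// Assumed ordering in Node.js / browsers: sync code, then microtasks, then macrotasks.
queueMicrotask(() => console.log('2. microtask: runs before the current turn ends'));
setTimeout(() => console.log('3. macrotask: runs on a later event loop turn'), 0);
console.log('1. synchronous work for the current request');
```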
Unfortunately, I couldn't find enough time to come up with real numbers and so on. I'll do that once I have time.
🦋 Changeset detected
Latest commit: d42d0482a4dee0765ff0ff35f62a5dadb1531ce5
The changes in this PR will be included in the next version bump.
This PR includes changesets to release 32 packages
| Name | Type |
|---|---|
| @envelop/core | Minor |
| @envelop/graphql-jit | Major |
| @envelop/parser-cache | Major |
| @envelop/validation-cache | Major |
| @envelop/testing | Major |
| @envelop/apollo-federation | Major |
| @envelop/apollo-server-errors | Major |
| @envelop/apollo-tracing | Major |
| @envelop/auth0 | Major |
| @envelop/dataloader | Major |
| @envelop/depth-limit | Major |
| @envelop/disable-introspection | Major |
| @envelop/execute-subscription-event | Major |
| @envelop/extended-validation | Major |
| @envelop/filter-operation-type | Major |
| @envelop/fragment-arguments | Major |
| @envelop/generic-auth | Major |
| @envelop/graphql-middleware | Major |
| @envelop/graphql-modules | Major |
| @envelop/live-query | Major |
| @envelop/newrelic | Major |
| @envelop/opentelemetry | Major |
| @envelop/operation-field-permissions | Major |
| @envelop/persisted-operations | Major |
| @envelop/preload-assets | Major |
| @envelop/prometheus | Major |
| @envelop/rate-limiter | Major |
| @envelop/resource-limitations | Major |
| @envelop/response-cache | Major |
| @envelop/sentry | Major |
| @envelop/statsd | Major |
| @envelop/response-cache-redis | Patch |
This pull request is being automatically deployed with Vercel (learn more).
To see the status of your deployment, click below or on the icon next to each commit.
🔍 Inspect: https://vercel.com/theguild/envelop/GDrqAFhwaD7YtYBq28Uo9CRzVzp8
✅ Preview: https://envelop-git-faster-lru-theguild.vercel.app
The latest changes of this PR are available as alpha in npm (based on the declared changesets):
@envelop/[email protected]
@envelop/[email protected]
@envelop/[email protected]
@envelop/[email protected]
@envelop/[email protected]
@envelop/[email protected]
@envelop/[email protected]
@envelop/[email protected]
@envelop/[email protected]
@envelop/[email protected]
@envelop/[email protected]
@envelop/[email protected]
@envelop/[email protected]
@envelop/[email protected]
@envelop/[email protected]
@envelop/[email protected]
@envelop/[email protected]
@envelop/[email protected]
@envelop/[email protected]
@envelop/[email protected]
@envelop/[email protected]
@envelop/[email protected]
@envelop/[email protected]
@envelop/[email protected]
@envelop/[email protected]
@envelop/[email protected]
@envelop/[email protected]
@envelop/[email protected]
@envelop/[email protected]
@envelop/[email protected]
@envelop/[email protected]
@envelop/[email protected]
✅ Benchmark Results
✓ no_errors
✓ expected_result
checks.............................................: 100.00% ✓ 614878 ✗ 0
data_received......................................: 2.4 GB 16 MB/s
data_sent..........................................: 134 MB 892 kB/s
envelop_init.......................................: avg=500ns min=99ns med=299ns max=299.1µs p(90)=1.5µs p(95)=2µs
✓ { mode:envelop-cache-jit }.......................: avg=263ns min=99ns med=200ns max=75.3µs p(90)=400ns p(95)=500ns
✓ { mode:envelop-just-cache }......................: avg=355ns min=99ns med=299ns max=50.1µs p(90)=599ns p(95)=800ns
✓ { mode:graphql-js }..............................: avg=546ns min=99ns med=300ns max=185.2µs p(90)=1.99µs p(95)=2.29µs
✓ { mode:prom-tracing }............................: avg=2.42µs min=1.29µs med=1.9µs max=299.1µs p(90)=2.69µs p(95)=6.9µs
envelop_total......................................: avg=195.93µs min=0s med=17.9µs max=16.61ms p(90)=450.7µs p(95)=1.36ms
✓ { mode:envelop-cache-jit }.......................: avg=17.9µs min=11.4µs med=16.2µs max=8.28ms p(90)=22.2µs p(95)=24.8µs
✓ { mode:envelop-just-cache }......................: avg=174.67µs min=136.3µs med=168.8µs max=16.61ms p(90)=182.3µs p(95)=193.1µs
✓ { mode:graphql-js }..............................: avg=486.74µs min=370.7µs med=444.3µs max=12.6ms p(90)=489.36µs p(95)=530.3µs
✓ { mode:prom-tracing }............................: avg=1.57ms min=1.29ms med=1.4ms max=13.51ms p(90)=1.62ms p(95)=2.77ms
event_loop_lag.....................................: avg=0s min=0s med=0s max=0s p(90)=0s p(95)=0s
✓ { mode:envelop-cache-and-no-internal-tracing }...: avg=0s min=0s med=0s max=0s p(90)=0s p(95)=0s
✓ { mode:envelop-cache-jit }.......................: avg=0s min=0s med=0s max=0s p(90)=0s p(95)=0s
✓ { mode:envelop-just-cache }......................: avg=0s min=0s med=0s max=0s p(90)=0s p(95)=0s
✓ { mode:graphql-js }..............................: avg=0s min=0s med=0s max=0s p(90)=0s p(95)=0s
✓ { mode:prom-tracing }............................: avg=0s min=0s med=0s max=0s p(90)=0s p(95)=0s
graphql_context....................................: avg=3.65µs min=1.49µs med=2.89µs max=7.19ms p(90)=5.89µs p(95)=6.89µs
✓ { mode:envelop-cache-jit }.......................: avg=2.7µs min=1.49µs med=2.49µs max=1.34ms p(90)=3.49µs p(95)=4.19µs
✓ { mode:envelop-just-cache }......................: avg=3.19µs min=2µs med=2.89µs max=7.19ms p(90)=3.59µs p(95)=4.09µs
✓ { mode:graphql-js }..............................: avg=5.46µs min=3.29µs med=4.69µs max=1.02ms p(90)=7.99µs p(95)=9.3µs
✓ { mode:prom-tracing }............................: avg=6.96µs min=5.49µs med=6.69µs max=428.1µs p(90)=7.7µs p(95)=8.8µs
graphql_execute....................................: avg=201.79µs min=6.69µs med=153.1µs max=16.6ms p(90)=184.2µs p(95)=1.32ms
✓ { mode:envelop-cache-jit }.......................: avg=10.01µs min=6.69µs med=9.2µs max=6.62ms p(90)=11.3µs p(95)=12.4µs
✓ { mode:envelop-just-cache }......................: avg=166.6µs min=130.5µs med=160.9µs max=16.6ms p(90)=172.7µs p(95)=182.3µs
✓ { mode:graphql-js }..............................: avg=183.99µs min=136.1µs med=165.6µs max=11.6ms p(90)=179.1µs p(95)=189.8µs
✓ { mode:prom-tracing }............................: avg=1.51ms min=1.24ms med=1.34ms max=13.44ms p(90)=1.54ms p(95)=2.71ms
graphql_parse......................................: avg=8.73µs min=1.79µs med=3.6µs max=7.75ms p(90)=16.1µs p(95)=44.2µs
✓ { mode:envelop-cache-jit }.......................: avg=4.04µs min=1.99µs med=3.4µs max=1.24ms p(90)=5.69µs p(95)=6.4µs
✓ { mode:envelop-just-cache }......................: avg=3.81µs min=1.79µs med=3.29µs max=1.37ms p(90)=4.89µs p(95)=6.2µs
✓ { mode:graphql-js }..............................: avg=12.83µs min=7.6µs med=10.59µs max=7.75ms p(90)=16.79µs p(95)=18.9µs
✓ { mode:prom-tracing }............................: avg=48.38µs min=37.7µs med=46.19µs max=1.38ms p(90)=55.6µs p(95)=60.1µs
graphql_validate...................................: avg=50.14µs min=299ns med=799ns max=12.35ms p(90)=260.4µs p(95)=267.6µs
✓ { mode:envelop-cache-jit }.......................: avg=869ns min=399ns med=799ns max=1.74ms p(90)=1.19µs p(95)=1.3µs
✓ { mode:envelop-just-cache }......................: avg=693ns min=299ns med=599ns max=1.13ms p(90)=899ns p(95)=1µs
✓ { mode:graphql-js }..............................: avg=283.89µs min=207.2µs med=261.9µs max=12.35ms p(90)=285.4µs p(95)=305.6µs
✓ { mode:prom-tracing }............................: avg=4.64µs min=3.19µs med=4.19µs max=1.65ms p(90)=5.49µs p(95)=7.1µs
http_req_blocked...................................: avg=3.25µs min=600ns med=1.3µs max=24.53ms p(90)=2.1µs p(95)=2.5µs
http_req_connecting................................: avg=457ns min=0s med=0s max=11.95ms p(90)=0s p(95)=0s
http_req_duration..................................: avg=4.47ms min=180.5µs med=2.85ms max=83.96ms p(90)=10.23ms p(95)=15ms
{ expected_response:true }.......................: avg=4.47ms min=180.5µs med=2.85ms max=83.96ms p(90)=10.23ms p(95)=15ms
✓ { mode:envelop-cache-and-no-internal-tracing }...: avg=3.41ms min=327.3µs med=2.75ms max=45.55ms p(90)=5.61ms p(95)=6.78ms
✓ { mode:envelop-cache-jit }.......................: avg=2.66ms min=180.5µs med=1.96ms max=35.61ms p(90)=5.13ms p(95)=9.4ms
✓ { mode:envelop-just-cache }......................: avg=3.58ms min=328.4µs med=2.83ms max=39.11ms p(90)=5.98ms p(95)=7.93ms
✓ { mode:graphql-js }..............................: avg=7.2ms min=744.71µs med=5.59ms max=66.78ms p(90)=11.85ms p(95)=13.31ms
✓ { mode:prom-tracing }............................: avg=17.63ms min=1.85ms med=15.59ms max=83.96ms p(90)=29.39ms p(95)=31.81ms
http_req_failed....................................: 0.00% ✓ 0 ✗ 307439
http_req_receiving.................................: avg=49.39µs min=10.8µs med=21.2µs max=28.76ms p(90)=32.4µs p(95)=39.5µs
http_req_sending...................................: avg=37.46µs min=4.3µs med=7.7µs max=26.01ms p(90)=14.2µs p(95)=23.5µs
http_req_tls_handshaking...........................: avg=0s min=0s med=0s max=0s p(90)=0s p(95)=0s
http_req_waiting...................................: avg=4.38ms min=153.6µs med=2.81ms max=83.89ms p(90)=9.92ms p(95)=14.92ms
http_reqs..........................................: 307439 2049.517648/s
iteration_duration.................................: avg=4.87ms min=415.6µs med=3.15ms max=88.62ms p(90)=11.05ms p(95)=15.39ms
iterations.........................................: 307439 2049.517648/s
vus................................................: 10 min=10 max=19
vus_max............................................: 20 min=20 max=20
No more TTL, right? TTL doesn't make a lot of sense for caching parse and validate anyway.
That would mean a breaking change (a MAJOR release), right?
Hey @ardatan and @n1ru4l, I see this PR has already been reviewed, and I think it would cause a breaking-change release, as @n1ru4l mentioned. What should the next steps be?