Slowdown when using `await`
Version
0.1.1
Platform
Darwin Salt.local 21.5.0 Darwin Kernel Version 21.5.0: Tue Apr 26 21:08:37 PDT 2022; root:xnu-8020.121.3~4/RELEASE_ARM64_T6000 arm64
What steps will reproduce the bug?
Bun slows down when using the `await` keyword.

TL;DR: Reproducible repo
In a real-world app, `async` is required (e.g. querying a database). However, using it in Bun, for some reason, slows down execution by at least 2x.

To reproduce, let's create two files serving HTTP and load-test them.
```ts
// sync.ts
const handle = () => new Response("Hi")

export default {
  port: 8080,
  fetch: handle
}
```
```ts
// async.ts
const handle = async () => new Response("Hi")

export default {
  port: 8080,
  fetch: async () => await handle()
}
```
Result:
However, it runs fast if we don't use `await`. Let's edit async.ts to return the async function without `await`:
```ts
// ? Will not defer
const handle = async () => new Response("Hi")

export default {
  port: 8081,
  fetch: handle
}
```
![Screen Shot 2565-07-11 at 11 35 47](https://user-images.githubusercontent.com/35027979/178189361-ea73aaf6-2ed7-441a-ac57-e9c7a34776c1.png)
Although we can work around it by not using the `await` keyword or by using `Promise.resolve`, it doesn't offer a good developer experience, because we can't unwrap the value without using `then`. In a real-world app, the developer will have to unwrap the value and perform some action on it. This problem hurts performance the most when using Bun as an HTTP web server.
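As a sketch of that workaround (the `queryUser` function below is hypothetical, standing in for a real database call), the handler can return the promise chain and unwrap the value with `.then` so its body never uses `await`:

```javascript
// Hypothetical async data source standing in for a real database query.
const queryUser = async (id) => ({ id, name: "example" });

// Slower at the time of this issue: the handler awaits the promise.
const handleAwait = async (req) => {
  const user = await queryUser(1);
  return "Hi " + user.name;
};

// Workaround: return the promise chain and unwrap the value with .then,
// so the handler itself never uses `await`.
const handleThen = (req) => queryUser(1).then((user) => "Hi " + user.name);
```

Both handlers produce the same value; the only difference is whether the handler body itself performs an `await`.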
Here's a reproducible repo if you want to test the problem.
How often does it reproduce? Is there a required condition?
Always reproducible when using `await`, on any platform.
What is the expected behavior?
Bun shouldn't slow down significantly when performing `await`.
What do you see instead?
Quote from "What steps will reproduce the bug?": serving the sync and async servers and load-testing them, the async server is slowed down. However, if we remove `await` but keep `async`, the speed goes up again.
Additional information
The performance slowdown varies; this problem was tested on:
- macOS with Apple Silicon chip (M1 Max 32 core)
  - Darwin Salt.local 21.5.0 Darwin Kernel Version 21.5.0: Tue Apr 26 21:08:37 PDT 2022; root:xnu-8020.121.3~4/RELEASE_ARM64_T6000 arm64
  - Slowed down by 2.5x
- WSL Debian 11 (AMD Ryzen 5 3500X)
  - Linux Chiffon 4.19.128-microsoft-standard #1 SMP Tue Jun 23 12:58:10 UTC 2020 x86_64 GNU/Linux
  - Slowed down by 1.5x
Made a clearer example (to exclude the HTTP process). Indeed, there is a big difference if you run it with or without `await`; however, Node has almost the same ratio, and Node also runs faster in this case.
```js
async function run() {
  function noop() {
    return new Promise((resolve) => resolve());
  }

  const startAwait = performance.now();
  for (let i = 0; i < 10000000; i++) {
    await noop();
  }
  console.log(String(performance.now() - startAwait));

  const startNoAwait = performance.now();
  for (let i = 0; i < 10000000; i++) {
    noop();
  }
  console.log(String(performance.now() - startNoAwait));
}

run();
```
Runs:

```
$ time bun /tmp/a.js
3268127068
396293209

real    0m3.714s
user    0m3.593s
sys     0m0.213s

$ time node /tmp/a.js
1170.480182999745
411.93061799928546

real    0m1.618s
user    0m1.718s
sys     0m0.017s
```
There is still more work to be done to reduce Bun's promise/await overhead.

The reason `async` without `await` is unaffected is that Bun checks whether the promise is already resolved and skips deferring to the next microtask.
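A minimal sketch of why that already-resolved check matters (this is plain promise ordering, not Bun internals): an `async` function without `await` yields a promise that settles at the first microtask checkpoint, while an `await` adds at least one extra microtask hop before its promise settles:

```javascript
// `direct` never awaits, so its promise is resolved as soon as it is created.
const direct = async () => "Hi";
// `viaAwait` awaits an already-resolved promise, which still costs at least
// one extra microtask hop before its own promise settles.
const viaAwait = async () => await direct();

const order = [];
viaAwait().then(() => order.push("awaited"));
direct().then(() => order.push("no-await"));

// After the microtask queue drains, the no-await promise has settled first.
setTimeout(() => console.log(order), 0); // [ 'no-await', 'awaited' ]
```

An engine that detects the already-resolved case can skip that extra hop, which is why the `async`-without-`await` handler stays fast.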
The main problem is a bug in Bun's event loop implementation (which differs from the normal JSC way), which may be making the situation worse than the default in JSC.
JSC eventually plans to move from a C++ microtask queue to a mix of C++ and JS (cc @Constellation), which will bring a very nice speed bump. I have a branch that moves to a JS-only microtask queue, but it was unfortunately too unreliable to ship without many small changes to JSC that didn't seem good.
WebKit/JSC tracking bug: https://bugs.webkit.org/show_bug.cgi?id=243472
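For background, here is a toy sketch (emphatically not Bun's or JSC's actual code) of the microtask-queue drain an engine performs between event-loop turns. Every `await` enqueues at least one job into a queue like this, which is where per-await overhead accumulates:

```javascript
// Toy microtask queue: jobs are plain callbacks, drained in FIFO order.
const microtasks = [];
const enqueue = (job) => microtasks.push(job);

function drainMicrotasks() {
  // Jobs may enqueue further jobs; keep going until the queue is empty,
  // just like a real checkpoint drains newly scheduled microtasks too.
  while (microtasks.length > 0) {
    const job = microtasks.shift();
    job();
  }
}

let log = [];
enqueue(() => log.push("a"));
enqueue(() => { log.push("b"); enqueue(() => log.push("c")); });
drainMicrotasks();
console.log(log); // [ 'a', 'b', 'c' ]
```

The cost the maintainer describes is the per-job bookkeeping around this loop; moving it closer to JS reduces the C++/JS boundary crossings per job.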
It turns out an optimization was unintentionally disabled when using async/await in Bun's HTTP server. For `Bun.serve` specifically, this is partially fixed as of Bun v0.1.11!
Bun v0.1.11

```
❯ oha -z 5s http://localhost:8080 -c 20
Summary:
  Success rate: 1.0000
  Total:        5.0003 secs
  Slowest:      0.0144 secs
  Fastest:      0.0000 secs
  Average:      0.0002 secs
  Requests/sec: 128107.6040

  Total data:   1.22 MiB
  Size/request: 2 B
  Size/sec:     250.21 KiB
```
Bun v0.1.10

```
❯ oha -z 5s http://localhost:8080 -c 20
Summary:
  Success rate: 1.0000
  Total:        5.0003 secs
  Slowest:      0.0058 secs
  Fastest:      0.0002 secs
  Average:      0.0004 secs
  Requests/sec: 51270.3948

  Total data:   500.72 KiB
  Size/request: 2 B
  Size/sec:     100.14 KiB
```
A 2.49x improvement
The more general case of `async`/`await` performance being worse in JSC than in V8 is still an issue.