Bun Benchmark?
Interesting project!
As a team we're considering getting off Node, and up until now we've only considered Bun. I'd like to know if llrt could be a better fit, but right now the README is only measuring against Node.
Could a Bun/llrt benchmark comparison be done? If there's other interest I could find some capacity to help but wanted to gauge interest first.
I don't know a ton about LLRT, so I can't really comment on performance differences between Bun and LLRT (is LLRT JIT'd?), but @weklund if you guys run into any issues with Bun or have any questions, I'm happy to hop on a call or set up a shared Slack/Discord workspace - feel free to email [email protected]
Hey @Jarred-Sumner @weklund, thanks for your comments! Happy to provide some clarity here.
LLRT was designed with a very specific use case in mind, and it's not intended to compete directly with Bun (or Node/Deno), but rather to complement them. In fact, for many traditional server-side workloads or computationally intensive JavaScript, Bun's performance, thanks to its highly optimized JIT engine and native implementations, is going to be significantly better. That's a major strength of Bun and one of the reasons it's such a compelling runtime for a broad range of use cases.
LLRT takes a different approach. It's built on QuickJS, a smaller, interpreter-based engine, and implements a much more limited API surface, intentionally so. This minimalism is actually a benefit in serverless environments, where cold start time and runtime size can have a major impact on user experience. LLRT's focus is on fast startup, a small footprint (<5MB executable), and just enough functionality to make it useful for common tasks such as data transformations or API orchestration in environments like AWS Lambda.
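To make that concrete, here's a hypothetical sketch of the kind of small, dependency-light Lambda workload LLRT is aimed at: a simple data transformation. The `records` event shape and field names here are assumptions for illustration, not part of any real API.

```javascript
// Hypothetical sketch of a small data-transformation Lambda of the kind
// LLRT targets. The event shape ({ records: [...] }) is assumed for
// illustration only.
const handler = async (event) => {
  const records = event.records ?? [];
  // Normalize each record: keep the id, tidy up the name field.
  const transformed = records.map((r) => ({
    id: r.id,
    name: (r.name ?? "").trim().toLowerCase(),
  }));
  return { statusCode: 200, body: JSON.stringify(transformed) };
};
```

In a real function this would be exported as `handler` (omitted here for brevity). A handler like this spends most of its latency budget on cold start rather than compute, which is exactly the profile where an interpreter with a tiny binary can beat a JIT runtime end to end.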
So while it's not an apples-to-apples comparison, it can be useful to benchmark LLRT and Bun for YOUR specific use case, especially if it's aligned with serverless workloads. I'd be more than happy to support that effort if you have any questions or issues :)
On ARM with 1024MB of RAM, Bun (and likewise Deno) has an almost 3x slower cold start (~304ms) than the AWS-provided Node.js 22 runtime (~135ms). On a side note, LLRT has become almost 2x slower due to continuous size increases (from ~20ms in the initial numbers ~1yr ago to ~40ms now). But LLRT is still not provided by AWS, meaning we should compare it to Bun, not Node.js, and there's an almost 8x difference between the two. There is an option to compile source code to bytecode with `llrt compile`, but Maxday's benchmark doesn't use it. (Bun also has `--bytecode`, but it cannot come close given the existing gap.) https://maxday.github.io/lambda-perf/
It's impossible for Bun to compete with a provided runtime due to caching, etc., and Dockerized Lambda functions (Deno's blog likes to compare performance in Docker environments) are much slower than provided runtimes plus Lambda Layers.
All of the above means there's no place for Bun/Deno in AWS Lambda until they convince AWS to offer them as AWS-provided runtimes. Cold starts aside, Lambda processes one request at a time, so there's no real-world performance benefit of Bun over Node.js, whereas there are major stability differences between the two.