feat(lambda): support custom runtimes (e.g. bun/llrt)
I was thinking: since ion is based on bun, it would be cool if it also allowed using bun remotely, i.e. as the Lambda runtime.
Moreover, the new llrt from AWS might also be an interesting alternative for the lambdas!
this is definitely something we're interested in, but we'd like ion to stabilize before we introduce additional runtimes. i just added support for cloudflare workers and the abstractions we have kind of worked, but not perfectly, so it needs some time to bake
the other side of it is that bun on aws lambda isn't super compelling yet - custom runtimes just don't perform well: https://maxday.github.io/lambda-perf/
Back on this, I want to point out that while it's true that bun's startup time is quite bad, llrt's is very promising (check https://maxday.github.io/lambda-perf/#:~:text=14.1MB%20%E2%9A%A1%201.44ms-,llrt%20(prov.al2023),-%E2%9D%84%2034.96ms%20%F0%9F%92%BE)
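
For anyone who wants to experiment with llrt before first-class support lands in ion, here's a minimal sketch of wiring it up as a Lambda custom runtime with raw Pulumi (which ion builds on). This is not ion's API, just an assumption on my side: resource names are placeholders, and `./dist` is assumed to contain your bundled handler plus the llrt binary renamed to `bootstrap`, following the awslabs/llrt packaging convention.

```ts
import * as pulumi from "@pulumi/pulumi";
import * as aws from "@pulumi/aws";

// Execution role for the function (placeholder names).
const role = new aws.iam.Role("llrt-fn-role", {
  assumeRolePolicy: JSON.stringify({
    Version: "2012-10-17",
    Statement: [
      {
        Action: "sts:AssumeRole",
        Effect: "Allow",
        Principal: { Service: "lambda.amazonaws.com" },
      },
    ],
  }),
});

// Allow the function to write CloudWatch logs.
new aws.iam.RolePolicyAttachment("llrt-fn-logs", {
  role: role.name,
  policyArn: aws.iam.ManagedPolicy.AWSLambdaBasicExecutionRole,
});

// Assumes ./dist holds index.mjs and the llrt binary renamed to "bootstrap".
const fn = new aws.lambda.Function("llrt-fn", {
  runtime: "provided.al2023", // custom runtime: llrt acts as its own bootstrap
  handler: "index.handler",   // resolved by llrt, same convention as Node.js
  architectures: ["arm64"],   // llrt publishes arm64 builds
  code: new pulumi.asset.FileArchive("./dist"),
  role: role.arn,
  memorySize: 128,
  timeout: 10,
});

export const functionName = fn.name;
```

Once ion exposes a runtime option for custom runtimes, the same zip layout (handler + `bootstrap`) should be all that's needed; the sketch above is just the manual equivalent in the meantime.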