Optimize Cold Start Times
When we do code generation, also generate a handler/endpoint loader type that can be used to speed up cold starts by eliminating any reflection at bootstrapping. Mostly worried about making Wolverine better for serverless scenarios, but this might get us a bit closer to AOT compliance too.
Longer term thing. Might pair this w/ Lamar improvements as well.
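
For discussion, here's a purely hypothetical sketch of the shape such a generated loader could take. The `GeneratedHandlerLoader` class, its `Apply` method, and the handler type are all invented for illustration; only the `IncludeType<T>()` call reflects Wolverine's existing discovery API.

```csharp
using System;
using Wolverine;

// Hypothetical output of the proposed code generation step -- nothing like this
// exists in Wolverine today. The generated loader would register every discovered
// handler/endpoint explicitly, so bootstrapping needs no assembly scanning or
// reflection at all.
public class GeneratedHandlerLoader
{
    public void Apply(WolverineOptions opts)
    {
        // One explicit registration emitted at codegen time per discovered handler
        opts.Discovery.IncludeType<PlaceOrderHandler>();
    }
}

// Placeholder message and handler used only for this sketch
public record PlaceOrder(Guid Id);

public class PlaceOrderHandler
{
    public void Handle(PlaceOrder command) => Console.WriteLine($"Placed order {command.Id}");
}
```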
This could potentially:
- Turn off assembly scanning to find handlers and/or HTTP endpoints (see the configuration sketch after this list)
- Turn off assembly scanning for Fluent Validation validators
- Maybe generate an executor finder that eliminates the use of Reflection at runtime (currently just where Wolverine does a GetType(), then looks up the executor in an ImHashMap)
- Eliminate having to scan an assembly to find pre-built types
- Maybe eliminate all assembly scanning for IoC registrations, though most likely only for Lamar
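
To make the first bullet concrete, here's a minimal sketch of opting out of assembly scanning with the discovery API that already exists today (the message and handler types are placeholders):

```csharp
using System;
using Microsoft.Extensions.Hosting;
using Wolverine;

using var host = await Host.CreateDefaultBuilder()
    .UseWolverine(opts =>
    {
        // Skip the conventional assembly scan for handlers entirely...
        opts.Discovery.DisableConventionalDiscovery();

        // ...and register handler types explicitly instead
        opts.Discovery.IncludeType<PingHandler>();
    })
    .StartAsync();

await host.WaitForShutdownAsync();

// Placeholder message and handler used only for this sketch
public record Ping(string Name);

public class PingHandler
{
    public void Handle(Ping message) => Console.WriteLine($"Ping from {message.Name}");
}
```

The generated loader type proposed above could emit exactly these explicit registrations, so nobody has to maintain the list by hand.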
+1 for AOT support & removing reflection
A big +1 from me.
I'm currently integrating Wolverine into an Azure Functions project, and the cold start times for non-generated code (dynamic type loading) are a real performance issue. In some cases, HTTP requests take more than 30 seconds to respond until the generated code is cached on the instance, which forces me to rely on static pre-generated types (something I'm actively working to implement).
Reducing this startup time would be a huge win, not only for AWS Lambda and Azure Functions users, but also for those running on Azure Container Apps or other serverless container platforms where cold starts are equally problematic.
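
For anyone hitting the same cold start problem today, this is roughly what the static pre-generation workaround looks like, assuming the current `TypeLoadMode` setting exposed through `opts.CodeGeneration`:

```csharp
using JasperFx.CodeGeneration;
using Microsoft.Extensions.Hosting;
using Wolverine;

using var host = await Host.CreateDefaultBuilder()
    .UseWolverine(opts =>
    {
        // Load handler types that were generated and compiled ahead of time
        // instead of generating and compiling them with Roslyn on a cold start
        opts.CodeGeneration.TypeLoadMode = TypeLoadMode.Static;
    })
    .StartAsync();
```

The generated source itself gets written into the project before deployment (the Oakton-backed `codegen write` command, if I remember the CLI verb correctly), so the serverless instance never pays the runtime code generation cost.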