
docs: where to store runtime state such as service clients across lambda invocations

Open dmeehan1968 opened this issue 1 year ago • 1 comments

It seems to be normal practice when implementing a Lambda to put client instantiation outside of the handler, so that clients are created once at cold start and their endpoint connections are reused as needed, rather than instantiating them on a per-invocation basis (which could perform very poorly with regard to endpoint discovery).

Is there a mechanism to do this when using deploy.asLambda? Is there a context argument, and is it reasonable to attach clients to it? Will they be persisted across calls?

dmeehan1968 avatar Oct 23 '23 20:10 dmeehan1968

The sourceFile is passed to the NodeJsFunction constructor as the entry file, with the exported lambda name as the entry point (potentially multiple times, once for each defined Lambda). This means that client instances can be cached as module-scope variables, as is normally the case. e.g.

import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3"

// s3client is created on cold start and preserved within the same runtime instance,
// so the same client instance is available for each invocation of the handler

const s3client = new S3Client({})

export const myLambda = asl.deploy.asLambda(async (input: any) => {

    await s3client.send(new PutObjectCommand({ /* Bucket, Key, Body, ... */ }))

})

NB: When defining multiple lambdas in the same sourceFile, each is turned into an AWS Lambda, and each of those lambdas will instantiate the client (even if it does not use it). An optimisation to this might be some form of lazy loading, so the client instance is created on first use and subsequently cached.
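The lazy-loading idea above could be sketched as a small helper, independent of ts2asl (the helper and its usage here are illustrative, not part of the library):

```typescript
// Generic lazy singleton: the factory runs on first use only, and the
// result is cached for later invocations within the same runtime instance.
// Lambdas that never call the accessor never pay the construction cost.
function lazy<T>(factory: () => T): () => T {
  let instance: T | undefined;
  return () => {
    if (instance === undefined) {
      instance = factory();
    }
    return instance;
  };
}

// Hypothetical usage in the module scope of the sourceFile:
// const getS3Client = lazy(() => new S3Client({}));
// ...inside a handler: await getS3Client().send(...)
```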

This could also be sub-optimal in the case of clients such as Lambda Powertools (logger, metrics) which might need overrides for log levels/metric namespaces etc.
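One way such overrides might be combined with lazy creation is to read per-lambda settings from the environment on first use. A minimal sketch, assuming a logger-like client; the `Logger` shape, `getLogger`, and the variable names `LOG_LEVEL`/`METRICS_NAMESPACE` are all hypothetical, not a ts2asl or Powertools API:

```typescript
// Hypothetical logger-like client configured lazily, so each deployed
// lambda can override log level / namespace without code changes.
type LogLevel = "DEBUG" | "INFO" | "WARN" | "ERROR";

interface Logger {
  level: LogLevel;
  namespace: string;
}

type Env = Record<string, string | undefined>;

let logger: Logger | undefined;

// Reads overrides from the supplied environment on first use only;
// later calls return the cached instance regardless of the env passed.
function getLogger(env: Env): Logger {
  if (logger === undefined) {
    logger = {
      level: (env.LOG_LEVEL as LogLevel) ?? "INFO",
      namespace: env.METRICS_NAMESPACE ?? "default",
    };
  }
  return logger;
}
```

In a real handler file the accessor would be called with `process.env`, so each generated lambda picks up its own overrides at cold start.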

dmeehan1968 avatar Oct 26 '23 17:10 dmeehan1968