aws-lambda-dotnet
Add response streaming support
Describe the feature
Add support for using Lambda response streaming when returning streams from a .NET Lambda function.
https://aws.amazon.com/blogs/compute/introducing-aws-lambda-response-streaming/
Use Case
I would like to stream large S3 objects to HTTP clients using a Lambda function URL.
Proposed Solution
Add functionality to enable returned streams to be sent using the application/vnd.awslambda.http-integration-response content type, prefixed with the required JSON prelude and null-byte delimiter. Allow the status code and HTTP headers to be configured and added to the JSON prelude.
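For illustration, a minimal sketch of what writing such a response might look like (the prelude fields and the eight-null-byte delimiter follow the format described in the linked blog post; the helper name and signature here are made up, not part of any existing package):

// Hypothetical helper showing the shape of a streamed HTTP integration response:
// a JSON prelude (status code, headers, cookies), followed by eight 0x00 delimiter
// bytes, followed by the raw body streamed as-is.
// Assumes .NET 8 implicit usings plus System.Text.Json.
using System.Text.Json;

public static class HttpIntegrationResponseWriter
{
    private static readonly byte[] Delimiter = new byte[8]; // eight null bytes

    public static async Task WriteAsync(
        Stream destination,
        Stream body,
        int statusCode,
        IDictionary<string, string> headers,
        CancellationToken ct = default)
    {
        var prelude = JsonSerializer.SerializeToUtf8Bytes(new
        {
            statusCode,
            headers,
            cookies = Array.Empty<string>()
        });

        await destination.WriteAsync(prelude, ct);
        await destination.WriteAsync(Delimiter, ct);

        // The body is copied through without buffering the whole object in memory.
        await body.CopyToAsync(destination, ct);
    }
}

Here destination would be whatever RuntimeSupport uses to post the response back to the runtime API (presumably tagged with the content type above), which is the part this proposal asks to expose.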
Other Information
No response
Acknowledgements
- [ ] I may be able to implement this feature request
- [ ] This feature might incur a breaking change
AWS .NET SDK and/or Package version used
Amazon.Lambda.RuntimeSupport 1.10.0
Targeted .NET Platform
.NET 8
Operating System and version
Debian container in Lambda
Discussion https://github.com/aws/aws-lambda-dotnet/discussions/1632 was opened a few days ago.
I was able to get this working with pretty minimal changes. I'll push it somewhere in case anyone wants to replicate it until it's officially supported.
It's certainly not hardened, but it is working for my use case of returning dynamic HTTP content from a Lambda URL: https://github.com/plaisted/aws-lambda-dotnet/commit/3d45f5a3db8e98030b502ca9574b6cf36ff2c5cc
It's usable by returning the new StreamedResponse class, which is a basic wrapper around a normal .NET Stream plus HTTP info:
[LambdaFunction]
public async Task<StreamedResponse> CustomerCDN(APIGatewayHttpApiV2ProxyRequest req, ILambdaContext ctx)
{
    // dummy content
    await Task.Yield();
    var content = new MemoryStream(Encoding.UTF8.GetBytes("Example content!!"));

    // return using lambda response streaming
    return new StreamedResponse(content)
    {
        Headers = new Dictionary<string, string>
        {
            ["Content-Type"] = "text/plain; charset=utf-8",
            ["Cache-Control"] = "no-store, private, stale-if-error=0"
        }
    };
}
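For the S3 use case from the description, a sketch built on the same StreamedResponse wrapper might look like the following (AmazonS3Client/GetObjectAsync come from AWSSDK.S3; the bucket name and key mapping are placeholders, and disposal of the S3 response is assumed to be handled once the stream has been copied):

[LambdaFunction]
public async Task<StreamedResponse> GetObject(APIGatewayHttpApiV2ProxyRequest req, ILambdaContext ctx)
{
    // Placeholder bucket/key mapping for illustration only.
    var s3 = new AmazonS3Client();
    var obj = await s3.GetObjectAsync("my-bucket", req.RawPath.TrimStart('/'));

    // The object body is handed off as a stream, so large objects are not buffered in memory.
    return new StreamedResponse(obj.ResponseStream)
    {
        Headers = new Dictionary<string, string>
        {
            ["Content-Type"] = obj.Headers.ContentType ?? "application/octet-stream"
        }
    };
}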
Another use case would be streaming an LLM response (e.g. with Anthropic Claude)
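A rough sketch of that use case with the wrapper above, assuming some client that exposes the model output as IAsyncEnumerable<string> (GetTokensAsync here is hypothetical), could push tokens through a System.IO.Pipelines pipe so they are written out as they arrive; whether the caller actually sees them incrementally depends on how RuntimeSupport copies and flushes the stream:

[LambdaFunction]
public async Task<StreamedResponse> Chat(APIGatewayHttpApiV2ProxyRequest req, ILambdaContext ctx)
{
    var pipe = new Pipe(); // System.IO.Pipelines

    // Produce tokens in the background; the reader side is handed to the response.
    _ = Task.Run(async () =>
    {
        await foreach (var token in GetTokensAsync(req.Body)) // hypothetical LLM client call
        {
            await pipe.Writer.WriteAsync(Encoding.UTF8.GetBytes(token));
        }
        await pipe.Writer.CompleteAsync();
    });

    return new StreamedResponse(pipe.Reader.AsStream())
    {
        Headers = new Dictionary<string, string>
        {
            ["Content-Type"] = "text/plain; charset=utf-8"
        }
    };
}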
@paolofulgoni Good use case. Definitely a feature I want to get to.
As far as I can see, all .NET Lambda functions currently work based on a Stream response underneath. Therefore, are there any downsides to enabling streaming for all requests?
@Dreamescaper The streams you are seeing are different from Lambda's response stream. Currently the .NET Lambda runtime client only supports invoking the .NET code, taking the return value from the invoke (converting it to a stream first if it is a POCO), and then uploading the complete content of that stream back to the Lambda service. With response streaming we need a new programming model that gives the .NET Lambda function the ability to write data back to the user without the function returning.
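Purely as an illustration of that last point (none of these types exist in Amazon.Lambda.RuntimeSupport today), such a programming model could hand the function something it can write to while the invocation is still running:

// Hypothetical shape only; not an actual or proposed RuntimeSupport API.
public interface ILambdaResponseWriter
{
    Stream Body { get; }                              // bytes written here go back to the caller as they arrive
    Task FlushAsync(CancellationToken ct = default);  // push buffered bytes to the Lambda service
}

// A handler would receive the writer instead of returning a value,
// so it can keep writing without the function "returning" a response object.
public delegate Task StreamingHandler(Stream request, ILambdaResponseWriter response, ILambdaContext context);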
@normj But why isn't the Stream type suitable for this purpose? ASP.NET Core uses it for exactly this: you can use IAsyncEnumerable<...> in your code, which is serialized to the response Stream afterwards. Why is a new programming model needed when the existing Stream-based model already supports streaming (and it only needs to be wired up in RuntimeSupport)?
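To make that concrete, the kind of handler being described would look something like this (hypothetical: RuntimeSupport would have to serialize each yielded item to the response stream as it is produced, the way ASP.NET Core does):

public async IAsyncEnumerable<string> Handler(APIGatewayHttpApiV2ProxyRequest req, ILambdaContext ctx)
{
    // Each yielded chunk would be written to the underlying response stream immediately.
    for (var i = 0; i < 10; i++)
    {
        await Task.Delay(100);
        yield return $"chunk {i}\n";
    }
}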