ASP.NET HTTPS request "hangs" when reading request stream on certain requests
Is there an existing issue for this?
- [X] I have searched the existing issues
Describe the bug
While testing a custom input/output formatter that I've developed, I encountered what I believe may be a bug in ASP.NET.
Request processing hangs when reading from the request stream (`Microsoft.AspNetCore.Server.Kestrel.Core.Internal.Http.HttpRequestStream.ReadAsync`). This happens when a custom `TextInputFormatter` binds the request body to an `IAsyncEnumerable<T>` controller action argument, which causes the reading of the request and the writing of the response to overlap (which is probably unusual). It only happens when the request exceeds a certain size (NOT the ASP.NET request size limit).
The issue is best understood with a minimal repro, which is available in this repository: https://github.com/MarkPflug/HttpStreamMinRepro
The repository also includes a more detailed description of the issue.
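For context, the overlapping-read-and-write pattern described above looks roughly like this (a minimal sketch; the action and parameter names are illustrative, not the actual repro code, which is in the linked repository):

```csharp
// Hypothetical controller action: the custom TextInputFormatter binds the
// request body to an IAsyncEnumerable<string>, and returning that same
// sequence means the framework begins writing the response while the
// request body is still being read — the two streams overlap.
[HttpPost]
public IAsyncEnumerable<string> Post(IAsyncEnumerable<string> lines)
{
    return lines;
}
```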
Expected Behavior
The request proceeds without hanging regardless of the request size, assuming it doesn't exceed the ASP.NET request size limit. Alternatively, an exception is thrown informing me that I've done something silly.
Steps To Reproduce
See the minimal repro in the linked repository: https://github.com/MarkPflug/HttpStreamMinRepro
Run the web app, then the ReproClient to generate a large request.
Exceptions (if any)
There isn't really an exception thrown, other than the one raised when the client disconnects due to timeout, but that exception isn't the issue.
.NET Version
7.0.100-rc.2.22477.23
Anything else?
This repros on .NET 6 as well, and on both Kestrel and IIS Express. It only seems to happen on HTTPS requests; over HTTP, the response is properly received. The `PlainTextOutputFormatter` isn't critical to the repro; it also hangs with the standard JSON output. However, I don't see the server unwind in the debugger when the JSON formatter is used (I can't explain this either).
The bottleneck you're hitting here is on the client. You're doing bidirectional streaming, but the client isn't actually consuming the response until the request finishes uploading, so all the network buffers fill up and the response stalls. I don't know why that would be different without TLS, but it's likely just that the buffer boundary is different.
I was able to get it working with the following tweaks. HttpClient supports bidirectional streaming only over HTTP/2+ and with a custom HttpContent type.
```diff
 var req = new HttpRequestMessage()
 {
+    VersionPolicy = HttpVersionPolicy.RequestVersionOrHigher,
     RequestUri = new Uri("Echo/Post", UriKind.Relative),
     Method = HttpMethod.Post,
 };
-req.Content = new StringContent(requestString, Encoding.UTF8, "text/plain");
+// req.Content = new StringContent(requestString, Encoding.UTF8, "text/plain");
+req.Content = new CustomContent(requestString, "text/plain");
-resp = await client.SendAsync(req);
+resp = await client.SendAsync(req, HttpCompletionOption.ResponseHeadersRead);
```

```diff
+public class CustomContent : HttpContent
+{
+    string _requestString;
+
+    public CustomContent(string requestString, string contentType)
+    {
+        _requestString = requestString;
+        Headers.ContentType = new System.Net.Http.Headers.MediaTypeHeaderValue(contentType);
+        Headers.ContentLength = _requestString.Length;
+    }
+
+    protected override async Task SerializeToStreamAsync(Stream stream, TransportContext? context)
+    {
+        var bytes = Encoding.UTF8.GetBytes(_requestString);
+        await stream.WriteAsync(bytes);
+    }
+
+    protected override bool TryComputeLength(out long length)
+    {
+        length = 0;
+        return false;
+    }
+}
```
https://github.com/dotnet/runtime/blob/3968ceb4297d87109ea32401da02fe4432188a4a/src/libraries/System.Net.Http/src/System/Net/Http/SocketsHttpHandler/Http2Connection.cs#L1968
https://github.com/dotnet/runtime/blob/9768606ea1f0aa1be6098143ded330dadac8cf91/src/libraries/System.Net.Http/src/System/Net/Http/HttpContent.cs#L349
@Tratcher Hey Chris, I just want to say thanks for taking the time to look at this. I think I see what I need to do to fix my issue: buffer the entire request before starting the response.

I realized that this same construction behaved as expected when using the standard JSON formatter, so I investigated how it was different. Inside the S.T.Json serializer, the entire request body ends up buffered into a "BufferedAsyncEnumerable", which avoids entering the controller code before the request is completely received. That makes sense, since the very last character of the request body might be the difference between a 200 and a 400 response code. As things were, my only option would have been to kill the connection to signal an error, which would have been less than ideal.
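A minimal sketch of the buffering approach I'm describing (illustrative only; this helper name isn't part of the repro or the framework):

```csharp
// Hypothetical: drain the whole IAsyncEnumerable<T> into a list before the
// controller action (and therefore the response) starts, mirroring what the
// S.T.Json formatter effectively does. The response then cannot begin until
// the request body has been fully received and validated.
static async Task<List<T>> BufferAsync<T>(
    IAsyncEnumerable<T> source, CancellationToken cancellationToken)
{
    var buffered = new List<T>();
    await foreach (var item in source.WithCancellation(cancellationToken))
    {
        buffered.Add(item);
    }
    return buffered;
}
```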
Note that S.T.Json doesn't buffer the whole body all at once; it's a streaming parser that buffers the next segment and then parses it. The end result is a full S.T.Json model, but you never have a byte[] containing all the raw data at once.
Right, it ends up buffering a `List<T>`, not the actual input bytes. https://github.com/dotnet/runtime/blob/9aaa21dde5da67c5b686d49a8ef38c982410e899/src/libraries/System.Text.Json/src/System/Text/Json/Serialization/Converters/Collection/IAsyncEnumerableOfTConverter.cs#L135-L148