Microsoft.AspNetCore.Server.Kestrel.Core.BadHttpRequestException
We have deployed an Azure Function App on the Python 3.8 stack. For some requests we receive a BadHttpRequestException:

Exception while executing function: Functions.Claims_Processing Exception binding parameter 'req' Reading the request body timed out due to data arriving too slowly. See MinRequestBodyDataRate.

This exception is thrown from ASP.NET Core, not from our function code, and we are not sure what is causing it; network connectivity seems to be fine.

Is it something to do with the request payload size, i.e. a large payload being read too slowly?

In a Python HTTP-trigger Azure Function, is it possible to change the MinRequestBodyDataRate?
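For reference, the function is a plain Python HTTP trigger along these lines (a simplified sketch; the actual processing is omitted):

```python
import azure.functions as func

def main(req: func.HttpRequest) -> func.HttpResponse:
    # The Functions host (ASP.NET Core/Kestrel) reads the request body while
    # binding 'req', so the MinRequestBodyDataRate timeout fires in the host
    # before any of this Python code runs and cannot be caught here.
    body = req.get_body()
    # ... process the payload ...
    return func.HttpResponse(f"received {len(body)} bytes")
```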
Hi @hjain5164, thank you for your feedback! We will look into the possibilities internally and update you with our findings.
Thanks @v-bbalaiagar for the response. Could you please share a possible solution for this issue, as it is impacting a production application.
Hi @hjain5164, I checked this internally. We would like to understand whether this happens only with a specific set of clients. The most likely cause is that the connection speed is not allowing us to read the data fast enough to meet the minimum data transfer rate.
Yes, it happens when around 10 requests are sent to the function app in parallel; all of those requests fail with this error, but the next batches are processed successfully. So is it the case that the function app cannot handle 10 simultaneous requests after it has been idle for too long?
If connection speed is the problem, what are the possible solutions, given that the source system (calling the Azure Function) is deployed in AWS and the destination system (the Azure Function) is deployed in Azure?
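A rough client-side sketch of the pattern described above (hypothetical URL and payload size), which can be used to check whether ~10 parallel uploads after an idle period reproduce the failure:

```python
import concurrent.futures
import os

import requests

FUNCTION_URL = "https://<your-function-app>.azurewebsites.net/api/Claims_Processing"  # hypothetical
PAYLOAD = os.urandom(5 * 1024 * 1024)  # dummy ~5 MB body; adjust to match the real requests


def send(i: int) -> str:
    try:
        resp = requests.post(FUNCTION_URL, data=PAYLOAD, timeout=300)
        return f"request {i}: HTTP {resp.status_code}"
    except requests.RequestException as exc:
        return f"request {i}: failed ({exc})"


if __name__ == "__main__":
    # ~10 parallel requests against an idle (possibly cold) function app
    with concurrent.futures.ThreadPoolExecutor(max_workers=10) as pool:
        for result in pool.map(send, range(10)):
            print(result)
```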
This issue has been automatically marked as stale because it has been marked as requiring author feedback but has not had any activity for 4 days. It will be closed if no further activity occurs within 3 days of this comment.
Tagging @fabiocav for further insights
Adding this issue as a feature enhancement request so that the team can expose the configuration.
I am also facing this issue on Python 3.13 and 3.12. This Linux function app is running on an S1 App Service plan.
I have configured my function to allow file uploads up to 200 MB by using:
FUNCTIONS_REQUEST_BODY_SIZE_LIMIT: 200MB
AzureFunctionsJobHost__extensions__http__minRequestBodyDataRate__bytesPerSecond: 100
AzureFunctionsJobHost__extensions__http__minRequestBodyDataRate__gracePeriod: 00:00:30
When I upload a small file of around 30-40 MB it works fine, but for a 171 MB file I get this error.
Could this be due to the number of concurrent requests being processed at that time, or something else?
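One way to narrow this down is to stream the upload in chunks from the client and log the effective data rate, to see whether the ~171 MB upload ever drops below the configured bytesPerSecond (a minimal sketch, with a hypothetical URL and file path):

```python
import time

import requests

FUNCTION_URL = "https://<your-function-app>.azurewebsites.net/api/upload"  # hypothetical
FILE_PATH = "large_file.bin"  # hypothetical ~171 MB file
CHUNK_SIZE = 1024 * 1024  # 1 MB


def chunked_body(path: str):
    """Yield the file in chunks and log the average upload rate so far."""
    sent = 0
    start = time.monotonic()
    with open(path, "rb") as fh:
        while chunk := fh.read(CHUNK_SIZE):
            sent += len(chunk)
            elapsed = time.monotonic() - start
            if elapsed > 0:
                print(f"sent {sent / 1e6:.1f} MB, avg {sent / elapsed / 1e3:.1f} KB/s")
            yield chunk


# requests sends a generator body using chunked transfer encoding
resp = requests.post(FUNCTION_URL, data=chunked_body(FILE_PATH), timeout=1800)
print(resp.status_code, resp.text[:200])
```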