fastapi
"request.json()" hangs indefinitely in middleware
First Check
- [X] I added a very descriptive title to this issue.
- [X] I used the GitHub search to find a similar issue and didn't find it.
- [X] I searched the FastAPI documentation, with the integrated search.
- [X] I already searched in Google "How to X in FastAPI" and didn't find any information.
- [X] I already read and followed all the tutorial in the docs and didn't find an answer.
- [X] I already checked if it is not related to FastAPI but to Pydantic.
- [X] I already checked if it is not related to FastAPI but to Swagger UI.
- [X] I already checked if it is not related to FastAPI but to ReDoc.
Commit to Help
- [X] I commit to help with one of those options 👆
Example Code
    from fastapi import FastAPI, Request, Response
    from starlette.middleware.base import BaseHTTPMiddleware, RequestResponseEndpoint

    class Middleware(BaseHTTPMiddleware):
        async def dispatch(self, request: Request, call_next: RequestResponseEndpoint) -> Response:
            await request.json()
            response = await call_next(request)
            return response

    app = FastAPI()
    app.add_middleware(Middleware)

    @app.post("/test")
    async def test(test: dict) -> dict:
        return {"data": "test"}
Description
- Run script with:
uvicorn main:app --reload
- Run CURL with POST request:
curl -X 'POST' 'http://localhost:8000/test' -H 'accept: application/json' -H 'Content-Type: application/json' -d '{"test": "test"}'
The application hangs indefinitely and never sends a response to the request. I tested the equivalent code using app = Starlette() and it responds correctly.
Operating System
Linux
Operating System Details
Ubuntu 22.04
FastAPI Version
0.79.0
Python Version
Python 3.10.4
Additional Context
No response
I have exactly the same problem on FastAPI 0.81.0, Python 3.10.6, Ubuntu 22.04.
You should remove the code on line 8:
    await request.json()
It is not that line that hangs; it is the following code that never gets to run:
    @app.post("/test")  # execution never reaches this handler
    async def test(test: dict) -> dict:
        return {"data": "test"}
When FastAPI runs your test function, it tries to fill the parameters from the request body, but you have already read the body out of the stream in the middleware, so the later read blocks forever. You can see this in the FastAPI code that reads the body.
In my case I actually want to read the JSON in the middleware (for HMAC validation) before forwarding it to the endpoint. Is this possible?
@csrgxtu I need to read the body as JSON in several middlewares in order to process the data for various reasons (logging, security, sanitizing, etc.), so I wonder what the correct implementation would be to avoid hanging the application?
In case you want to do this, you need to iterate request.stream().
I would rather suggest writing a pure ASGI middleware instead.
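For reference, a pure ASGI middleware along those lines could be sketched as follows (the class name and the placeholder comment are illustrative, not from this thread). It drains the body from the receive channel once, then hands the downstream app a replacement receive that replays the cached body:

```python
class CachedBodyMiddleware:
    """Pure ASGI middleware: read the body once, then replay it downstream."""

    def __init__(self, app):
        self.app = app  # the wrapped ASGI application

    async def __call__(self, scope, receive, send):
        if scope["type"] != "http":
            # Pass through lifespan/websocket traffic untouched.
            await self.app(scope, receive, send)
            return

        # Drain all body chunks from the original receive channel.
        body = b""
        more_body = True
        while more_body:
            message = await receive()
            body += message.get("body", b"")
            more_body = message.get("more_body", False)

        # ... inspect `body` here (logging, HMAC validation, etc.) ...

        async def replay_receive():
            # Hand the full cached body to the downstream app in one message.
            return {"type": "http.request", "body": body, "more_body": False}

        await self.app(scope, replay_receive, send)
```

Registered with app.add_middleware(CachedBodyMiddleware), the endpoint's own request.json() still sees the full body, because the middleware replays rather than consumes it.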
@araujo88 You can refer to this issue: https://github.com/tiangolo/fastapi/issues/394
In summary, it is a bug or design issue in Starlette, and there is a workaround:
    from typing import Mapping

    from starlette.requests import Request

    from fastapi import FastAPI, APIRouter, Depends

    app = FastAPI()
    api_router = APIRouter()

    @api_router.post("/")
    def read_root(arg: Mapping[str, str]):
        return {"Hello": "World"}

    async def auth_middleware(request: Request):
        # you can implement your auth here
        print(await request.json())

    # the trick here is including auth_middleware in the dependencies:
    app.include_router(api_router, dependencies=[Depends(auth_middleware)])
This way the middleware logic runs on every request; we use it the same way in our production environment and it works fine.
Note: dependencies will inject your auth logic into each request. For more detail see: https://fastapi.tiangolo.com/tutorial/dependencies/
Is it possible to stream a request multiple times in FastAPI using self.stream()? I tried various approaches, like copying the Request object, but none of them worked.
@araujo88 It can't be done with self.stream(); per request, self.stream() can only be consumed once.
Otherwise you will hang forever on await self.event_message.wait().
You can see it here: uvicorn/protocols/http/h11_impl.py#L543
This is because once self.stream() is done, nothing notifies self.event_message again.
This asyncio.Event is only set by four actions:
- when the connection is lost
- when body data is received
- when the body has been fully read
- when writing the response data is done
And you can try this:
    async for chunk in request.stream():
        print(chunk)
    await request.json()
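The hang on a second read comes down to waiting on an asyncio.Event that is never set again. A minimal illustration of that failure mode, with a timeout added so it terminates (the event and strings here are illustrative, not uvicorn's actual objects):

```python
import asyncio


async def main() -> str:
    event = asyncio.Event()
    try:
        # Like a second read of the consumed stream: nothing will ever
        # set() this event, so without a timeout this would wait forever.
        await asyncio.wait_for(event.wait(), timeout=0.1)
        return "event set"
    except asyncio.TimeoutError:
        return "timed out waiting, as a second body read does"


print(asyncio.run(main()))
```

Uvicorn's wait has no timeout, which is why the request hangs indefinitely instead of erroring out.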
@XingDongZhe I'm trying to process the request body through different middlewares for different purposes (such as logging, sanitizing, etc.). However, once request.stream() has been consumed by the first middleware, the next middleware waits forever for the request, due to the way request.stream() is implemented. What would be the correct approach for this? I've seen workarounds such as implementing a new middleware class instead of using BaseHTTPMiddleware, but that seems too far off to me.
If you want to keep using BaseHTTPMiddleware, I would suggest creating a new Request object at the end of your middleware to avoid such indefinite waits.
@araujo88 Yes, you can work around it by overriding the receive parameter (or function) to cache the request.stream() data.
There is example code in an issue, but I can't find it now. You can look for it.