slowapi
Return a custom JSON-like object when rate limit is hit
I know it is possible to specify the error message in the limit decorator, but I'm building a fairly complex API and all of my responses need to be consistent and follow a specific schema, which I represent with a `pydantic.BaseModel` subclass (basically a dataclass on steroids). Is it possible to return such a custom object instead of the default `starlette.responses.JSONResponse`?
My response looks like this (once converted to a dictionary):

```python
{"request_id": "foobarbaz", "status_code": 416, "response_data": None, "error_message": "Unsupported media type"}
```
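Declared as a pydantic model, that schema might look like the following (a sketch: the field names are taken from the dictionary above, but the model name `APIResponse` is made up):

```python
from typing import Any, Optional

from pydantic import BaseModel


class APIResponse(BaseModel):
    """Hypothetical envelope matching the dictionary shown above."""

    request_id: str
    status_code: int
    response_data: Optional[Any] = None
    error_message: Optional[str] = None
```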
@nocturn9x I just realised that I've left this issue abandoned for months, sorry. Did you ever figure out a solution to your problem? Is it still relevant?
As a first thought, I imagine you could subclass `Limiter` and override the `_inject_headers` method to do what you need. I'm not sure there's much value in doing this in slowapi itself, as I suspect it's a fairly uncommon use case, but I might be wrong :)
Hi. Yeah, this issue is still relevant. Could you illustrate a way of customizing that method?
Have you tried something like this? (adapting the FastAPI example from the docs)
```python
from fastapi import FastAPI, Request
from fastapi.responses import PlainTextResponse
from slowapi import Limiter, _rate_limit_exceeded_handler
from slowapi.util import get_remote_address
from slowapi.errors import RateLimitExceeded


class LimiterWithCustomResponse(Limiter):
    def _inject_headers(self, response, current_limit):
        # your custom code here, overriding the existing method;
        # return your custom response model instead of the plain response
        ...


limiter = LimiterWithCustomResponse(key_func=get_remote_address)
app = FastAPI()
app.state.limiter = limiter
app.add_exception_handler(RateLimitExceeded, _rate_limit_exceeded_handler)


@app.get("/home")
@limiter.limit("5/minute")
async def homepage(request: Request):
    return PlainTextResponse("test")


@app.get("/mars")
@limiter.limit("5/minute")
async def mars(request: Request):
    return {"key": "value"}
```
Let me know if this helps.
@nocturn9x Can we close this issue?
Yup, sorry if I never replied back. Thanks for the solution, btw!