delay excessive requests instead of serving 429s
Chess.com makes a very attractive promise that not only reduces the amount of client code needed, but also seems smoother in practice.
https://www.chess.com/news/view/published-data-api#pubapi-general
"If you send an API request after the results of the old API request arrived, you will NEVER encounter rate limits"
Unfortunately I don't think that's a realistic goal for our APIs, even if we artificially hold back responses.
That could easily be achieved, I think, by changing the nginx rate-limit strategy from reject to delay, i.e. removing the nodelay option.
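For context, the nginx side of that change would look roughly like this. This is a hypothetical sketch, not the actual Lichess configuration: the zone name, rate, and burst values are all invented.

```
# Hypothetical sketch: zone name, rate and burst are made up.
# limit_req_zone belongs in the http {} context.
limit_req_zone $binary_remote_addr zone=api:10m rate=10r/s;

server {
    location /api/ {
        # No "nodelay" here: requests above 10r/s are queued (up to 20)
        # and delayed to match the rate, instead of rejected immediately.
        limit_req zone=api burst=20;

        # If the burst queue overflows, answer 429 rather than the
        # default 503.
        limit_req_status 429;
    }
}
```

With nodelay present, excess requests within the burst are forwarded immediately and anything beyond the burst is rejected; omitting it is what turns rejection into delay.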
That would work, but plenty of the stricter rate limits are in lila.
I'm using a Discord bot. If I could switch to a queue that doesn't even have to handle 429s, I would love to.
If possible, it would be nice to have an authentication mode that makes an exchange: "You don't get 429s, and in exchange you cannot send two API requests at once; after you receive the response to your previous request, you may send another."
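On the client side, the "one request at a time" discipline described above can be sketched as a tiny wrapper that serializes requests. This is a hypothetical example: SequentialClient and the injected send callable are made-up names, not part of any Lichess or Chess.com client library.

```python
import threading


class SequentialClient:
    """Client-side queue allowing at most one in-flight request.

    The next request is only sent after the previous response has
    arrived, matching the exchange described above.
    """

    def __init__(self, send):
        # `send` is any callable that performs the actual HTTP request
        # (e.g. requests.get); it is injected so the queue stays generic.
        self._send = send
        self._lock = threading.Lock()

    def request(self, *args, **kwargs):
        # The lock serializes requests: a second call blocks until the
        # first call's response has been returned to its caller.
        with self._lock:
            return self._send(*args, **kwargs)
```

Used from multiple bot tasks or threads, every request waits its turn, so the server never sees two concurrent requests from this client.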
Can you do this, i.e. change the nginx rate-limit strategy from reject to delay?
It's not that simple, in fact, because most of the rate limiting is more complex and handled at the lila level.