
Slowing down responses instead of returning 429

Open · Beebeeoii opened this issue 4 years ago · 5 comments

🚀 Feature Proposal

Have an option to slow down responses instead of returning 429 errors.

Motivation

Having the option to slow down responses would deter scraping of data from the server (scraping puts additional load on it), while still serving responses in cases where the user may not actually be scraping, but where many machines happen to be browsing the site from behind the same IP.

Example

express-slow-down
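
For reference, express-slow-down is typically registered as Express middleware along these lines (a rough sketch based on its README; option names and exact delay semantics may differ between versions):

```js
const express = require("express");
const slowDown = require("express-slow-down");

const app = express();

// Serve the first 100 requests per 15-minute window at full speed,
// then delay each further request instead of rejecting it with 429.
const speedLimiter = slowDown({
  windowMs: 15 * 60 * 1000, // 15-minute window
  delayAfter: 100,          // allow 100 requests per window without delay
  delayMs: 500              // add 500 ms of delay per request above the limit
});

app.use(speedLimiter);

app.get("/", (req, res) => {
  res.send("ok");
});

app.listen(3000);
```

The request in this issue is essentially the same behaviour, exposed as an option of the Fastify plugin.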

Beebeeoii · Jul 18 '21 08:07

I think we can have this as an option. Would you like to send a Pull Request to address this issue? Remember to add unit tests.

mcollina · Jul 18 '21 09:07

> I think we can have this as an option. Would you like to send a Pull Request to address this issue? Remember to add unit tests.

How would you do this? Having a sleep as a hook? We should also monitor the size of the sleeping queue, otherwise we will face too many stalled requests. Don't you think so?

zekth · Jul 18 '21 09:07
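
A minimal sketch of the "sleep as a hook" idea under discussion, assuming an onRequest hook plus a simple counter that caps how many requests may be stalled at once; every name and number here is hypothetical and not part of fastify-rate-limit:

```js
const fastify = require("fastify")();

const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

// Hypothetical tuning knobs: how long to stall an over-limit request,
// and how many stalled requests to tolerate before shedding load.
const DELAY_MS = 1000;
const MAX_STALLED = 100;

let stalled = 0;

// Placeholder: a real plugin would consult the same store (memory,
// Redis, ...) that fastify-rate-limit uses to count hits per key.
function isOverLimit(request) {
  return false;
}

fastify.addHook("onRequest", async (request, reply) => {
  if (!isOverLimit(request)) return;

  if (stalled >= MAX_STALLED) {
    // The sleeping queue is full: fall back to a plain 429 instead of
    // letting stalled requests pile up without bound.
    return reply.code(429).send({ error: "Too Many Requests" });
  }

  stalled++;
  try {
    await sleep(DELAY_MS); // stall the response instead of rejecting it
  } finally {
    stalled--;
  }
});

fastify.get("/", async () => ({ hello: "world" }));

fastify.listen({ port: 3000 });
```

The counter is a crude version of the "size of the sleeping queue" concern raised above: once it is exceeded, the hook degrades back to the existing 429 behaviour.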

> I think we can have this as an option. Would you like to send a Pull Request to address this issue? Remember to add unit tests.
>
> How would you do this? Having a sleep as a hook? We should also monitor the size of the sleeping queue, otherwise we will face too many stalled requests. Don't you think so?

Exactly all of that. It's not "easy", nor a solution that I would implement in my own servers; however, it's something that somebody might want to add.

In other words: it is useful if you want to rate limit for business purposes, but not for protecting against attacks.

mcollina · Jul 18 '21 09:07

> I think we can have this as an option. Would you like to send a Pull Request to address this issue? Remember to add unit tests.
>
> How would you do this? Having a sleep as a hook? We should also monitor the size of the sleeping queue, otherwise we will face too many stalled requests. Don't you think so?

Yup, agreed that there are some complexities involved in this implementation. Will see what I can come up with, as this is a feature that I believe will help many.

Beebeeoii · Jul 18 '21 10:07

This feature assumes that the client calls the server sequentially. Moreover, the client must be configured not to raise a timeout on its side, and the network components in the middle (proxies, firewalls, etc.) must be configured not to cut the stalled HTTP connection and raise an error (by default nginx has a 60-second timeout). This requires comprehensive documentation to avoid issues around these topics.

Eomm · Jul 18 '21 15:07
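
For illustration only, this is roughly what a client that tolerates a deliberately stalled response looks like with Node's built-in http module; the 120-second figure is an arbitrary assumption chosen to sit above both the server-side delay and nginx's default 60-second proxy read timeout, which would otherwise have to be raised on the proxy as well:

```js
const http = require("node:http");

// Raise the client's inactivity timeout so a response that the server
// stalls on purpose is not aborted locally before it arrives.
const req = http.get(
  { host: "localhost", port: 3000, path: "/", timeout: 120_000 },
  (res) => {
    let body = "";
    res.on("data", (chunk) => (body += chunk));
    res.on("end", () => console.log(res.statusCode, body));
  }
);

// Node only emits 'timeout'; it is up to the application to abort.
req.on("timeout", () => {
  console.warn("no activity for 120 s, aborting");
  req.destroy();
});

req.on("error", (err) => console.error(err.message));
```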

There are two approaches: one is to implement the solution in this package, and the second is to create a new package just for this improvement. I am working on implementing a new package.

ghost · Sep 09 '22 07:09

Hi, thanks @CristiTeo for looking into this!

Beebeeoii · Sep 10 '22 01:09

I made the first version of a Fastify plugin to slow down responses. You can check it here.

ghost · Sep 20 '22 16:09