api.webmaker.org
Rate Limiting
We should apply rate limits to most API routes to prevent abuse and spam. I'm not sure what kind of rate limiting tools are available for Hapi, but they're sure to be out there.
Not sure if this helps, but here is the token-bucket-based throttle I worked on for a few other projects: https://github.com/thisandagain/micron-throttle
Allows you to specify any arbitrary tokenTable, so connecting to Redis, Memcached, etc. is trivial.
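For context, the core idea behind a token-bucket throttle like micron-throttle can be sketched in a few lines of Node. This is a simplified in-memory illustration, not micron-throttle's actual API; the `TokenBucket` class, `tryRemove`, and `isAllowed` names are made up here.

```js
// Minimal in-memory token bucket: each key (IP or user ID) gets a bucket
// that refills at a steady rate and is drained by one token per request.
class TokenBucket {
  constructor(capacity, refillPerSecond) {
    this.capacity = capacity;          // max tokens the bucket can hold
    this.tokens = capacity;            // start full
    this.refillPerSecond = refillPerSecond;
    this.lastRefill = Date.now();
  }

  tryRemove(count = 1) {
    // Refill based on elapsed time, capped at capacity.
    const now = Date.now();
    const elapsedSeconds = (now - this.lastRefill) / 1000;
    this.tokens = Math.min(this.capacity, this.tokens + elapsedSeconds * this.refillPerSecond);
    this.lastRefill = now;

    if (this.tokens >= count) {
      this.tokens -= count;
      return true;                     // request allowed
    }
    return false;                      // over the limit
  }
}

// One bucket per key; a shared store (Redis, Memcached) would replace this map
// when running more than one process.
const buckets = new Map();

function isAllowed(key, limit, periodMinutes) {
  if (!buckets.has(key)) {
    // Refill rate chosen so the full limit becomes available over the period.
    buckets.set(key, new TokenBucket(limit, limit / (periodMinutes * 60)));
  }
  return buckets.get(key).tryRemove();
}
```

A route-level check could then call something like `isAllowed(request.info.remoteAddress, 2500, 15)` for an IP-keyed route.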
That looks useful, but it doesn't appear to work with Hapi
@thisandagain, @jbuck do these seem like reasonable limits? (I'm picking these arbitrarily)
Method | Path | Limit | Period (min) | Key On
---|---|---|---|---
GET | /discover | 2500 | 15 | IP |
GET | /projects | 2500 | 15 | IP |
GET | /users/{user} | 25 | 15 | User ID |
PATCH | /users/{user} | 25 | 15 | User ID |
DELETE | /users/{user} | 25 | 15 | User ID |
GET | /users/{user}/projects | 2500 | 15 | IP |
POST | /users/{user}/projects | 25 | 15 | User ID |
GET | /users/{user}/projects/{project} | 2500 | 15 | IP |
PATCH | /users/{user}/projects/{project} | 100 | 15 | User ID |
DELETE | /users/{user}/projects/{project} | 25 | 15 | User ID |
PATCH | /users/{user}/projects/{project}/feature | 100 | 15 | User ID |
GET | /users/{user}/projects/{project}/pages | 1000 | 15 | IP |
POST | /users/{user}/projects/{project}/pages | 100 | 15 | User ID |
GET | /users/{user}/projects/{project}/pages/{page} | 1000 | 15 | IP |
PATCH | /users/{user}/projects/{project}/pages/{page} | 100 | 15 | User ID |
DELETE | /users/{user}/projects/{project}/pages/{page} | 100 | 15 | User ID |
GET | /users/{user}/projects/{project}/pages/{page}/elements | 1000 | 15 | IP |
POST | /users/{user}/projects/{project}/pages/{page}/elements | 250 | 15 | User ID |
GET | /users/{user}/projects/{project}/pages/{page}/elements/{element} | 1000 | 15 | IP |
PATCH | /users/{user}/projects/{project}/pages/{page}/elements/{element} | 250 | 15 | User ID |
DELETE | /users/{user}/projects/{project}/pages/{page}/elements/{element} | 250 | 15 | User ID |
GET | /users/{user}/projects/{project}/remixes | 1000 | 15 | IP |
POST | /users/{user}/projects/{project}/remixes | 25 | 15 | User ID |
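To make the table concrete, here is one way a couple of those rows might be expressed as per-route configuration in Hapi. The `rateLimit` plugin namespace and its option names are hypothetical; Hapi simply exposes whatever is placed under a route's `plugins` settings to server extensions via `request.route.settings.plugins`.

```js
'use strict';

const Hapi = require('@hapi/hapi');   // modern @hapi/hapi; older Hapi versions used `config` instead of `options`

const server = Hapi.server({ port: 3000 });

// Per-route settings mirroring two rows of the table above. A rate-limit
// extension could read these via request.route.settings.plugins.rateLimit.
// The `rateLimit` namespace and its option names are made up for illustration.
server.route({
  method: 'GET',
  path: '/discover',
  options: {
    plugins: { rateLimit: { limit: 2500, periodMinutes: 15, keyOn: 'ip' } }
  },
  handler: () => 'ok'   // placeholder handler
});

server.route({
  method: 'POST',
  path: '/users/{user}/projects',
  options: {
    plugins: { rateLimit: { limit: 25, periodMinutes: 15, keyOn: 'user' } }
  },
  handler: () => 'ok'   // placeholder handler
});
```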
That seems kinda complicated. Could we simplify it by just setting it to x requests over a 60-minute period for all of them?
I don't think it should be the same for all of them. Some routes should allow for more (pages, elements), some less (projects), and non-authed routes need to allow for lots from one IP (/discover, /projects).
I also don't think it's complicated; it's just configuration.
hey! i'm taking over this. i think i agree with both @jbuck and @cadecairos. what i'd like is to create a small, medium, and large limit, and just assign each endpoint as small, medium, or large. this will be a temporary config until we have data to really decide what the numbers should be. thoughts?
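A minimal sketch of what that tiered approach might look like, with placeholder numbers and a hypothetical `tier` setting; the point is just that each route names a tier and one shared table holds the actual limits.

```js
// Three named tiers instead of per-route numbers; every route picks one.
// The numbers here are placeholders until there is real traffic data.
const RATE_LIMIT_TIERS = {
  small:  { limit: 25,   periodMinutes: 15 },   // destructive / low-volume writes
  medium: { limit: 250,  periodMinutes: 15 },   // page and element writes
  large:  { limit: 2500, periodMinutes: 15 }    // public, IP-keyed reads
};

// A route would then only name its tier, e.g. in its plugin settings:
//   options: { plugins: { rateLimit: { tier: 'large', keyOn: 'ip' } } }
// and a single extension resolves the tier to concrete numbers:
function limitsForRoute(routeRateLimitSettings) {
  const tier = RATE_LIMIT_TIERS[routeRateLimitSettings.tier] || RATE_LIMIT_TIERS.medium;
  return { ...tier, keyOn: routeRateLimitSettings.keyOn || 'ip' };
}
```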
@ashleygwilliams sounds good to me!
:+1: