ratelimit
Setting same ratelimit across different API calls
Hi Tom,
This package has been pretty helpful.
I am writing Python code where I call different endpoints in different methods. Is there a way I could enforce the same total API call limit across methods?
Example: Total rate limit: 200/min
API Call 1 --> Method 1: this used up 150/min
API Call 2 --> Method 2
I want to restrict API Call 2 to 50/min in order to avoid exceeding the overall limit.
@tstrinadhreddy I recently came across this issue while working on a project. This is a major concern for me, as there's no way for two methods to point to the same restriction. However, I thought of a few workarounds, listed below.
Method 1: You could simply abstract the actual API call into a single method and apply the rate limit there; any method that needs to call the API then has to go through that rate-limited function.
Example:
Instead of this:
def func1():
    # API Call 1 goes here
    ...

def func2():
    # API Call 2 goes here
    ...
try to abstract out the API call like this:
import requests
from ratelimit import limits

class LimitedRequest:
    @limits(calls=200, period=60)
    def get(self, url: str, params=None, **kwargs):
        return requests.get(url, params=params, **kwargs)

    # Other HTTP methods as required
This could be abstracted further to include all the HTTP methods required by your API. It didn't work out well for me, though, because I also wanted persistence across server restarts while working with certain strict third-party APIs, which isn't possible here.
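For illustration, a minimal sketch of how two functions might then share the single limiter (the shared instance and the example URLs are mine, not from the package):

api = LimitedRequest()

def func1():
    # Goes through the decorated get(), so it draws from the same
    # 200-calls-per-minute budget as func2.
    return api.get("https://api.example.com/endpoint1")

def func2():
    return api.get("https://api.example.com/endpoint2")

Because @limits wraps LimitedRequest.get once at class definition time, every instance and every caller shares the same counter.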
Method 2: @deckar-01 has a fork of this library, as mentioned in Issue #31 here, which implements persistence. It would let you keep the two methods separate, like this, while sharing the same database:
from ratelimit import limits  # provided by the deckar01-ratelimit fork

@limits(calls=200, period=60, storage='api_site_name.db')
def func1():
    # API Call 1 goes here
    ...

@limits(calls=200, period=60, storage='api_site_name.db')
def func2():
    # API Call 2 goes here
    ...
This worked like a charm but has some issues when using multiple threads, due to limitations of SQLite, as reported here.
Method 3: As a workaround to Method 2's issues, I've implemented the same idea with Redis instead of SQLite. It was written with Django in mind but can be adapted to any Python application by passing in a Redis cache instance or configuring one directly.
Here's the gist
Example:
@limits(calls=200, period=60, name='api_site_name')
def func1():
    # API Call 1 goes here
    ...

@limits(calls=200, period=60, name='api_site_name')
def func2():
    # API Call 2 goes here
    ...

@limits(calls=100, period=60, name='another_api_site_name')
def func3():
    # API Call 3 goes here
    ...
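The gist itself isn't reproduced here, but roughly, a shared Redis-backed decorator keyed by name could be sketched like this (the decorator name redis_limits, the fixed-window strategy, and the local Redis connection are my assumptions for illustration, not the gist's exact code):

import functools
import time

import redis

# Assumed: a Redis server reachable on localhost.
_redis = redis.Redis(host="localhost", port=6379, db=0)

class RateLimitExceeded(Exception):
    pass

def redis_limits(calls: int, period: int, name: str):
    # Every function decorated with the same `name` shares one counter,
    # stored in Redis, so the limit spans processes and survives restarts.
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            window = int(time.time() // period)  # current fixed window
            key = f"ratelimit:{name}:{window}"
            count = _redis.incr(key)             # atomic increment
            if count == 1:
                _redis.expire(key, period)       # expire along with the window
            if count > calls:
                raise RateLimitExceeded(f"{name}: more than {calls} calls in {period}s")
            return func(*args, **kwargs)
        return wrapper
    return decorator

With something like this, func1 and func2 above would both be decorated with the same name and therefore share one Redis-backed counter.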
Another (possibly simpler) method for sharing a rate limit between multiple functions or methods might look like this:
from ratelimit import limits

@limits(calls=200, period=60)
def api_limit():
    pass

def func1():
    api_limit()
    # API Call 1 goes here

def func2():
    api_limit()
    # API Call 2 goes here
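One detail worth noting: with the plain @limits decorator, exceeding the limit raises ratelimit's RateLimitException rather than waiting. If you'd rather have callers block until the window resets, the shared gate can be combined with the package's sleep_and_retry decorator, roughly like this:

from ratelimit import limits, sleep_and_retry

@sleep_and_retry
@limits(calls=200, period=60)
def api_limit():
    # When the limit is hit, sleep_and_retry sleeps until the current
    # period ends and then lets the caller proceed.
    pass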
@irgeek
The code I tested is the following:
from ratelimit import limits

@limits(calls=4000, period=60)
def api_limit():
    pass

def func1():
    # Imported and called from REPL1
    count = 0
    while True:
        api_limit()
        print(count)
        count += 1

def func2():
    # Imported and called from REPL2
    count = 0
    while True:
        api_limit()
        print(count)
        count += 1
If two different REPLs call these functions simultaneously (func1() and func2()), each will exhaust the full calls/minute budget on its own, because the counter lives in each process's memory and is not shared between them.
On the other hand, deckar01-ratelimit works more slowly but behaves as expected; its concurrency issues were solved as of release 3.0.1.
Use case:
@limits(calls=4000, period=60, storage="limit.db")
def api_limit():
    pass

def func1():
    ...
Nonetheless, there are actively maintained libraries that one could use instead, such as PyrateLimiter and iolimiter.