django-redis
Support for MAX_ENTRIES request
I am using Redis for caching in my multi-tenant system. To test MAX_ENTRIES, I set 'MAX_ENTRIES': 2 in both filebased.FileBasedCache and cache.RedisCache. The file-based backend works fine (it doesn't save more than 2 items, as expected), but Redis saves all the data and deletes nothing after the limit is reached.
Hello @rayvikram, correct me if I am wrong,
I had a look at the documentation about this feature, and I am quite sure it is not supported by django-redis.
As I understand it, there is a misleading claim in the documentation, because reading the code it seems to me that it does not delete the oldest entry but a random one here.
Are you sure filebased.FileBasedCache deletes the oldest entry?
There are many ways to count the entries in a Redis database, but knowing which one is the oldest is not easy.
Do you have any idea how to implement it without incurring a huge performance degradation and without creating a complex writing strategy?
I correct myself: it can be done if a timeout is set, because you could look for the smallest TTL. But what if there is no timeout? Is deleting a random entry acceptable? I do not think so.
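To illustrate the smallest-TTL idea: if every entry is written with the same timeout, the key with the least remaining TTL is the oldest one. The helper below is a pure-Python sketch with a made-up name; in practice the ttls mapping would have to be gathered from Redis by iterating keys with SCAN and calling TTL on each, which is itself an expensive full-keyspace walk.

```python
def oldest_key_by_ttl(ttls):
    """Given a mapping of cache key -> remaining TTL in seconds,
    return the key closest to expiry. If every entry was stored
    with the same timeout, this is the oldest entry."""
    # In Redis, TTL returns -1 for keys with no expiry; skip those,
    # since their age cannot be inferred from a TTL.
    live = {k: t for k, t in ttls.items() if t >= 0}
    if not live:
        return None
    return min(live, key=live.get)

# Example: three keys written with a 300 s timeout at different times.
print(oldest_key_by_ttl({"a": 120, "b": 45, "c": 290}))  # -> b
```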
It seems to me that, even when there is a timeout, there is no logic to delete the oldest entry: the DB-based cache deletes in order of cache_key. here and here
I think deleting a random entry can be acceptable when no timeout is set.
I need to read this first, to see what happens when no expiration is set. It might be a solution that avoids adding a lot of logic for this feature.
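For context, the Redis-side option referred to above needs no Django code at all: cap the server's memory and let Redis evict approximate-LRU keys itself. A plain redis.conf sketch (the 256mb value is just an example); note this caps memory usage, not an exact entry count, so it is not a literal MAX_ENTRIES:

```conf
# Cap the dataset size and let Redis evict keys on its own.
maxmemory 256mb
# allkeys-lru may evict any key; volatile-lru only evicts keys that have a TTL.
maxmemory-policy allkeys-lru
```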
FileBasedCache
Yes, it deletes a random cache item. But as per the Django docs, these caches use a least-recently-used (LRU) culling strategy, so I was expecting that behavior. Whether it deletes the LRU item or the oldest one, my main challenge is to limit the maximum number of items in the cache. I have verified that MAX_ENTRIES is not working with Redis.
Yes, I have read the docs again; there is no claim that MAX_ENTRIES is supported.
I did not test this, but maybe it can help someone make a PR.
- Create a custom cache class that inherits from cache.RedisCache and overrides the add() and set() methods to implement the MAX_ENTRIES logic.
- In the custom cache class, use an lru list to store the cache keys in the order they were added. This will help you identify the oldest cache key to delete when the limit is reached.
Here's an example implementation:
```python
from django.core.cache.backends import redis as cache

class RedisCacheWithMaxEntries(cache.RedisCache):
    def __init__(self, server, params):
        super().__init__(server, params)
        self.lru = []
        # MAX_ENTRIES is configured under OPTIONS in CACHES (see below),
        # so read it from there rather than from the top-level params.
        self.max_entries = params.get('OPTIONS', {}).get('MAX_ENTRIES', 2)

    def add(self, key, value, timeout=cache.DEFAULT_TIMEOUT, version=None):
        result = super().add(key, value, timeout, version)
        if result:
            self.lru.append(key)
            self._enforce_max_entries()
        return result

    def set(self, key, value, timeout=cache.DEFAULT_TIMEOUT, version=None):
        super().set(key, value, timeout, version)
        if key not in self.lru:
            self.lru.append(key)
        self._enforce_max_entries()

    def _enforce_max_entries(self):
        while len(self.lru) > self.max_entries:
            oldest_key = self.lru.pop(0)
            self.delete(oldest_key)
```
To use this custom cache class in your Django project, update the CACHES setting in your project's settings.py file:
```python
CACHES = {
    'default': {
        'BACKEND': 'path.to.your.RedisCacheWithMaxEntries',
        'LOCATION': 'redis://127.0.0.1:6379/1',
        'OPTIONS': {
            'MAX_ENTRIES': 500,  # Set your desired max entries limit here
            'CLIENT_CLASS': 'django_redis.client.DefaultClient',
        },
    },
}
```
Replace 'path.to.your.RedisCacheWithMaxEntries' with the actual path to the custom cache class in your project.
This implementation will ensure that the number of cache entries does not exceed the specified MAX_ENTRIES limit. When the limit is reached, the oldest cache entry will be removed to make room for the new entry.
Hello @some1ataplace,
I'm sorry, but your implementation would not work with multiple web servers/processes: the lru list lives in the memory of a single process. A somewhat more complex writing strategy, using Redis directly, is needed to ensure only N keys are written to Redis.
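One possible direction for such a writing strategy: keep a Redis sorted set, scored by insertion time, alongside the data, and trim it on every write. The sketch below models the sorted set with an in-process dict purely so it is runnable here; in Redis each step maps to a real command (ZADD, ZCARD, ZPOPMIN, DEL), and the whole set() body would have to run inside a Lua script or a MULTI/EXEC block to stay race-free across processes. All names are made up for illustration.

```python
import itertools

class CappedIndex:
    """In-process stand-in for a Redis sorted set tracking insertion order.
    score = a monotonically increasing counter (in Redis: a timestamp)."""

    def __init__(self, max_entries):
        self.max_entries = max_entries
        self.scores = {}  # key -> score   (Redis: a sorted set, via ZADD)
        self.data = {}    # key -> value   (Redis: the cached values, via SET)
        self._clock = itertools.count()

    def set(self, key, value):
        # In Redis, everything below belongs in one Lua script / MULTI block.
        self.data[key] = value
        self.scores[key] = next(self._clock)                 # ZADD
        while len(self.scores) > self.max_entries:           # ZCARD
            oldest = min(self.scores, key=self.scores.get)   # ZPOPMIN
            del self.scores[oldest]
            self.data.pop(oldest, None)                      # DEL

cache = CappedIndex(max_entries=2)
for k in ("a", "b", "c"):
    cache.set(k, k.upper())
print(sorted(cache.data))  # -> ['b', 'c']
```

Because the order lives in Redis rather than in each worker's memory, every process trims against the same shared index, which is what the in-process lru list cannot provide.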
Please leave the default value of MAX_ENTRIES as None if you are going to change this.
I am facing this issue: MAX_ENTRIES is not working for the Redis cache, but it works for the file-based cache and memcached. Is there any solution?
There is no plan to support this feature. If anyone is interested in proposing a solution I'm all ears, but I don't think it's worth the effort, which is why I don't plan on supporting it anytime soon.