Feature Request: Create a clear function
Your module is almost perfect for my needs. I'm using Celery with many daemons and workers, and I needed a centralized cache to replace the memoise pattern. This is working really well. However, I do need the ability to target specific cache deletes, so that an entry gets refreshed the next time a worker calls the function, without having to restart workers or flush all cached entries.
I did something that of course feels a little dirty, but I can't come up with a better way. Basically, I add a function to the wrapper function in __call__. This way I can target specific keys to be deleted from the cache. Maybe you have a better way?
I would also really like to figure out a way to delete keys by some kind of grouping (for example, all entries for a specific function), but I'm a bit torn on the implementation, as I can imagine some issues once there are a lot of cached entries.
Do you see any issues with this approach? Is this a feature you'd like to see added?
from memorised import decorators

class my_memorise(decorators.memorise):
    def __init__(self, *args, **kwargs):
        super(my_memorise, self).__init__(*args, **kwargs)

    def __call__(self, fn):
        self.func = fn
        wrapped_func = super(my_memorise, self).__call__(fn)
        # Expose targeted deletion on the wrapper itself.
        wrapped_func.clear = self.clear
        return wrapped_func

    def clear(self, *args, **kwargs):
        # Rebuild the cache key from the call arguments and delete that entry.
        key = self.key(self.func, args, kwargs)
        self.mc.delete(key)
@my_memorise()
def a_test(p):
    print 'Executed'
    return p + 1
>>> a_test(1)
Executed
2
>>> a_test(1)
2
>>> a_test.clear(1)
>>> a_test(1)
Executed
2
>>> a_test.clear(1)
>>> a_test.clear(1)
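On the grouping question, one direction that avoids enumerating keys (memcached has no way to list them) is the usual namespace-versioning trick: mix a per-function generation counter into every cache key, and bump the counter to invalidate the whole group at once. The sketch below is only an illustration of the idea, not anything memorised provides; the helper names are hypothetical, and the decorator would still have to be changed to include the generation in the keys it builds.

import memcache

mc = memcache.Client(['127.0.0.1:11211'])

def generation_key(fn):
    # One counter per function, e.g. "mymodule.a_test.generation".
    return '%s.%s.generation' % (fn.__module__, fn.__name__)

def current_generation(fn):
    gen = mc.get(generation_key(fn))
    if gen is None:
        # add() does nothing if another worker created the counter first.
        mc.add(generation_key(fn), 1)
        gen = mc.get(generation_key(fn)) or 1
    return gen

def clear_group(fn):
    # Orphan every cached entry for fn by moving to the next generation;
    # memcached evicts the stale entries when they expire or get pushed out.
    if mc.incr(generation_key(fn)) is None:
        mc.add(generation_key(fn), 1)

With something like this, old entries are never deleted explicitly; they simply become unreachable once the generation moves on, which sidesteps the "lots of cached entries" worry at the cost of letting stale values age out on their own.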
Hi, I'm glad you like the module.
I think that's definitely something that would be handy in memorised itself, but you'd probably want to avoid conflicting with any existing clear method on the wrapped object, just in case. So either use something less likely to conflict, like _memorised_clear(), or go the utility route:
from memorised.utils import clear
clear(a_test)

# or:
import memorised.utils
memorised.utils.clear(a_test)

# or:
from memorised.utils import clear as mclear
mclear(a_test)
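Just to sketch what that utility could look like (nothing below exists in memorised today, and the double-underscore attribute names are made up for illustration): __call__ would leave a back-reference to the decorator instance on the wrapper it returns, and the free function would use that to rebuild and delete the key, much like the clear method in the example above.

from memorised.decorators import memorise

class memorise_with_backref(memorise):
    def __call__(self, fn):
        wrapped = super(memorise_with_backref, self).__call__(fn)
        wrapped.__memorised__ = self        # decorator instance
        wrapped.__memorised_func__ = fn     # original function, needed to build the key
        return wrapped

def clear(wrapped, *args, **kwargs):
    # Delete the cache entry that wrapped(*args, **kwargs) would have hit.
    dec = wrapped.__memorised__
    key = dec.key(wrapped.__memorised_func__, args, kwargs)
    dec.mc.delete(key)

Since the key depends on the call arguments, clearing the entry for a_test(1) would be clear(a_test, 1).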
I don't have an awful lot of time to work on memorised these days, but if I get some spare time I could look at writing this in, I would also be perfectly happy to accept a pull request to add the feature. :smiley:
I like the utility route better. I didn't go that route because I was having a hard time figuring out how to access the memorise class to call one of its methods from the outside, since the wrapper is a plain function. Seems like I'm always learning something when it comes to Python metaprogramming.
I'll keep looking into how to do it and if I can figure it out I'll send a pull request.
Thanks for the response!
I think this feature is important when working with @property and @setter. The problem is that if I memorise the getter's result, I have to clear or invalidate the cached value in the setter. I worked out an example to illustrate how to do this, based on @mverrilli's suggestion. But I am still wondering how the suggested utility could be implemented.
import memcache
from memorised.decorators import memorise

class my_memorise(memorise):
    def __init__(self, *args, **kwargs):
        super(my_memorise, self).__init__(*args, **kwargs)

    def __call__(self, fn):
        self.func = fn
        wrapped_func = super(my_memorise, self).__call__(fn)
        wrapped_func._unmemorize = self._unmemorize
        return wrapped_func

    def _unmemorize(self, *args, **kwargs):
        # Rebuild the key from the call arguments and drop the cached entry.
        key = self.key(self.func, args, kwargs)
        self.mc.delete(key)
class TestClass(object):
    def __init__(self, x):
        self.__x__ = x

    @my_memorise(ttl=10)
    def funct1(self):
        print 'called'
        return self.__x__

    @property
    @my_memorise(ttl=5)
    def x(self):
        print 'called'
        return self.__x__

    @x.setter
    def x(self, value):
        # Invalidate the getter's cached value before storing the new one.
        tmp = TestClass.x.fget
        tmp._unmemorize(self)
        self.__x__ = value

a = TestClass(10)
print a.x
print 'set x to 20'
a.x = 20
print 'get (@property)'
print a.x
print a.__x__
However, the proposed clear feature is problematic. Since part of the key is generated from the arguments, finding the right cache entry to clear means remembering those arguments as well...
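One way around having to remember the arguments (again only a sketch, not part of memorised) is for the decorator to record every key it writes in a small per-function index stored alongside the cached values; a group clear then just walks that index. As written this is racy between workers, so a real version would need check-and-set or the generation approach mentioned earlier.

import memcache

mc = memcache.Client(['127.0.0.1:11211'])

def index_key(fn):
    # Per-function index of the keys the decorator has written.
    return '%s.%s.keys' % (fn.__module__, fn.__name__)

def remember_key(fn, key):
    # The decorator would call this each time it stores a value.
    keys = mc.get(index_key(fn)) or set()
    keys.add(key)
    mc.set(index_key(fn), keys)

def clear_recorded(fn):
    # Delete every entry the index knows about, then the index itself.
    for key in (mc.get(index_key(fn)) or set()):
        mc.delete(key)
    mc.delete(index_key(fn))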