
A powerful caching library for Python, with TTL support and multiple algorithm options.

Results: 14 python-memoization issues

Everything works fine when I use the cache wrapper as in the examples. But I ran into a big problem when I wanted to cache an inner function while designing a lazy...
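One likely cause of inner-function trouble, sketched here with `functools.lru_cache` standing in for `@cached` (an assumption; the issue excerpt doesn't show the actual code): decorating an inner function creates a brand-new cache on every call of the outer function, so results are reused only through the returned function object.

```python
import functools

def expensive_lookup(key):
    # stand-in for a slow computation
    return len(key) * 2

def make_lazy_loader():
    # Pitfall: a NEW cache is created on every make_lazy_loader() call,
    # so nothing is shared across loaders; only the returned function
    # object carries its cache.
    @functools.lru_cache(maxsize=None)
    def load(key):
        return expensive_lookup(key)
    return load

loader = make_lazy_loader()
loader("hello")     # computed
loader("hello")     # served from this loader's cache
```

Keeping a single `loader` around preserves the cache; calling `make_lazy_loader()` again starts from an empty one.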

Hi folks, I really love your module and we even use it at work quite a lot :D I wanted to contribute and add support for async methods but got...

Are pandas DataFrames supported as function arguments in a @cached-decorated function? I tried to simplify this example with a smaller dataframe but @cached does seem to behave as one...

bug
help wanted

Having the cache info is very useful, but I'm missing access to the cache itself so I could serialize it and reuse it later. Is there any way to...

enhancement
high risk
priority-1-normal
size-large
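There is no documented export API in the excerpt, but the idea can be sketched with a hand-rolled decorator that exposes its backing dict for pickling (`cached_with_export`, `dump`, and `load` are hypothetical names for illustration, not the library's API):

```python
import functools
import pickle

def cached_with_export(func):
    # hypothetical decorator: exposes its backing dict for serialization
    cache = {}
    @functools.wraps(func)
    def wrapper(*args):
        if args not in cache:
            cache[args] = func(*args)
        return cache[args]
    wrapper.cache = cache                                   # raw mapping
    wrapper.dump = lambda: pickle.dumps(cache)              # serialize
    wrapper.load = lambda data: cache.update(pickle.loads(data))
    return wrapper

@cached_with_export
def square(x):
    return x * x

square(4)
blob = square.dump()     # serialize the current entries
square.cache.clear()
square.load(blob)        # restore them later without recomputing
```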

Example source:

```python
from memoization import cached
import inspect

class A:
    @cached(ttl=1)
    def b(self, name: str) -> int:
        return len(name) * 2

    def a(self) -> bool:
        return self.b('hello') == 10
```
...

bug
priority-0-high

In the numpy universe, a stochastic function's seed can either be fixed by setting it to an int, or deterministic behavior can be switched off by setting the seed to None. My...

## Overview
When determining how large the cache is using `max_size`, it may be useful to treat some items as larger than others to provide a better proxy for their...
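A size-aware eviction policy along these lines can be sketched as an LRU whose capacity is a total weight rather than an entry count (`WeightedLRU` and its `weigh` parameter are illustrative assumptions, not part of the library):

```python
from collections import OrderedDict

class WeightedLRU:
    """LRU cache whose capacity is a total weight, not an entry count."""
    def __init__(self, max_weight, weigh=len):
        self.max_weight = max_weight
        self.weigh = weigh            # per-item size function (assumption)
        self.data = OrderedDict()
        self.weight = 0

    def put(self, key, value):
        if key in self.data:
            self.weight -= self.weigh(self.data.pop(key))
        self.data[key] = value
        self.weight += self.weigh(value)
        # evict least recently used entries until under the weight budget
        while self.weight > self.max_weight:
            _, evicted = self.data.popitem(last=False)
            self.weight -= self.weigh(evicted)

    def get(self, key):
        value = self.data[key]
        self.data.move_to_end(key)    # mark as recently used
        return value
```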

I was expecting cache entries to be removed after the TTL expires. This is useful when we want to know how many entries are actually cached currently.

priority-2-low
size-medium
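The behavior described (expired entries still being counted) is typical of lazy expiry. A sketch of a cache that purges expired entries whenever it is counted (`TTLCache` is a hypothetical illustration, not the library's implementation):

```python
import time

class TTLCache:
    """Cache that drops expired entries when read or counted."""
    def __init__(self, ttl):
        self.ttl = ttl
        self.data = {}            # key -> (value, expiry timestamp)

    def put(self, key, value):
        self.data[key] = (value, time.monotonic() + self.ttl)

    def get(self, key, default=None):
        item = self.data.get(key)
        if item is None or item[1] < time.monotonic():
            self.data.pop(key, None)   # lazy eviction on access
            return default
        return item[0]

    def __len__(self):
        # purge expired entries so the count reflects live entries only
        now = time.monotonic()
        self.data = {k: v for k, v in self.data.items() if v[1] >= now}
        return len(self.data)
```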

I need to get/set values from an LFU cache directly, rather than through a function decorator. The need is as such:

```python
def slow_function(*args, **kwargs):
    cache = choose_cache_out_of_many(*args)
    found...
```
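A minimal LFU cache usable directly via `get`/`put` can be sketched as follows (an illustration, not the library's internal structure; the `choose_cache_out_of_many` dispatcher in the issue is the author's own hypothetical):

```python
from collections import defaultdict

class LFUCache:
    """Minimal LFU cache with direct get/put access."""
    def __init__(self, max_size):
        self.max_size = max_size
        self.values = {}
        self.freq = defaultdict(int)   # access counts per key

    def get(self, key, default=None):
        if key in self.values:
            self.freq[key] += 1
            return self.values[key]
        return default

    def put(self, key, value):
        if key not in self.values and len(self.values) >= self.max_size:
            # evict the least frequently used entry
            victim = min(self.values, key=self.freq.__getitem__)
            del self.values[victim]
            del self.freq[victim]
        self.values[key] = value
        self.freq[key] += 1
```

Several such instances could then back a dispatcher like the one sketched in the issue.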

Is there some way to switch the cache on/off programmatically? I have a scenario with the following logic: train() - cycle batch predict() - cycle predict()...

enhancement
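One common workaround, assuming the decorator applies `functools.wraps`/`update_wrapper` (as `functools.lru_cache`, used here as a stand-in, does): call the undecorated function via `__wrapped__` to bypass the cache, and drop stale entries between cycles with `cache_clear()`:

```python
import functools

@functools.lru_cache(maxsize=None)
def predict(x):
    return x * 2

# cached path (e.g. during a batch-predict cycle)
predict(3)

# bypass the cache entirely (e.g. during a train cycle)
predict.__wrapped__(3)

# drop stale entries between train() and predict() cycles
predict.cache_clear()
```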