Peter Bengtsson

Results 314 comments of Peter Bengtsson

Bleh! Given how little we use an HTML parser, it might *never* make sense to make it faster. Apart from having to rewrite the code we might go from...

I'm tempted to just close this. What alerted me were topics (i.e. performance, memory leaks, forgiveness) that might not be relevant or helpful to us. We can just take a...

How do you think that could be solved? If you run something like `uwsgi` you could use a global variable which I think is best solved with https://docs.python.org/3/library/functools.html#functools.lru_cache
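For the single-process case, here's a minimal sketch of that `functools.lru_cache` idea (the function name and `maxsize` are illustrative, not from the original discussion):

```python
import functools

@functools.lru_cache(maxsize=128)
def expensive_lookup(key):
    # Computed once per distinct `key` within this process;
    # repeated calls return the cached value without re-running the body.
    return key.upper()

expensive_lookup("foo")  # computes and stores
expensive_lookup("foo")  # served from the in-process cache
```

The catch, and why this doesn't fully solve it under `uwsgi`, is that every worker process gets its own independent cache.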

The whole point of `django-cache-memoize` is to use a central memoization. It already does that. What I think you're talking about is potentially locking the function for other callers whilst...

It's not easy, especially when you're distributed across multiple Python processes; the only way it can be solved is with a central cache. One possible solution is to be aware...

That's interesting actually. I always/only use [Redis for my Django caching](https://niwinz.github.io/django-redis/latest/) so we could make it configurable. I.e. if you know you have Redis and you worry about executing a...

Turns out, I needed this. And since I use Redis for my Django cache backend, I could use `cache.lock()`. See this blog post: https://www.peterbe.com/plog/django-lock-decorator-with-django-redis Basically, we could add an option...
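The decorator in that blog post acquires a lock via django-redis's `cache.lock()`, which works across processes. As a rough per-process sketch of the same shape (using `threading.Lock` as a stand-in for the Redis lock; all names here are illustrative):

```python
import functools
import hashlib
import threading

# Stand-in for the distributed lock store; the blog post's version
# keys a Redis lock instead, so it works across processes and hosts.
_locks = {}
_locks_guard = threading.Lock()

def lock_decorator(key_maker=repr):
    """Serialize concurrent calls that share the same argument key."""
    def decorator(func):
        @functools.wraps(func)
        def inner(*args, **kwargs):
            key = hashlib.md5(
                f"{func.__name__}{key_maker((args, kwargs))}".encode()
            ).hexdigest()
            with _locks_guard:
                lock = _locks.setdefault(key, threading.Lock())
            with lock:
                return func(*args, **kwargs)
        return inner
    return decorator

@lock_decorator()
def expensive_function(start, end):
    # Only one caller with these exact arguments runs at a time.
    return start + end
```

With the Redis-backed version, two web heads calling `expensive_function(1, 2)` at the same moment would contend on the same central lock rather than on a per-process one.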

Oh. @lamby you were the one who RT'ed this on Twitter. Hi!

Good point. A "solution" is to use both `django-cache-memoize` as always and use something like that `@lock_decorator` from the blog post. Right? ```python @lock_decorator() @cache_memoize(100) def expensive_function(start, end): return random.randint(start,...

I don't know if I understand. Perhaps we're talking about different things. The example just above works very well in theory. Two highly concurrent requests (even on different web heads)...