cached-property
                        Make `cached_property` derive from `property`
Hi @pydanny! Since I'm using this library on a daily basis, I figured it was time to contribute at last :wink:
Class hierarchy (#26, #27)
`cached_property` now derives from `property`. `threaded_cached_property` and `cached_property_with_ttl` are `cached_property` subclasses. `threaded_cached_property_with_ttl` is a subclass of both `cached_property_with_ttl` and `threaded_cached_property`.
This is not as drastic as #30 suggests but still helps avoid code duplication.
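For illustration, a minimal sketch of the resulting hierarchy (class bodies elided; only the inheritance relationships come from the description above):

```python
# Sketch of the class relationships described in this PR (bodies elided).
class cached_property(property):
    ...

class threaded_cached_property(cached_property):
    ...

class cached_property_with_ttl(cached_property):
    ...

# Inherits from both, so the threading and TTL behaviour are combined.
class threaded_cached_property_with_ttl(cached_property_with_ttl, threaded_cached_property):
    ...
```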
Support __slots__ and stop using obj.__dict__ (#69, #123)
`cached_property` no longer uses the object's `__dict__` to store the cached value; instead it uses a `WeakKeyDictionary` with the object as the key. This allows supporting slotted classes as long as we can create a weakref to the object, i.e. `"__weakref__"` is in `__slots__`. Not using weakrefs would cause objects never to be garbage collected.
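A rough sketch of this caching scheme, heavily simplified (it is not the library's actual code: the real descriptor also derives from `property` and handles sentinels, TTLs and locking):

```python
import weakref

class cached_property:
    """Simplified: cache values in a WeakKeyDictionary keyed by the instance."""

    def __init__(self, func):
        self.func = func
        self.cache = weakref.WeakKeyDictionary()

    def __get__(self, obj, cls=None):
        if obj is None:
            return self
        try:
            return self.cache[obj]
        except KeyError:
            value = self.cache[obj] = self.func(obj)
            return value

class Circle:
    # "__weakref__" must be listed so instances can be weak-referenced.
    __slots__ = ("radius", "__weakref__")

    def __init__(self, radius):
        self.radius = radius

    @cached_property
    def area(self):
        return 3.141592653589793 * self.radius ** 2

c = Circle(2.0)
assert c.area == c.area  # computed once, then served from the weak cache
```

Because the cache only holds weak references to the instances, dropping the last strong reference to an instance still lets it be garbage collected, and its cache entry disappears with it.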
Better function wrapping (#72)
In Python 3, functools.update_wrapper is now used to extract attributes from the wrapped function. This means the cached_property will also expose the __qualname__ and __annotations__ of the wrapped function.
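As a quick illustration (a sketch, not the library's actual implementation), `functools.update_wrapper` copies `__module__`, `__name__`, `__qualname__`, `__annotations__` and `__doc__` from the wrapped function onto the descriptor:

```python
import functools

class cached_property:
    def __init__(self, func):
        self.func = func
        # Copies __module__, __name__, __qualname__, __annotations__ and __doc__
        # from the wrapped function onto this descriptor instance.
        functools.update_wrapper(self, func)

class Spam:
    @cached_property
    def eggs(self) -> int:
        """How many eggs we have."""
        return 42

descriptor = Spam.__dict__["eggs"]
print(descriptor.__qualname__)     # Spam.eggs
print(descriptor.__annotations__)  # {'return': <class 'int'>}
print(descriptor.__doc__)          # How many eggs we have.
```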
Miscellaneous (#124)
Supersedes #124, so that you don't have to resolve the merge yourself (I added @VadimPushtaev to the AUTHORS.rst since I discovered the __set_name__ protocol thanks to his PR, and I think he deserves some praise :smile: )
Rebased to latest master (Dec. 28).
@pydanny: Any reason to stall this?
Sounds like a clear improvement.
For anyone interested (@carver, @bashtage?): this has been released on PyPI (property-cached) until development resumes here.
Looks good to me. Should also fix #168
Thanks for the heads up @vbraun. I gave it a try, but no dice:
  File "/home/quatro/.local/share/virtualenvs/scenographer/lib/python3.7/site-packages/property_cached/__init__.py", line 54, in __get__
    value = self.cache.get(obj, self._sentinel)
  File "/home/quatro/.local/share/virtualenvs/scenographer/lib/python3.7/weakref.py", line 433, in get
    return self.data.get(ref(key),default)  File "/home/quatro/.local/share/virtualenvs/scenographer/lib/python3.7/site-packages/property_cached/__init__.py", line 54, in __get__
    value = self.cache.get(obj, self._sentinel)
  File "/home/quatro/.local/share/virtualenvs/scenographer/lib/python3.7/weakref.py", line 433, in get
    return self.data.get(ref(key),default)
TypeError: cannot create weak reference to 'RelationDAG' object
                                    
                                    
                                    
                                
@Qu4tro: seems like you have a slotted object, but it doesn't have __weakref__ in its slots.
I've updated https://github.com/Qu4tro/cachedproperty_immutable_classes with property_cached.
Yeah, it's derived from NamedTuple. I guess the only way to do it now, since Python 3.5.1, is to maintain an external store to save the cache to?
@Qu4tro: maybe you could open an issue on python/typing? I don't know if NamedTuple is not weakrefable because of implementation details or just because the feature is missing; this might be worth a try.
Otherwise, you could write a metaclass that does the same as NamedTuple, but also adds a __weakref__ slot. I know this is not ideal but I assume this would be better than the external cache solution.
@althonos: I've created the ticket. Thanks for the advice.
@althonos Sorry about that - seems to be something else!
Wasn't that bad. It notified me of this thread, into which I had forgotten to type some findings.
Basically, if anyone has the same issue as I do (caching NamedTuple properties), you can use the functools lru_cache decorator. The caveat: the cache is shared per method across instances, and the cache key is self, the NamedTuple itself, so you may need a bigger limit or drop the limit entirely with @lru_cache(None). It works with @classmethod as well, if anyone's wondering (the cache is still shared across instances, but the cache key is the class, so the default limit should be okay).
If anyone knows of any other issues with this approach, it would be great to hear about them. There is a sketch of the pattern below.
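A small sketch of that workaround (the Point class and norm property are made-up names for illustration):

```python
from functools import lru_cache
from typing import NamedTuple

class Point(NamedTuple):
    x: float
    y: float

    @property
    @lru_cache(maxsize=None)  # unbounded, as with @lru_cache(None)
    def norm(self) -> float:
        # The cache is shared across instances and keyed by self (the tuple).
        return (self.x ** 2 + self.y ** 2) ** 0.5

p = Point(3.0, 4.0)
assert p.norm == 5.0  # the second access is served from the shared cache
```

One thing to keep in mind: unlike the weak cache, lru_cache keeps a strong reference to every self it has seen, so instances stay alive for as long as they remain in the cache.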
By using the weak dictionary in the decorating class, compatibility with any non-hashable types is lost.
@matfax: but in exchange you gain support for slotted classes, which cached_property does not support.
@althonos I do not doubt that it would be a valuable addition. But maybe having both options in one library would be the wiser choice, instead of superseding the current one, so that dependents that rely on this library for its support of non-hashables do not become trapped on an old version. After all, there are valid use cases where a hash cannot be more than a purely random number. This package seems to be widely adopted. It would be quite an affair if existing builds stopped working after a version bump.
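To make the trade-off concrete, here is a small sketch (the Unhashable class is made up): a WeakKeyDictionary hashes its keys, so an instance of a class that defines __eq__ without __hash__ cannot be used as a key, whereas a __dict__-based cache has no such requirement.

```python
import weakref

class Unhashable:
    def __eq__(self, other):
        # Defining __eq__ without __hash__ sets __hash__ to None in Python 3,
        # so instances are not hashable.
        return isinstance(other, Unhashable)

cache = weakref.WeakKeyDictionary()
obj = Unhashable()
try:
    cache[obj] = 42  # the dictionary hashes the key, which fails here
except TypeError as exc:
    print(exc)  # unhashable type: 'Unhashable'
```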