@functools.cache destroys the function signature
A function decorated with @functools.cache loses its signature:
```python
import functools

@functools.cache
def my_func(foo: int | list) -> str:
    return str(foo)

my_func('completely', 'incorrect', 'arguments')
# Success: no issues found in 1 source file
```
There is an error if you try to pass an argument that isn't hashable:
```python
my_func([])
# error: Argument 1 to "__call__" of "_lru_cache_wrapper" has
# incompatible type "list[Never]"; expected "Hashable"  [arg-type]
```
But this completely ruins IntelliSense.
I understand that the type system isn't powerful enough to express the correct semantics, which would be this:
```python
import functools

@functools.cache
def my_func(foo: int | list) -> str:
    return str(foo)

reveal_type(my_func)  # Callable[[int], str]
```
That said, I would much rather have functioning IntelliSense than a warning about unhashable inputs.
This should now be expressible using ParamSpec.
See also:
- https://github.com/python/typeshed/pull/7771
- #9768
- #6347
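For concreteness, the kind of ParamSpec-based annotation I have in mind would look something like this sketch (not the actual typeshed stub; it preserves the signature, but it drops the extra cache methods, which turns out to matter below):

```python
from collections.abc import Callable
from typing import ParamSpec, TypeVar

P = ParamSpec("P")
R = TypeVar("R")

# Hypothetical stub: preserves the decorated function's signature,
# but pretends the wrapper is a plain callable with no extra methods.
def cache(user_function: Callable[P, R]) -> Callable[P, R]: ...
```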
I think the conclusion we reached in previous attempts was that ParamSpec is insufficient to solve this, due to its complex interaction with classmethod.
I think the best bet for getting something like this working is a potential future Intersection construct, which would extend the Callable that's passed in with the additional attributes. Trying to implement a descriptor that works transparently for regular functions, methods, and classmethods does not appear to be possible.
I'm confused. Is there something about functools.cache that makes it particularly difficult to type, or does every function decorator have this problem?
I noticed functools.wraps also isn't annotated correctly:
```python
import functools

class Demo1:
    def foo(self) -> None: ...

    @functools.wraps(foo)
    def bar(self): ...

Demo1().bar()  # Argument missing for parameter "self"
```
But contextlib.contextmanager is correct:
```python
import contextlib
from collections.abc import Iterator

class Demo2:
    @contextlib.contextmanager
    def foo(self) -> Iterator[int]: ...

with Demo2().foo() as output:
    reveal_type(output)  # int
```
What gives? Why are some decorators correct and some incorrect/impossible?
Also, allow me to repeat/rephrase what I said earlier:
If a perfect solution isn't possible, I'd strongly prefer a solution that gives me useful IntelliSense.
contextlib.contextmanager takes a callable and returns another callable that is in no way distinguished from most other callables. This makes it pretty easy to type:
https://github.com/python/typeshed/blob/c51de8ee6859b3b1b411d48e56cd537716893e81/stdlib/contextlib.pyi#L76
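Simplified, the linked stub is roughly this (using the public AbstractContextManager in place of the private _GeneratorContextManager class):

```python
from collections.abc import Callable, Iterator
from contextlib import AbstractContextManager
from typing import ParamSpec, TypeVar

P = ParamSpec("P")
R = TypeVar("R")

# The decorated function keeps its parameter list; only the return
# type changes from Iterator[R] to a context manager over R.
def contextmanager(
    func: Callable[P, Iterator[R]]
) -> Callable[P, AbstractContextManager[R]]: ...
```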
functools.lru_cache and functools.cache, on the other hand, take a callable and return a custom callable object (an instance of functools._lru_cache_wrapper) that also has several special methods present on it (cache_info(), cache_clear() and cache_parameters()):
https://github.com/python/typeshed/blob/c51de8ee6859b3b1b411d48e56cd537716893e81/stdlib/functools.pyi#L201
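For reference, those extra methods are real, commonly used runtime API; a quick demonstration (output as on CPython, assuming the usual functools behavior):

```python
import functools

@functools.cache
def square(x: int) -> int:
    return x * x

square(3)
square(3)
print(square.cache_info())        # CacheInfo(hits=1, misses=1, maxsize=None, currsize=1)
print(square.cache_parameters())  # {'maxsize': None, 'typed': False}
square.cache_clear()
```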
Unfortunately, this does indeed make it quite a bit harder to type these particular decorators than contextmanager. Ideally we'd just have _lru_cache_wrapper be generic over a ParamSpec as well as the return type, but this isn't as easy as it sounds. There are ambiguities in the spec regarding how ParamSpecs should be captured with decorated methods that mean that this results in false positives on user code. See https://github.com/python/typeshed/pull/6356.
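The "ideal" stub alluded to here would take roughly this shape (a sketch, not what typeshed currently ships; see the linked PR for the actual attempt and the false positives it produced on decorated methods):

```python
from collections.abc import Callable
from typing import Generic, NamedTuple, ParamSpec, TypeVar

P = ParamSpec("P")
R = TypeVar("R")

class CacheInfo(NamedTuple):  # stand-in for the private functools._CacheInfo
    hits: int
    misses: int
    maxsize: int | None
    currsize: int

# Generic over the ParamSpec as well as the return type, so the
# original signature survives and the extra methods stay visible.
class _lru_cache_wrapper(Generic[P, R]):
    __wrapped__: Callable[P, R]
    def __call__(self, *args: P.args, **kwargs: P.kwargs) -> R: ...
    def cache_info(self) -> CacheInfo: ...
    def cache_clear(self) -> None: ...
    def cache_parameters(self) -> dict[str, object]: ...
```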
> If a perfect solution isn't possible, I'd strongly prefer a solution that gives me useful IntelliSense.
I agree the current situation is deeply unsatisfactory (I think we all do); it's just a question of whether we can find a solution that doesn't cause an unacceptable number of false positives when type checkers are run on user code.
FWIW I think https://github.com/python/typeshed/pull/7771 came close, and I'd be interested in looking at that approach again (though I don't have time right now).
Wait, contextmanager returns a Callable? But it can still be used on methods? So Callable doesn't mean "it's callable", it actually means "it's callable and is a descriptor that returns a bound method"? That's... odd.
@Aran-Fey Callable is special-cased inside a class, so foo: Callable[[int], None] behaves (almost) the same as if you had written def foo(self, x: int, /) -> None: ...
The "almost" concerns calling foo on the class instead of on an instance, which is where the two differ: the former will expect you to pass in an instance of the class, while the latter will not.
But things are yet again different when you use a decorator and return a Callable with the same signature from it; in that case it will actually be equivalent to the def case.
All of this special casing is also part of what makes it so difficult to write a descriptor that transparently works the same way.
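To make that concrete, here is a small type-checking demo of the two declarations (behavior as described above; the exact diagnostics vary between type checkers, and foo would need to be assigned before use at runtime):

```python
from collections.abc import Callable

class Demo:
    # These two declarations are treated (almost) the same:
    foo: Callable[[int], None]
    def bar(self, x: int, /) -> None: ...

d = Demo()
d.foo(1)  # accepted: foo behaves like a method here, with self pre-bound
d.bar(1)  # accepted

# The difference only appears when you go through the class itself,
# i.e. Demo.foo versus Demo.bar, as described above.
```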
I see, thanks! But boy, what a mess.
You can fix this by running:
```python
import functools

def my_function() -> None:
    pass

my_function = functools.wraps(my_function)(functools.cache(my_function))
```
Instead of:
```python
import functools

@functools.cache
def my_function() -> None:
    pass
```
While this fixes the incorrect signature, you lose the extra attributes that functools.cache introduces this way, so it's at best trading one problem for another. But it can be a reasonable workaround in code that does not care about the presence of cache_clear/cache_info/cache_parameters.
> While this fixes the incorrect signature, you lose the extra attributes that functools.cache introduces this way, so it's at best trading one problem for another.
Those all still work for me...
I assumed this would fall under "no type reassignments", but it looks like there's special casing to make this equivalent to decorating the function. It also looks like your fix doesn't actually preserve the function signature, at least not in mypy or pyright:
https://mypy-play.net/?mypy=latest&python=3.12
https://pyright-play.net/?pythonVersion=3.12&code=JYWwDg9gTgLgBAMwK4DsDGMIQDYGcCwAUEQCYCmCcIAngPrLozAQoAUAlHALQB8cAcizIAuInHFwwAQ1wFihGvVQZmKOAF5EyzDlwA6AO5QpYXK0UMVLdq0s68etFLQALMubp3V7dkShkANzIpbFoYajB3C21vP0Dg0PDIjyVGVT1aWidsUNp2IA
In fact, there's no way to have both the extra methods and the correct signature at the same time with the current type stubs, since _lru_cache_wrapper hard-codes them to *args: Hashable, **kwargs: Hashable. I postulate there's no way to frame this as a descriptor using ParamSpec either, due to the special-casing for classmethod. So Intersection remains our only hope for solving this, unless someone can demonstrate a solution that works in every possible case.
In pyright/pylance you actually have the worst of both worlds with your fix, because you erase both the function signature and _lru_cache_wrapper. So you don't get the correct signature and you also can't access the extra methods.
Perhaps the reason you were fooled into thinking this was a fix is that Jedi, or any LSP built on top of it, will show the correct function signature this way. But type checkers still infer the wrong signature.
Also, just so we're clear: when I say "there's no way to make this work", I'm talking purely from a type-checking perspective. At runtime this will work, but runtime isn't the issue here. You can also work around this with your own custom descriptor that doesn't need to handle functions, methods, and classmethods all at the same time; if you split these into separate descriptors, you can make it work right now (see the sketch below), but we don't have this luxury with the standard library.
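A minimal sketch of that last point: a hand-rolled caching descriptor that only has to support plain instance methods (the names here are illustrative; class-level access and classmethods are deliberately out of scope, which is exactly the luxury the standard library doesn't have):

```python
import functools
from collections.abc import Callable
from typing import Concatenate, Generic, ParamSpec, TypeVar

P = ParamSpec("P")
R = TypeVar("R")
T = TypeVar("T")

class cached_method(Generic[T, P, R]):
    """Caching decorator for plain instance methods only."""

    def __init__(self, func: Callable[Concatenate[T, P], R]) -> None:
        self._cached = functools.cache(func)

    def __get__(self, obj: T, objtype: type | None = None) -> Callable[P, R]:
        # Bind the instance up front, so callers see the method's own
        # parameter list instead of (*args: Hashable, **kwargs: Hashable).
        return functools.partial(self._cached, obj)

class Example:
    @cached_method
    def compute(self, x: int) -> str:
        return str(x)

reveal_type(Example().compute)  # Callable[[int], str]
```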
I strongly agree with @Aran-Fey that until this is solved, it would be much better to preserve the function signature at the cost of not checking that parameters are hashable than what is currently there.
If you decorate a function that has non-hashable parameters, you'll get a runtime error the first time you call it. With the current state, you can pass wrongly-typed parameters to your function, possibly resulting in much subtler, harder-to-notice bugs.
> I strongly agree with @Aran-Fey that until this is solved, it would be much better to preserve the function signature at the cost of not checking that parameters are hashable than what is currently there.
Unfortunately, that is not the only cost. The additional cost, and the main reason this is annotated the way it currently is, is that we would see false positives anywhere people call cache_clear/cache_info/cache_parameters. This would generate a lot of noise and confusion. Typeshed generally prefers false negatives over false positives.
It would be easy to preserve the function signature if we didn't need to make sure those methods are available.
> In pyright/pylance you actually have the worst of both worlds with your fix, because you erase both the function signature and _lru_cache_wrapper. So you don't get the correct signature and you also can't access the extra methods.
> Perhaps the reason you were fooled into thinking this was a fix is that Jedi, or any LSP built on top of it, will show the correct function signature this way. But type checkers still infer the wrong signature.
@Daverball My issue was that functools.cache disabled VS Code's type and docstring hints, which my fix re-enables. I use Pylance, not Jedi, and have not experienced any issues. However, it seems that this would not suffice for your use cases, and regardless I would like to see this fixed without the need for workarounds.