Allow Ellipsis as annotation for inferred type
PEP 484 says that if an annotation is missing, the type is assumed to be Any:
def f(x) -> None:
    reveal_type(x)  # Revealed type is 'Any'

def g(x: int):
    pass

reveal_type(g(1))  # Revealed type is 'Any'
However, it is not clear how to tell a type checker that it should infer a missing type rather than assume it is Any. It was proposed by @ncoghlan to use Ellipsis for this purpose:
def f(x: ...) -> None:
    ...

def g(x: int) -> ...:
    ...
I am opening this issue so that this idea will not be forgotten.
This kind of inference could be very costly: you may have to trace arbitrarily many calls deep.
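A small sketch of why (illustration only, using the hypothetical x: ... syntax from this issue): to give x a type, the checker has to chase every call chain back to a concrete argument, and those chains can be arbitrarily long.

def f(x: ...) -> None:   # hypothetical syntax: "please infer the type of x"
    reveal_type(x)       # answering this requires finding every caller of f

def g(value):            # g's parameter is also unannotated...
    f(value)             # ...so the search continues to every caller of g

def main() -> None:
    g(3)                 # only here does a concrete type (int) finally appear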
The checker could still give up and use Any if inference proved too difficult, though. From the PEP 526 discussions, the main places where this seemed like it could be useful were: allowing type inference on instance variables while still declaring them in the class body, allowing a function's return type to be implied by the return statements it contains, and allowing an implied Union for types initialised conditionally:
x: ...
if initial_values is not None:
    x = list(initial_values)
else:
    x = None
# x would be considered Optional[List[Any]] here, or potentially a more specific
# type if the inference engine has a more precise type for initial_values than
# Iterable[Any]
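For comparison, a minimal sketch of how this has to be written today, with the union spelled out by hand (the function and parameter names here are made up for illustration):

from typing import Iterable, List, Optional, reveal_type

def build(initial_values: Optional[Iterable[int]] = None) -> None:
    x: Optional[List[int]]      # the explicit annotation is required today
    if initial_values is not None:
        x = list(initial_values)
    else:
        x = None
    reveal_type(x)              # Optional[List[int]]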
I don't have any insight into how practical that kind of thing would be to implement, though.
def f(x: ...) -> None:
Assumes that you can infer an accurate call graph, which is highly unlikely.
In general, attempting to infer types globally, which is required to infer parameter types, is highly inaccurate.
Maybe we could still allow the second form (with Nick's addition about inference cost)?
def g(x: int) -> ...:
    return x

reveal_type(g(1))  # Revealed type is 'int'
I think this is something where we should first come up with an experimental implementation (maybe a flag for mypy) before we decide to modify PEP 484. (And what should get_type_hints() do?)
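A small runtime sketch of that last question: the Ellipsis object is simply stored in __annotations__ today, so get_type_hints() would need some policy for it.

from typing import get_type_hints

def g(x: int) -> ...:       # runs fine today: ... is just an ordinary object
    return x

print(g.__annotations__)    # {'x': <class 'int'>, 'return': Ellipsis}
print(get_type_hints(g))    # today the Ellipsis is passed through unchanged;
                            # under the proposal it has no obvious runtime meaning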
@gvanrossum I agree there is no hurry for this. I just wanted to open an issue so that this idea will not be forgotten.
BTW, in PyCharm we just infer types where annotations are missing (thus not formally conforming to PEP 484).
Or use pytype to generate the type information. ;)
Related example:
import dataclasses
import typing

class Class:
    def __init__(self, x):
        self.x = x

@dataclasses.dataclass
class Dataclass:
    x: ...  # none of these work: object, ..., typing.Any, None, str | int

typing.reveal_type(Class("test").x)      # str
typing.reveal_type(Dataclass("test").x)  # ...
typing.reveal_type(Class(3).x)           # int
typing.reveal_type(Dataclass(3).x)       # ...
Is there anything that can be substituted for ... in the type annotation for Dataclass.x that is equivalent to no annotation, and lets the type system infer a specific type for each individual case, like it can with Class.x?
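(A concrete sketch of the "none of these work" note above, using typing.Any: it satisfies the dataclass machinery, but it erases exactly the per-call inference that Class.x gets.)

import dataclasses
import typing

@dataclasses.dataclass
class AnyDataclass:
    x: typing.Any                            # accepted as a field, but...

typing.reveal_type(AnyDataclass("test").x)   # Any, not str
typing.reveal_type(AnyDataclass(3).x)        # Any, not int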
There seems to be a tension between the type-hints-never-required philosophy and the way dataclasses work:
PEP 526 – Syntax for Variable Annotations
It should also be emphasized that Python will remain a dynamically typed language, and the authors have no desire to ever make type hints mandatory, even by convention.
From the dataclasses documentation:
The member variables to use in these generated methods are defined using PEP 526 type annotations.
The @dataclass decorator examines the class to find fields. A field is defined as a class variable that has a type annotation.
Another place where @dataclass inspects a type annotation is to determine if a field is an init-only variable.
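A quick illustration of that rule (only annotated class variables become fields, so for a dataclass the annotation is effectively mandatory):

import dataclasses

@dataclasses.dataclass
class Point:
    x: int     # annotated -> becomes a field and an __init__ parameter
    y = 0      # unannotated -> plain class attribute, ignored by @dataclass

print([f.name for f in dataclasses.fields(Point)])  # ['x']
print(Point(3))                                      # Point(x=3)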
@elenakrittik
Thanks. That approach works for this example, but has a few drawbacks:
- It still seems to subvert the philosophy that type annotations should never be necessary.
- It is somewhat awkward to have to create a distinct generic type variable for each field in the dataclass and then annotate the corresponding field with it, especially when there are many fields.
- It still doesn't quite work like the absence of an annotation:
from typing import reveal_type
from dataclasses import dataclass

class Class:
    def __init__(self, x):
        self.x = x

    def f(self):
        return self.x + 3

@dataclass
class Dataclass[T]:
    x: T

    def f(self):
        return self.x + 3
# Operator "+" not supported for types "T@Dataclass" and "Literal[3]"
from typing import reveal_type
from dataclasses import dataclass

class Class:
    def __init__(self, x, y):
        self.x = x
        self.y = y

    def f(self):
        return self.x * self.y

@dataclass
class Dataclass[T, U]:
    x: T
    y: U

    def f(self):
        return self.x * self.y
# Operator "*" not supported for types "T@Dataclass" and "U@Dataclass"
Fields might interact with each other and expressions inside the class in a way that gives rise to nontrivial type constraints that cannot be expressed via a simple generic.
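A minimal sketch of what trying to express even the two-field case looks like (SupportsMul is a made-up protocol, and the bare bound deliberately drops the result type, which is roughly where simple generics run out of road):

from dataclasses import dataclass
from typing import Protocol

class SupportsMul[U_contra, R_co](Protocol):
    def __mul__(self, other: U_contra, /) -> R_co: ...

@dataclass
class MulDataclass[T: SupportsMul, U]:   # the bound makes "*" usable...
    x: T
    y: U

    def f(self):
        # ...but the result type is no longer tied to T and U, so the checker
        # can only treat it as Any-ish rather than inferring it per instance.
        return self.x * self.y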
What I'd really like to do is to say nothing about the type of the field, and to let type inference/checking "pass through" it as much as possible, as if the annotation didn't exist, so to speak.