[feature] Get the type from a variable
Feature
A similar feature exists in TypeScript:
const foo: number = 1
type Foo = typeof foo // type Foo = number
function bar(x: string): void {
}
type Bar = typeof bar // type Bar = (x: string) => void
Pitch
This is how it could work in a future version of Python.
Possible implementation: implement __class_getitem__ for type:
foo: int = 1
Foo = type[foo] # equivalent to Foo = int
def bar(x: str) -> None:
    ...
Bar = type[bar] # equivalent to Bar = Callable[[str], None]
I don't see the use case for your first example, but I do see some use for the second -- it might give us a nice way to spell complex function types (e.g. using defaults or keyword args).
@gvanrossum
Yes, it's a nice way to distinguish types from variables. Let's say we have a utility generic type ReturnType, which receives a type parameter rather than a variable and returns a type:
def fn() -> int:
    pass
assert ReturnType[type[fn]] == int # rather than ReturnType[fn]
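(The runtime half of this already exists, by the way; typing.get_type_hints() can read the return annotation today, so the missing piece is only a spelling that static checkers understand:)
from typing import get_type_hints

def fn() -> int:
    pass

assert get_type_hints(fn)["return"] is int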
I like the idea of inferring the (structural) type of a given object a lot, especially for functions!
Obviously, one would have to make sure to return a Protocol instead of a mere Callable in situations where the function carries keyword arguments, default values and the like. I think this would also address @gvanrossum's suggestion in https://github.com/python/typing/issues/264#issuecomment-497453851 :
@type
def MyFunctionType(fruits: dict, *args, **kwargs) -> Any: ...
As for the syntax, I'm not sure I like type[] a lot – not just because it can't be used as a decorator and type is a built-in with a meaning not related to type hints, but also because [] is already being used in the context of generics. Here, however, we're switching from an object to its corresponding type object so to speak, so a function would be more fitting in my eyes.
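(To illustrate the clash: in current annotations, type[C] already means "the class C itself or a subclass of it", i.e. it is parameterized by a type, not by a variable:)
def make_default(cls: type[int]) -> int:
    # cls must be the class int or a subclass of it, not an int value
    return cls()

make_default(int)   # OK
make_default(bool)  # OK, bool subclasses int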
I'll be calling this function type_of() in the following, though I have to admit type_of would look rather confusing as a decorator if I choose to append a "…Type" suffix to my function name (because, by virtue of the decorator, that name will ultimately not refer to a function but to its type):
@type_of # A type of a function type? Uhh…?
def MyFunctionType(fruits: dict, *args, **kwargs) -> Any: ...
(Maybe a longer name like type_from_hints() would kill both birds with one stone and also make it clear that the returned type is derived (only) from the given type hints and no types are inferred. At least, this is how I understand your proposal @wlf100220 ?)
Anyway, finding a decent name is certainly the smallest challenge here. The big question for me is whether type_of(foo) (or whatever it would be called) should be looking at the variable foo (and thus __annotations__ at the module or class level, as suggested by @wlf100220 's first example) or at the object the variable is referencing. The former case doesn't make much sense to me, as the annotation of a variable at the module or class level can already be retrieved from __annotations__ (or through get_type_hints()). Meanwhile, at the function level annotations are not stored anyway and their only purpose is to support static type checkers, so one would have a hard time implementing type_of in the first place.
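(For completeness, here is the retrieval that already works at the module and class level:)
from typing import get_type_hints

retries: int = 3

class Config:
    timeout: float = 1.0

print(__annotations__)         # {'retries': <class 'int'>}
print(get_type_hints(Config))  # {'timeout': <class 'float'>}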
For this reason, I think it would make much more sense for type_of(foo) to look at the object which foo is referencing. This would be consistent with @wlf100220 's second example where we want to determine the signature of a function,
def func() -> int:
    pass
type_of(func)
(Note that the module's __annotations__ is empty and the annotations we care about are stored inside func.)
Then, however, the question is whether and, if so, how type_of() should be generalized to more complicated objects than functions. As a first generalization, let's consider callable objects:
class Foo:
    a: int = 1
    b: str = "hello, world"

    def __call__(self, param1: bool, param2):
        ...

class FooProtocol(Protocol):
    a: int
    b: str

    def __call__(self, param1: bool, param2: Any) -> Any:
        ...
my_foo = Foo()
assert FooProtocol == type_of(my_foo)
I guess it would be quite natural for this assertion to hold.
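(The closest runtime analogue available today is an isinstance() check against a @runtime_checkable protocol, which only verifies that the members exist, not that their types match:)
from typing import Protocol, runtime_checkable

@runtime_checkable
class HasAB(Protocol):
    a: int
    b: str

assert isinstance(my_foo, HasAB)  # checks presence of a and b only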
But what if we add some non-annotated variables and methods to the class Foo? Should they also appear in the protocol returned by type_of(my_foo) and come with Any annotations? What about magic methods (whether defined explicitly or built-in)? What about inherited methods? What about attributes we add to my_foo through monkey-patching?
Notably, PEP 544 says that what is important is the presence of a method in the body of the protocol class (and similarly for variables, though in contrast to methods they must come with an annotation or they will be ignored).
Maybe one could do a similar thing here and only analyze the body of my_foo.__class__ (and, possibly, of its parent classes)? In this case, the class Foo and the protocol FooProto = type_of(my_foo) would appear to be very similar, except that Foo would represent a nominal type and FooProto would be a structural one.
In a way, this seems similar to the (rejected) idea for PEP 544 to turn classes into protocols. While in the PEP it was discussed whether this should be done automatically and this would have obviously brought about all kinds of problems, a feature like type_of() would give us a way to do this manually. Moreover, it would also avoid the issue of transitivity discussed here as the protocol returned by type_of() would have nothing to do with the class it was derived from.
I'm starting to like this idea. What does everyone else think?
After a good night's sleep, I've come to the conclusion that in my last comment I ended up talking about three related but rather different ideas:
1. Dynamically determining the type of a function (not a generic callable)
…based on type hints the function object carries (and default parameter values). Again, this would also give a really nice and concise way to write down complex function types:
@function_type
def ApplesVsOrangesComparator(apples: Sequence[Apple], oranges: Sequence[Orange]) -> bool: ...
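(A rough runtime sketch, assuming the name function_type and covering only plain positional parameters; keyword arguments and defaults would need the Protocol treatment described above:)
import inspect
from typing import Any, Callable, get_type_hints

def function_type(fn):
    # Assemble Callable[[<param types>], <return type>] from the annotations.
    hints = get_type_hints(fn)
    ret = hints.pop("return", Any)
    params = [hints.get(name, Any) for name in inspect.signature(fn).parameters]
    return Callable[params, ret]
Applied as a decorator as above, ApplesVsOrangesComparator would then evaluate to Callable[[Sequence[Apple], Sequence[Orange]], bool].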
One could then also think about enforcing such a type during function definition time:
@implements_type(ApplesVsOrangesComparator)
def my_comparator(apples: List[Apple], oranges: List[Orange]) -> bool:
    # Put implementation here
    ...
This seems very useful to me and I've been in the situation where I wanted to do this a number of times now.
2. Dynamically converting a class into a protocol
…based on type hints in the class body. As much as I like this idea on theoretical grounds, I'm somewhat unsure how useful this is, given that we can always add Protocol as parent class in the class definition. Obviously, this assumes that we're in control of the class, so maybe a case could be made for situations where one wants to derive a protocol from a 3rd-party class? I'm not sure.
And then, of course, there's a third idea (which I somewhat sidestepped in my previous post) which is:
3. Dynamically determining the type of an object
…based on the object alone (as opposed to its class – i.e. after inheritance, considering built-in magic methods, monkey-patching the object etc. etc.). As I laid out in my previous post, I'm not sure at all how this would work, given that we don't have any annotations we can build upon: So where do we get type hints from? Should we infer them? If yes, how concrete vs. how abstract should the types be that we infer? Consider
my_dict = { "a": 1, "b": 2 }
T = type_of(my_dict)

some_types_my_dict_conforms_to = [
    Dict[str, int],
    Dict[str, Union[Literal[1], Literal[2]]],
    TypedDict("foo", { "a": Literal[1], "b": Literal[2] }),
    # ... (other combinations thereof)
]
print(some_types_my_dict_conforms_to.index(T))  # ??
In any case, I don't see a real benefit to this third option. Structural types are used to precisely define programming interfaces and to do static type checking. But interfaces are rarely designed dynamically from dynamically defined objects, and dynamically derived types and interfaces don't make much sense for static type checking either.
I have some use cases for this.
I have a PanedWindow class that automatically sets the background color of a tkinter.PanedWindow in a specific way. So I do this:
class PanedWindow(tkinter.PanedWindow):
    def __init__(self, *args: Any, **kwargs: Any) -> None:
        super().__init__(*args, **kwargs)
        ...
Would be nice to do this instead, and actually take advantage of all the widget option types I added to typeshed last year, rather than ruining it with **kwargs: Any:
class PanedWindow(tkinter.PanedWindow):
    __init__: TypeOf[tkinter.PanedWindow.__init__]

    def __init__(self, *args: Any, **kwargs: Any) -> None:
        super().__init__(*args, **kwargs)
        ...
As a workaround, I can also do:
class PanedWindow(tkinter.PanedWindow):
    if not TYPE_CHECKING:
        def __init__(self, *args, **kwargs):
            ...
but then I get no type checking inside the __init__ method.
Another use case is a context manager named backup_open() which is just like the built-in open(), but it makes a backup copy before opening, and restores from the backup on error. Typeshed defines open() with a pile of overloads that are copy/pasted to other functions that wrap it, such as pathlib.Path.open. I don't want to maintain that copy/pasta, and I would like to do this instead:
backup_open: TypeOf[open]

def backup_open(*args, **kwargs):
    ...
I also thought about extending ParamSpec in some way, but it wouldn't really work for open and its many overloads, because overloads make the return type depend on the arguments.
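(For the record, the ParamSpec attempt looks roughly like this on Python 3.10+; it copies the parameter list but pins a single return type, which is exactly why it can't model open()'s overloads:)
from typing import Callable, ParamSpec, TypeVar, cast

P = ParamSpec("P")
R = TypeVar("R")

def copy_signature(template: Callable[P, R]) -> Callable[[Callable[..., R]], Callable[P, R]]:
    # Runtime no-op; only the declared types matter to the checker.
    def decorator(fn: Callable[..., R]) -> Callable[P, R]:
        return cast(Callable[P, R], fn)
    return decorator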
I found a perfectly working workaround for my needs:
import contextlib
import tkinter
from typing import Any, Callable, TypeVar

_T = TypeVar("_T")

def copy_type(f: _T) -> Callable[[Any], _T]:
    return lambda x: x

class PanedWindow(tkinter.PanedWindow):
    @copy_type(tkinter.PanedWindow.__init__)
    def __init__(self, *args: Any, **kwargs: Any) -> None:
        ...

@copy_type(open)
@contextlib.contextmanager
def backup_open(file: Any, *args: Any, **kwargs: Any) -> Any:
    ...
I hope someone finds this useful or considers adding this to typing.py :)
@Akuli While I like your solution, it only helps to enforce the correct type of backup_open() on the "outside", i.e. when backup_open() is invoked. Meanwhile, there's no enforcement in the function declaration itself, meaning you can put any arguments into your definition of backup_open(), no matter whether they are compatible with the signature of open(), e.g.
@copy_type(open)
def backup_open(not_a_file: NotAFile, *args: Any, **kwargs: Any) -> NotAFileLikeObject:
    ...
Ideally, I would like to have a means to enforce both, in the spirit of the @implements_type decorator I suggested above.
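(A naive runtime sketch of @implements_type, here taking a reference function and requiring exact signature equality; proper checker support would of course want compatibility rather than equality:)
import inspect
from typing import Any, Callable, TypeVar

F = TypeVar("F", bound=Callable[..., Any])

def implements_type(reference: Callable[..., Any]) -> Callable[[F], F]:
    expected = inspect.signature(reference)
    def decorator(fn: F) -> F:
        if inspect.signature(fn) != expected:
            raise TypeError(f"{fn.__qualname__} does not match {expected}")
        return fn
    return decorator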
I see a lot of value in this as well. It would be especially nice to have some form of dependency to follow when working with dataclasses, for example.
import dataclasses

@dataclasses.dataclass
class Thing:
    id: int

# Currently (not connected to source)
def get_thing_by_id(id: int) -> Thing:
    pass

# Or, currently (static type checkers can't infer from this runtime value)
def get_thing_by_id(id: Thing.__dataclass_fields__['id'].type) -> Thing:
    pass
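(Note that the __dataclass_fields__ trick also breaks when annotations are stored as strings, e.g. under from __future__ import annotations; typing.get_type_hints() resolves them either way:)
from typing import get_type_hints

assert get_type_hints(Thing)["id"] is int  # works even if the stored .type is the string "int"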
It would be nice to have something like this to access property types:
def get_thing_by_id(id: type[Thing.id]) -> Thing:
    pass
Then a code change to the definition would automatically update its usage at the same time:
@dataclasses.dataclass
class Thing:
    id: str

# get_thing_by_id would now have an inferred static type `Callable[[str], Thing]`
I know that's incredibly hard to implement given how python works right now, but that's still my dream.
Edit: I guess that's what TypeAlias or sometimes TypeVar is for, but it would be nice to have the ability to infer them in some cases.
Another use-case...
Currently I do this:
ParameterName = Literal["MONGODB_URI", "SNOWFLAKE_PASSWORD"]
PARAMETER_NAMES: tuple[ParameterName, ...] = ParameterName.__args__ # type: ignore
(the # type: ignore is there because Pylance doesn't recognise the __args__ member of the Literal type... it makes the whole thing feel a bit hacky, and I also have to add the explicit annotation)
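A slightly cleaner spelling of the same workaround uses typing.get_args(), which drops the # type: ignore, though it still forces me to define the Literal first:
from typing import Literal, get_args

ParameterName = Literal["MONGODB_URI", "SNOWFLAKE_PASSWORD"]
PARAMETER_NAMES: tuple[ParameterName, ...] = get_args(ParameterName)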
but I really wanted to do this, and have a way to infer the resulting Literal type where needed using something like typeof:
PARAMETER_NAMES: Final = ("MONGODB_URI", "SNOWFLAKE_PASSWORD")
e.g. with that definition I can write a function like:
def value_mapping(values: list[str]):
    return list(zip(PARAMETER_NAMES, values, strict=True))
...and Pylance is able to infer the return type as list[tuple[Literal['MONGODB_URI', 'SNOWFLAKE_PASSWORD'], str]]
which is great!
but if I wanted to annotate the return type explicitly I'd have to write out the members of the Literal again myself
in TypeScript you can do:
const animals = ['cat', 'dog', 'mouse'] as const
type Animal = typeof animals[number]
// type Animal = 'cat' | 'dog' | 'mouse'
the animals[number] part is a bit weird but the end result is exactly what I wanted above