Sigurd Spieckermann
I've submitted a PR that fixes this problem: #1739
@dberenbaum It's not necessarily the entire _dependency_ tree of a stage; it would be the _import_ tree, i.e. the subset of the dependency tree that is actually used by the script....
But I see there might be complications with using `pydeps` when, e.g., conditional dependencies are involved, such as different versions of a dependency being installed depending on the Python version in use, because...
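To make the conditional-dependency problem concrete, here is a hypothetical `requirements.txt` fragment using PEP 508 environment markers; which package version ends up installed depends on the interpreter, so a purely static import scan of the source cannot determine it:

```text
# Hypothetical requirements.txt: the resolved set depends on the environment,
# not just on the imports visible in the code.
dataclasses; python_version < "3.7"
numpy>=1.20; python_version >= "3.8"
numpy<1.20; python_version < "3.8"
```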
But some kind of import tree analysis is also necessary, because a locally imported module may not depend on any third-party package, so relying only on `requirements.txt` or similar (as...
@dberenbaum I recognize that it is non-trivial to compute a correct cache key that takes into account imports, complex dependency specifications, multiple supported Python versions etc. But e.g. when running...
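A minimal sketch of what such an import-tree-based cache key could look like, using only the stdlib `modulefinder`. This is a hypothetical illustration, not DVC's implementation, and `import_tree_cache_key` is a made-up helper name; it deliberately ignores the hard cases discussed above (conditional dependencies, multiple Python versions) and only hashes the script plus the local modules it transitively imports:

```python
# Hypothetical sketch (not DVC's implementation): derive a cache key for a
# script from the script itself plus every *local* module it transitively
# imports, ignoring site-packages and the stdlib.
import hashlib
import os
import sys
from modulefinder import ModuleFinder

def import_tree_cache_key(script_path: str) -> str:
    script_path = os.path.abspath(script_path)
    project_root = os.path.dirname(script_path)
    # Search the project directory first so local imports resolve.
    finder = ModuleFinder(path=[project_root] + sys.path)
    finder.run_script(script_path)

    # Keep only modules whose source lives under the project tree,
    # i.e. local imports rather than installed packages.
    local_files = set()
    for mod in finder.modules.values():
        f = getattr(mod, "__file__", None)
        if f and os.path.abspath(f).startswith(project_root + os.sep):
            local_files.add(os.path.abspath(f))

    digest = hashlib.sha256()
    for path in sorted(local_files):  # stable order -> stable key
        rel = os.path.relpath(path, project_root)
        with open(path, "rb") as fh:
            digest.update(rel.encode() + b"\0" + fh.read())
    return digest.hexdigest()
```

With this, editing any locally imported module changes the key, while edits outside the import tree leave it untouched.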
Yes, I believe #4363 would be a better solution than introducing a stage that computes the cache key. :+1: Ultimately, I'd still love to see the cache key computation based...
Just a few more projects I've come across that might be worth looking into for inspiration or to help in implementing a solution:

* https://github.com/asottile/reorder_python_imports
* https://github.com/asottile/classify-imports
Any reason not to merge this, even if you consider migrating away from Poetry at some point?
Thanks for your prompt and detailed reply, @dmontagu! :bow: I hadn't noticed that the validation is also affected, which is much worse than the JSON Schema generation. I'm not so...
The hack based on `.__pydantic_generic_metadata__` doesn't work on non-generic child models though:

```python
from typing import *
from pydantic import *

T1 = TypeVar('T1')

class ABCModel(BaseModel, Generic[T1]):
    f1: str

class...