Code generation workflow for the Slint Python bindings
Summary
- add the slint.codegen package with a CLI entry point to generate Python runtime and stub files from .slint
- surface property and callback metadata from the interpreter, refine the callback decorator overloads, and generate modules/stubs with accurate type hints
- document the generator, ship a counter example project, and cover the new flow with unit and integration tests while retaining the legacy import loader
Testing
- pytest api/python/slint/tests/codegen -q
Follow-ups
- update slint-ui/slint-python-template and slint-ui/material-python-template once codegen becomes the primary path
When I apply slint.codegen to ui-libraries/material, more problems occur... Let me take a look.
So, isn't it enough to distribute @material on PyPI directly using importlib.resources...?
Hi! Could you elaborate a bit what this PR does?
As part of https://github.com/slint-ui/slint/issues/4136 we've been working on generating a .py file for .slint files (along with embedded typings) in https://github.com/slint-ui/slint/commits/simon/python-stubs/ . That's work in progress, but it can generate typings for our entire test suite; the missing bits are verification of signature changes when the .slint files have changed, as well as documentation. You can see an example of what the generated code looks like here: https://github.com/slint-ui/slint/blob/simon/python-stubs/demos/printerdemo/python/printerdemo.py
thx for taking a look. this PR is trying to accomplish:
- It adds a Python-side code generator (slint.codegen) that walks one or more inputs (files or directories), preserves their package layout, and emits both the runnable module (.py) and a matching stub (.pyi). The generator can either drop the output next to the source or into an explicit target tree, so the generated code can be checked into a wheel or vendored alongside application code.
- The runtime module we generate keeps loading the compiled component at execution time (so we still benefit from the flexibility the current loader gives us), but places the Python API surface in a normal module so editors can offer completions, go-to-definition, etc. without relying on import hooks.
the idea of a checked load asserting that the generated artifact matches the .slint input... LGTM, but I'm less keen on having to ship the JSON blob in every generated module. We're experimenting with keying tool.uv.cache-key via uv/pyproject metadata so that we can invalidate when the source changes; if that approach works out I'd be keen to wire it in here.
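A lighter-weight alternative to embedding the full JSON blob would be to bake only a content hash of the .slint source into the generated module and compare it at load time. A minimal sketch under that assumption (this is not what the PR currently does, and the names here are hypothetical):

```python
import hashlib
from pathlib import Path

# Hypothetical constant the generator would bake into each generated module.
EXPECTED_SHA256 = "..."

def check_source_unchanged(slint_path: Path, expected: str) -> None:
    """Raise if the .slint file no longer matches what the module was generated from."""
    actual = hashlib.sha256(slint_path.read_bytes()).hexdigest()
    if actual != expected:
        raise RuntimeError(
            f"{slint_path} changed since code generation "
            f"(hash {actual[:12]}... != expected {expected[:12]}...); "
            f"re-run slint.codegen"
        )
```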
BTW, that python_type_name implementation is much better: it supports enums and user-defined structs/tuples directly. I plan to port it.
Why is api/python/compiler a downloader?? A maturin-powered pyproject.toml is enough for this case:
```toml
[build-system]
requires = ["maturin>=1.9,<2.0"]
build-backend = "maturin"

[project]
name = "slint-compiler"
requires-python = ">=3.10"
classifiers = [
    "Programming Language :: Rust",
    "Programming Language :: Python :: Implementation :: CPython",
]
dynamic = ["version"]

[tool.maturin]
bindings = "bin"
manifest-path = "Cargo.toml"
```
Why is api/python/compiler a downloader?? A maturin-powered pyproject.toml is enough for this case.
The idea was to re-use the same binaries we've already built for the release, instead of doing another build for every host combination for Python.
I don't feel too strongly about this, but I like that this makes it a very low maintenance solution.
Are there any downsides we should address by using dedicated wheels?
Are there any downsides we should address by using dedicated wheels?
No, but the current downloader is not friendly to mirrors, which matters in poor-network situations like mainland China. And, personally, I like to follow Occam's Razor: if we can get it from PyPI, there's no need for our script to download it again from GitHub. In my opinion this brings more trouble, e.g. the current downloader has to manually concatenate URLs. Anyway, maturin, cibuildwheel, and other smart developers have already taken care of these details for us, so why not take advantage of that?
I have drafted 35b6b9d (#9834) for this improvement. ~~Although it increases the carbon footprint, we all use Rust, so just don't worry about it.~~
None of the enums in internal/common are wired up... oh.
@tronical should we expose APIs from i_slint_common::for_each_builtin_structs! & i_slint_common::for_each_enums!(add_enum);?
No, but the current downloader is not friendly to mirrors, which matters in poor-network situations like mainland China.
That's a very compelling argument. Thanks for the explanation.
I have drafted 35b6b9d (#9834) for this improvement. ~~although it increased the carbon footprint, we all use Rust, so just don't worry about it.~~
I agree with your proposed solution. Using maturin's bin bindings sounds good to me.
Would you be able to turn that into a separate PR? I'm currently traveling, but I think I can help review next week.
@tronical should we expose APIs from i_slint_common::for_each_builtin_structs! & i_slint_common::for_each_enums!(add_enum);?
I don't think we expose these as-is in the other languages, so I don't think we should do so for Python.
(The Rust testing API and platform API are an exception, but developing custom backends is not in scope for the Python port right now.)
Or is there a specific enum that you have in mind?
@tronical #9870 is ready for you.
I don't think we expose these as-is in the other languages, so I don't think we should do so for Python.
Are you thinking that it's only necessary to let .slint access these APIs...?
Bindings in other languages should perhaps not be involved in the layout or style control that happens in .slint; these are all handled through callbacks, which effectively provide ViewModel/Controller capabilities. Is that right?
I'm just following this discussion.
In C++ we do generate public API for two of these enums, in https://github.com/slint-ui/slint/blob/33ab76918f49d5e90dcf6125c4d0d79d3aa6c13e/api/cpp/cbindgen.rs#L10
We could do the same for the enums that are meant to be used in python by the users.
Thanks for the contribution, by the way.
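On the Python side, exposing a curated subset could amount to mirroring each selected enum as a standard enum.Enum generated from the macro metadata. A hypothetical sketch (the enum name, members, and string values here are illustrative, not the generated API):

```python
import enum

class StandardButtonKind(enum.Enum):
    """Hypothetical Python mirror of a built-in Slint enum.

    In a real implementation the member list would be driven by
    i_slint_common::for_each_enums! at build time, for only the enums
    that are meant to be part of the public Python API.
    """
    Ok = "ok"
    Cancel = "cancel"
    Apply = "apply"
```

Lookup by the .slint-side string value then falls out of the standard `Enum` machinery (`StandardButtonKind("ok")`), which keeps conversion at the binding boundary trivial.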
I don't think we expose these as-is in the other languages, so I don't think we should do so for Python.
Are you thinking that it's only necessary to let .slint access these APIs...?
Yes
Bindings in other languages should perhaps not be involved in the layout or style control that happens in .slint; these are all handled through callbacks, which effectively provide ViewModel/Controller capabilities. Is that right?
Yes
Now it generates typing.Any for fields of built-in enum/struct types, while still generating the correct types for user-defined enums/structs.
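That fallback can be pictured as a small type-mapping step (an illustrative sketch of the idea, not the actual python_type_name implementation; the registry names are hypothetical):

```python
# Hypothetical registry of user-defined structs/enums discovered during codegen.
USER_DEFINED = {"TodoItem", "Priority"}

# Slint primitive types that map directly onto Python builtins.
PRIMITIVES = {"int": "int", "float": "float", "string": "str", "bool": "bool"}

def python_type_name(slint_type: str) -> str:
    """Return the Python type annotation to emit for a .slint type name."""
    if slint_type in PRIMITIVES:
        return PRIMITIVES[slint_type]
    if slint_type in USER_DEFINED:
        # User-defined structs/enums get their generated class as the hint.
        return slint_type
    # Built-in enums/structs are not exposed yet, so fall back to typing.Any.
    return "typing.Any"
```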
@tronical SlintEventLoop always exits when the last window closes, even when run_event_loop_until_quit is exposed and used.