Cython source file is not compiled after pixi install/update
Checks
- [X] I have checked that this issue has not already been reported.
- [X] I have confirmed this bug exists on the latest version of pixi, using `pixi --version`.
Reproducible example
`pixi run test` errors with:

```
FileNotFoundError: [Errno 2] No such file or directory: '~/fastcan/build/cp312'
```
Here is my repo
Issue description
The project contains some Cython code, which should be compiled when installing the default environment. The compiled files should be placed in the folder called `build`. Here are some findings about the `build` folder:
- When `.pixi/envs/` does not contain the `default` environment, `pixi install` will correctly generate the `build` folder.
- When `.pixi/envs/` contains the `default` environment but the `build` folder for some reason is missing, `pixi install` will not generate the `build` folder and only prints "The default environment has been installed."
- In a previous version of pixi (I can't remember the exact version number), `pixi update` would solve problem 2, but the latest version 0.26.1 cannot anymore.
- When I use `prefix-dev/[email protected]` for GitHub CI (cache enabled), the `build` folder is missing and cannot be regenerated by `pixi update`.
Expected behavior
When I run `pixi install` or `pixi update`, pixi should check whether the environment has properly installed the project (in editable mode), which contains Cython code. If the `build` folder is missing, pixi should notice and regenerate it.
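The expected check could look something like this sketch, assuming the meson-python layout visible in the traceback above (a `build/cp312` directory inside the repo); `editable_build_ok` is a hypothetical helper, not pixi code:

```python
from pathlib import Path


def editable_build_ok(project_root, py_tag="cp312"):
    """Hypothetical check: does the editable install's build dir exist?

    meson-python keeps its build tree in <repo>/build/<interpreter-tag>
    (e.g. build/cp312, as in the traceback above).
    """
    return (Path(project_root) / "build" / py_tag).is_dir()
```

With a check like this, `pixi install` could trigger a rebuild instead of only printing "The default environment has been installed."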
Does the issue persist for `editable = false`?
How do you expect an editable cython package to behave? Should that rebuild on install or what does it mean?
Hi! I think this is because uv became a bit smarter and now only builds the metadata of wheels during the solve. The workaround for now is to use `pixi clean && pixi install`. I don't know a really good way of fixing this yet, as we cannot really detect that this has happened, other than by adding exceptions for different build backends.
Something we could relatively easily implement is `pixi install --force` or something of the sort, which forces a re-install of everything (or maybe of specifiable packages). Would this help at all?
> Does the issue persist for `editable = false`?
If `editable = false`, the Cython source code will not be compiled. In other words, `pip install .` will install a Python module without compiling the Cython source code. As far as I know, the common practice is to compile Cython code in place, either using `python setup.py build_ext --inplace` (see ref) or `python -m pip install --no-build-isolation --editable .` (see ref).
> How do you expect an editable cython package to behave? Should that rebuild on install or what does it mean?
As it says here: "Please note that some kind of changes, such as the addition or modification of entry points or the addition of new dependencies, and generally all changes involving package metadata, require a new installation step to become effective." I think whenever the Cython source code changes or the package metadata changes, the package should be reinstalled.
> Something we could relatively easily implement is `pixi install --force` or something of the sort, that forces a re-install of everything (or maybe of specifiable packages), would this help at all?
I think it may not be practical to reinstall the entire environment whenever the Cython code changes, because during development the Cython code changes very often. It would be helpful to provide a CLI to reinstall the local package only.
Another thing is about using the `prefix-dev/[email protected]` GitHub Action. I think the local package should be automatically reinstalled even when caching is enabled.
We're currently always "checking" if the environment is up-to-date when you run pixi run/shell/install/etc. which would make a reinstall on all of those an unusable experience.
Would it make sense to have some special behavior for pixi install when using path dependencies? E.g. always reinstalling even if the cache is still valid. I'm not a fan of differences in behavior, but it might feel familiar for most people. But "familiar for most" is always a bad assumption so not sure what the best way forward would be.
Plus to solve the action cache problem, also rebuild on a clean prefix installation to skip the cache here.
> Plus to solve the action cache problem, also rebuild on a clean `prefix` installation to skip the cache here.
Simply setting `cache: false` will solve the problem, but then the whole environment gets reinstalled. The only way I have found to reinstall just the local package is to run

```shell
pixi run "uv pip install --no-deps --force-reinstall -e ."
```

However, pixi does not seem to pre-install any installer, such as pip or uv, so I have to install uv first via

```shell
curl -LsSf https://astral.sh/uv/install.sh | sh
```
Should pixi also support flags, such as `--no-deps` and `--force-reinstall`, to make reinstalling the local package easier?
I think all of these settings could make sense for pixi.
- `--no-deps`: For this we first have to modify `pixi install` to install only a given package.
- `--force-reinstall`: This should not be too hard, as we would just skip the cache.
We deliberately don't install any other package managers, as in theory there should be no reason to need extra installers. For these more demanding projects you can also install uv with `pixi add uv` for now.
`--no-deps` would still hit the cache though, I think, so it would not re-install. The problem is that a frontend only needs to rebuild if the metadata could have changed; as we do not really know what the backend does, we can't be sure.
The quote:

> "Please note that some kind of changes, such as the addition or modification of entry points or the addition of new dependencies, and generally all changes involving package metadata, require a new installation step to become effective."
So I think most of the time the C/C++ code does not change the metadata, so there is no need for a frontend to rebuild.
However, I think we could come up with a combination of flags that would work :)
With some more testing, I'm now kind of rethinking the whole thing. I've tried doing an install with uv, and while it does not seem to build the code, it actually brings back the `build` folder. But that `build` folder is not contained in the wheel, as expected. So it makes me wonder where it is cached.
We should have a closer look at what's happening here. @olivier-lacroix do you have any experience with mesonpy in this regard?
None whatsoever @tdejager :-/
Again, now that I'm thinking of it, it might be meson doing it itself.
~~Sorry, I'm thinking out loud here in this issue. But it works for uv if I add an empty `dynamic = []` to the pyproject.toml. However, that does not work for us; that is a bug 🐛 and we should fix it. Let me make an issue later.~~
uv also does not seem to rebuild when `dynamic` is added and the `build` folder is deleted. They do have a `--reinstall-package` that seems to work. It's still pretty slow.
@MatthewSZhang it would be great if you could check whether you like the flags for uv, i.e. whether they fit your workflow and behave as you would expect, if you have the time of course. You can successfully install your repo with uv in an activated pixi shell.
e.g.

```shell
# install uv, through github, homebrew, pip (pixi global install does not work as of now)
uv venv .venv
uv pip install -e .
```

And then make the modifications as you see fit.
Hi @tdejager, I have tried `uv pip install -e .`. After I delete the `build` folder, `uv pip install --no-deps --force-reinstall -e .` recompiles my Cython code to a `.so` file (`.pyd` for Windows) and puts it into the `build` folder at a reasonable speed. So I think it would be helpful if pixi supported the flags `--no-deps` and `--force-reinstall`. `-e` may be unnecessary, as we define whether a package is editable in the pyproject.toml, such as
```toml
[tool.pixi.pypi-dependencies]
mypk = { path = ".", editable = true }
```
So the resulting CLI is something like `pixi update --no-deps --force-reinstall mypk`, I suppose?
There are two things that may need to be clarified:
- If using `uv pip install --force-reinstall -e .`, it will be slow, as all the dependencies will be re-installed as well. The reference here says about `--force-reinstall`: "Reinstall package(s), and their dependencies, even if they are already up-to-date." Therefore, `--no-deps` is necessary.
- If `mypk` is written in pure Python and installed in editable mode, it is not necessary to reinstall it when only `.py` source code changes. So it is expected that `uv pip install -e .` will not reinstall the local package when the package metadata does not change. However, if `mypk` contains `.pyx` Cython code, then even if the package metadata does not change, `mypk` needs to be re-installed whenever a `.pyx` file changes, because `mypk` needs the compiled `.so` (`.pyd` for Windows) file to work properly. Of course, a `.pyx` file can be compiled manually, e.g. using `meson setup ..` and `meson compile ..`, or `python setup.py build_ext --inplace` for setuptools. But then I have to set the build directory manually (such as `build/cp312`) to match the configuration used by the installer like pip or uv (also pixi). So to me, reinstalling `mypk` via installers is better than compiling it manually.
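For illustration, the per-interpreter build directory name mentioned above (`build/cp312`) can be derived from the CPython version; `mesonpy_build_dir` is a hypothetical helper, and the naming is inferred from this thread rather than from meson-python's documented API:

```python
import sys


def mesonpy_build_dir(major=None, minor=None):
    """Sketch: the per-interpreter build dir name used for editable installs,
    e.g. 'build/cp312' for CPython 3.12 (naming inferred from this thread)."""
    if major is None:
        major, minor = sys.version_info[:2]
    return f"build/cp{major}{minor}"
```

This is the directory an installer-driven reinstall would recreate automatically, which is why matching it by hand is error-prone.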
Just chiming in on the possible API for this, as I had similar thoughts for docker layering purposes in https://github.com/prefix-dev/pixi/issues/1652#issuecomment-2254869822
I don't feel like a `--no-deps` would be very suitable for `pixi install`, which deals with the entire environment. I feel like having flags to exclude / only deal with path dependencies, like poetry does, and as @ruben-arts suggested above, is a more natural fit.
This being said, there may be some value in having a separate pixi command (e.g. `pixi force-install`?) to deal with a single / select package(s).
@olivier-lacroix I reacted to you with one comment on https://github.com/prefix-dev/pixi/issues/1652#issuecomment-2269081565.
> However, if `mypk` contains `.pyx` Cython code, then even if the package metadata does not change, `mypk` needs to be re-installed whenever a `.pyx` file changes.
This is not the case: meson-python will auto-rebuild all extension modules whose source files (.pyx, .c, .h, etc.) changed when the package is imported. The way that works is via import hooks, which call ninja during import. Hence you don't have to re-run `pip install -e . --no-build-isolation`. See https://mesonbuild.com/meson-python/how-to-guides/editable-installs.html for more explanation on that.
I'll note that editable installs have various limitations, some of which are fundamental and some backend-specific. The need to re-run the build/install command for extension modules manually is a setuptools-specific limitation; both meson-python and scikit-build-core are able to auto-rebuild.
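That backend-specific difference could be captured in a small lookup table; this is only a sketch based on the statement above, and `needs_manual_reinstall` is a hypothetical name:

```python
# Capability table assumed from the comment above: setuptools editable
# installs need a manual re-install after extension sources change, while
# meson-python and scikit-build-core auto-rebuild at import time.
AUTO_REBUILD = {
    "meson-python": True,
    "scikit-build-core": True,
    "setuptools": False,
}


def needs_manual_reinstall(backend):
    """True if changing a .pyx/.c source requires re-running
    `pip install -e . --no-build-isolation` for this backend."""
    return not AUTO_REBUILD.get(backend, False)
```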
> I have tried `uv pip install -e .`
Note that if you're using meson-python, adding --no-build-isolation is essential (unless pixi does that under the hood already to ensure picking up the build deps from the pixi-created env, rather than pip going off and creating an isolated venv).
Please feel free to ping me on meson-python related questions/issues (I'm a meson-python maintainer).
Hi Ralf!
Thank you for this response, it's great to see you here in this repository :)
So are you saying, if I'm reading the documentation correctly, that if you throw away the built extension (in this issue's case, the `build` folder), it should be re-created when importing the Python code?
Happy to be here! Quite liking my first forays into Pixi land :)
> that if you throw away the built extension (in this issue's case, the `build` folder), it should be re-created when importing the Python code?
Not quite. That will actually give you a confusing error, the error message for which is hard to improve because of how Traversable works. What an editable install produces is:
1. the hook in `<prefix>/lib/python3.xx/site-packages`, named `pkgname-editable.pth` (also there'll be a `pkgname-x.y.z.dist-info`)
2. the `build` directory, which is stored locally inside the repo

If you remove (2) but keep (1), then an `import pkgname` will still find the `pkgname-editable.pth` hook, which (a) tries to look up the build dir and (b) rebuilds any targets that are out of date. Step (a) will already fail, and since the now-missing build dir also contains all the metadata needed to rebuild (e.g., `build.ninja`), there is no way to recover.
If you remove the build dir (e.g., with `git clean -xdf` or `rm -rf build`), then you have to rerun `pip install -e . --no-build-isolation`, which rebuilds everything from scratch. Of course, it's best never to delete the build dir; there is no reason to. And if you do, it's best to also run `pip uninstall mypkg` so as not to leave the editable install hook hanging around in site-packages.
I'd imagine that if you only do `rm build/cp311/mypkg/my_ext.so` then yes, it will be recreated correctly, but I don't really see that happening in practice.
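The states described above (hook in site-packages vs. build dir in the repo) can be summarized in a small sketch; `editable_state` and the state names are my own shorthand, not meson-python API:

```python
def editable_state(hook_present, build_dir_present):
    """Classify an editable meson-python install from the two artifacts
    described above: the .pth import hook and the in-repo build dir."""
    if hook_present and build_dir_present:
        return "ok"           # import hook finds the build dir; ninja can rebuild
    if hook_present:
        return "broken"       # .pth hook fires, but build.ninja etc. are gone
    if build_dir_present:
        return "stale-build"  # build dir left behind after an uninstall
    return "clean"            # nothing installed; rerun pip install -e .
```

The "broken" state is the unrecoverable one from this thread: the hook fires on import but the metadata needed to rebuild is gone, so the only fix is a fresh `pip install -e . --no-build-isolation`.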
Hi Ralf!
Thanks for the clarification, this is very helpful.
I'm not sure what in the original workflow by the OP results in a missing build dir.
However, with that said and given your details, I do think that an explicit re-install of that package is the right way to go, which will essentially do what you suggested: uninstall and re-install. I suppose the PEP 517 `build_wheel` hook needs to be called again as well, because getting the built wheel from a cache will still not give you back the `build` folder. Is that correct :)?
Also as an aside @rgommers if there is any way we can improve the experience of scientific Python users that you see, please make as many issues as you'd like 👍😁
Yes, that sounds right, assuming you know that the build folder is actually missing (always doing a full rebuild is very expensive). I think that you don't have to worry about the wheel cache; AFAIK pip and uv will not cache wheels from editable installs.
> if there is any way we can improve the experience of scientific Python users that you see, please make as many issues as you'd like 👍😁
Will do, thank you! So far it's been a smooth ride, but just getting started (some progress at https://github.com/rgommers/pixi-dev-scipystack). Hopefully I have learned enough to ask intelligent questions in a couple of weeks at EuroSciPy.
Hi @rgommers Thanks for your explanation and I understand meson better now :)
Hi @tdejager
> I'm not sure what in the original workflow by the OP results in a missing build dir.
The build dir gets lost when using the GitHub Action `prefix-dev/[email protected]` with `cache: true` set. It seems pixi does not cache the build dir.
I'm encountering a similar issue. I'm working on a C++ and Python project using scikit-build-core as the build backend. When I make changes to the C++ source code, running `pixi install` doesn't recompile the `.so` files. I've tried running `pixi run "uv pip install --no-deps --force-reinstall -e ."` as well as `pixi clean && pixi install`, but neither approach works. Previously, using `pip install -e .` would rebuild the entire package as expected.

The only way I've found to get pixi to reinstall my package is by toggling the `editable` flag in pyproject.toml between `true` and `false`.
> I'm encountering a similar issue. I'm working on a C++ and Python project using scikit-build-core as the build backend. When I make changes to the C++ source code, running `pixi install` doesn't recompile the `.so` files. I've tried running `pixi run "uv pip install --no-deps --force-reinstall -e ."` as well as `pixi clean && pixi install`, but neither approach works. Previously, using `pip install -e .` would rebuild the entire package as expected. The only way I've found to get pixi to reinstall my package is by toggling the editable flag in pyproject.toml between true and false.
I think it should be `--reinstall` for uv and not `--force-reinstall`. In any case, I plan on working on a PR next week to be able to re-install and refresh (which should trigger rebuilds) PyPI packages.
Seems like `--reinstall` has already been implemented for `uv pip install`.
Can Pixi handle this functionality?
Yeah you are right, we should start including this!
Do you have any updates? I would greatly appreciate it if you could manage this, as it would free me from the tedious routines of pixi clean and pixi install for Cython code development.
Makes sense, I'll update the issue once I know. Sorry for the delays!
So what I'm thinking:
- `--reinstall-package <pkg-name>` to re-install a specific package.
- `--reinstall` to re-install everything.
Re-installing implies a refresh for the dependencies managed by uv, and that could be something we could add for conda later as well. Refresh meaning the refreshing of the cached metadata.
Re-installing also implies the rebuilding of both the conda and PyPI source packages.
Options to specifically refresh but not per se re-install for conda based source and binary packages is something that we can add CLI flags for at a later stage as well.
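A minimal argparse sketch of the proposed flags, assuming the names above (`--reinstall`, `--reinstall-package`); this is a hypothetical mock-up, not pixi's real CLI:

```python
import argparse

# Hypothetical sketch of the proposed flags; not pixi's actual interface.
parser = argparse.ArgumentParser(prog="pixi-install-sketch")
parser.add_argument("--reinstall", action="store_true",
                    help="re-install (and rebuild) everything")
parser.add_argument("--reinstall-package", action="append", default=[],
                    metavar="PKG", help="re-install a specific package")

# Re-install only the local editable package, leaving the rest cached.
args = parser.parse_args(["--reinstall-package", "mypk"])
```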
This is my plan for implementation:
- [x] Implement uv backend reinstallation
- [x] Implement reinstallation in rattler.
- [x] Expose in `pyrattler` bindings as well.
- [x] Rattler release and update in pixi
- [x] Wire this throughout the code
- [x] Needs some good integration test somehow.
Sketch of functionality: