Too-strict version check causes problems for wheels
A typical use case for `capnpy` involves the following chains of requirements (`lib` and `app` are Python packages):

- `capnpy` (from `app`)
- `capnpy` (from `lib->app`)

i.e. `lib` contains a `.capnproto` schema (and so has `capnpy` as a compile-time requirement) and `app` makes use of the compiled schema provided by `lib` (and so has both `capnpy` and `lib` as run-time requirements).
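To make the setup concrete, here is a minimal sketch of how the two packages might declare their requirements. The `setup()` fields, the unconstrained `capnpy` specs and the version numbers are invented for illustration; they are not taken from any actual project.

```python
# Sketch of lib's setup.py: capnpy is both a build-time and a run-time dependency.
from setuptools import setup

setup(
    name="lib",
    version="1.0",
    setup_requires=["capnpy"],    # needed to compile the .capnproto schema
    install_requires=["capnpy"],  # needed at run time by the compiled module
)
```

```python
# Sketch of app's setup.py: app needs both lib and capnpy at run time.
from setuptools import setup

setup(
    name="app",
    version="1.0",
    install_requires=["capnpy", "lib"],
)
```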
Schema compilation can be slow, so it is desirable to publish a wheel for `lib`. If so, then compile-time != run-time in general, and the versions of `capnpy` used by `lib` and `app` may be different, but `util.check_version` insists that these versions be identical. This means that new releases of `capnpy` will break the existing wheel(s) for `lib`, and require new versions to be built and published with the latest `capnpy`.
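As I understand it, the check boils down to an exact comparison between the `capnpy` version recorded when `lib`'s schema was compiled and the `capnpy` version installed alongside `app`. The following is only a paraphrase of that behaviour, not capnpy's actual `util.check_version` code; the name `compiled_with` and the use of `capnpy.__version__` are assumptions.

```python
# Paraphrase of the failure mode, NOT capnpy's actual util.check_version code.
import capnpy

def check_version(compiled_with):
    # `compiled_with` stands for the capnpy version recorded in lib's wheel at
    # build time; the installed version is assumed to be capnpy.__version__.
    installed = capnpy.__version__
    if compiled_with != installed:
        # Exact equality is required, so any new capnpy release breaks old wheels.
        raise RuntimeError(
            "schema compiled with capnpy %s, but capnpy %s is installed"
            % (compiled_with, installed)
        )
```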
Pinning the `capnpy` version in `lib` unfortunately is not a solution, as `pip` is too dumb to notice the conflict; it first installs the unconstrained version requested by `app`, and later happily reports "Requirement already satisfied" when being asked to install the pinned version required by `lib`. A possible workaround would be to omit `capnpy` from the requirements of `app` altogether, and rely on the `lib` requirement to bring in the pinned version, but this seems like bad practice.
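Concretely, the pin that does not work looks something like this; the version number is invented.

```python
# lib/setup.py with an exact pin (version number invented for illustration).
from setuptools import setup

setup(
    name="lib",
    version="1.0",
    install_requires=["capnpy==0.8.0"],
)

# app still declares an unconstrained "capnpy"; pip installs the newest capnpy
# for app first and then answers "Requirement already satisfied" for lib's pin,
# so the exact pin is never enforced.
```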
Possible solutions I can see:

- Provide compatibility (between minor releases), such that `app` and `lib` can both specify `capnpy<N+1.0.0`. This really just reduces the frequency of breakages, but may be good enough (see the sketch after this list).
- Bundle the "incompatible" parts of the `capnpy` run-time library into wheels at compile-time. I have no idea how hard this would be!
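To make the first option concrete: if `capnpy` guaranteed compatibility between minor releases within a major version, both packages could declare a range instead of relying on exact equality. N=1 here is an invented example.

```python
# Hypothetical requirement spec under a "compatible within major version 1" policy.
install_requires = [
    "capnpy>=1.0,<2.0",  # roughly the same as the compatible-release spec capnpy~=1.0
]
```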
I agree that it's a problem, but I don't see any easy solution:

- this could work, but only if I find a way to actually test that I didn't break anything between minor versions. I don't trust myself enough to be sure I won't make a mistake :sweat_smile: . And the problem is that "a mistake" could easily lead to a segfault
- this is probably too hard: AFAIK, cython does not really provide a way to statically link the external code
I suggest a third approach which might be worth investigating: what about modifying the output wheel after it has been generated, and "fix" the required capnpy version there?
Do you mean e.g. in `app.X.Y.Z.dist-info/metadata.json`? This doesn't work, at least if you do the pinning in the `setup.py` of `lib` (`pip` is too dumb; https://github.com/pypa/pip/issues/988). Do you have any reason to expect `pip` to behave more sensibly if we bypass `setup.py` and hack the version pinning somewhere else in the wheel?
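For concreteness, the pin from `setup.py` ends up as a `Requires-Dist` line in the wheel's `*.dist-info/METADATA`, so "fixing the wheel" would amount to rewriting that line after the build. A small sketch of inspecting it with the standard library; the wheel filename and the example output are invented.

```python
# Sketch: list the Requires-Dist lines inside a built wheel (filename invented).
import zipfile

with zipfile.ZipFile("lib-1.0-cp38-cp38-linux_x86_64.whl") as whl:
    metadata_name = next(n for n in whl.namelist() if n.endswith(".dist-info/METADATA"))
    metadata = whl.read(metadata_name).decode("utf-8")

for line in metadata.splitlines():
    if line.startswith("Requires-Dist:"):
        print(line)  # e.g. "Requires-Dist: capnpy (==0.8.0)"
```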
oh sorry, I didn't understand the actual problem at first. Indeed, if `pip` is so dumb, my solution doesn't work.
But then I don't really know how to solve it :disappointed: . It seems really a `pip` fault to me (not very helpful, I know)
There is a beta of a new dependency resolver in pip 20.2 -- perhaps you could see whether that helps?
pip 20.3 has the new dependency resolver on by default; please see the documentation on how to test and migrate in case it helps you address this problem.