rattler
Solve for the "lowest" versions of specs
In some discussion I heard that Cargo can resolve against the earliest versions of all the specs (its `-Z minimal-versions` flag) to verify that the lower bounds are correct.
I think that could be a cool feature to have under some flag. This would allow users to make sure (e.g. in a test) that their software still works with the minimum supported version of NumPy etc.
My interpretation is that this would let the user make sure the stated minimums actually support their software, by solving a different matchspec in which every constraint that implies a lower bound is converted to another type of constraint, while everything else is preserved. My question would be: what should the behavior be for each kind of constraint?
Would a spec be "expanded" to have defaults? E.g. would `>=1.2` test for the minimum with the spec `==1.2.0.0.0` (or with as many `.0`s/defaults as needed to fully specify the version)?
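To make the "expansion" interpretation concrete, here is a minimal Python sketch (not part of rattler; the helper name and segment count are made up for illustration) that pads a `>=` lower bound with zeros into an exact pin:

```python
def expand_lower_bound(spec: str, segments: int = 3) -> str:
    """Hypothetical sketch: turn a `>=X` lower-bound spec into a
    fully specified exact pin padded with zeros,
    e.g. `>=1.2` -> `==1.2.0` when `segments` is 3."""
    assert spec.startswith(">="), "sketch only handles >= constraints"
    parts = spec[2:].split(".")
    # Pad with "0" components until the version is fully specified.
    parts += ["0"] * (segments - len(parts))
    return "==" + ".".join(parts)

print(expand_lower_bound(">=1.2"))      # -> ==1.2.0
print(expand_lower_bound(">=1.19", 4))  # -> ==1.19.0.0
```

One open question with this approach is what `segments` should be, since conda versions have no fixed number of components.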
My original interpretation is that you use the exact same specs but flip the priority so that lower version numbers are favored, and then you'd get a single solution with low versions for at least most things.
I can see the value in both interpretations. I'm really curious what others think.
I think it could be hard or even impossible to solve with all packages "pinned" to their lowest versions (e.g. you might depend on two packages with conflicting lower bounds).
For that reason, the way I see it working is to simply reverse the sorting and try the lowest versions first.
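The "reverse the sorting" idea can be sketched in a few lines of Python. This is not rattler's actual API, just an illustration under the assumption that the solver orders candidate versions highest-first by default and a lowest-version mode only flips that ordering:

```python
def order_candidates(versions: list[str], lowest_first: bool = False) -> list[str]:
    """Hypothetical sketch of candidate ordering: sort versions
    numerically ascending, then reverse for the default
    highest-first behavior."""
    ascending = sorted(versions, key=lambda v: tuple(int(p) for p in v.split(".")))
    return ascending if lowest_first else list(reversed(ascending))

candidates = ["1.19.5", "1.21.0", "1.20.3"]
print(order_candidates(candidates))                     # highest first (default)
print(order_candidates(candidates, lowest_first=True))  # lowest first
```

The specs themselves stay untouched; only the order in which the solver tries candidates changes, so conflicting lower bounds are still resolved normally.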
Yeah, that's also the approach I would take. It should also be relatively simple to implement.
I like it; easier-to-specify behavior leads to less surprise, and simple to implement is nice too. I can spend some time working on that later next week. Would it make sense to notify the user if the solve's lowest versions are higher than what their matchspec implies?
> Would it make sense to notify the user if the solve's lowest versions are higher than what their matchspec implies?
In my view this wouldn't be terribly helpful as default behavior. For me, the lower bounds I include in my dependency specifications typically indicate the feature set that my particular project uses. For instance, maybe I use Pandas 2.0, which depends on NumPy 1.20.3. If my code uses NumPy 1.19 features then I'll include `numpy >=1.19`. I don't care that the `pandas >=2` pin is stricter. I want the solver to solve my specs, not write my specs.
A possible exception would be if I'm actively trying to support `python >=3.7` but I've pinned `pandas >=2`, which requires `python >=3.8`. Then it'd be nice to know that it's time to give up on Python 3.7. But if I'm actively trying to support Python 3.7, then I'll have a CI run dedicated to testing it, so I don't actually find this argument so convincing.