uv
Upgrade dependencies in `pyproject.toml` (`uv upgrade`)
This will update the version pins in the `pyproject.toml` file.
prior art:
- deno: https://deno.com/blog/v2.2#dependency-management
- poetry: https://github.com/MousaZeidBaker/poetry-plugin-up
- npm: https://docs.npmjs.com/cli/v10/commands/npm-update
- npm-check-updates: https://www.npmjs.com/package/npm-check-updates
```shell
> uv upgrade
upgraded 10 dependencies in 0.001 seconds 🚀
```
Why? When I want to update all the dependencies in a project, it can be tedious to manually search for and update each and every dependency. Additionally, for application projects (not libraries), I like being able to skim the pyproject file for an easy overview of which dependency versions are actually installed, not just some possible range.
Does uv sync cover this use case?
Also, I'm pretty sure dependencies will also be updated during every uv run call by default.
I don't think `uv sync` can bump the pinned versions of every dependency to the latest available version, which I believe is what the issue creator is asking about. `uv sync` just downloads packages that match the pin; it doesn't change the pin.
This topic always ends up being confusing, people have different definitions of what it means to update or upgrade a dependency. 😅
Ahh, yes uv sync updates the environments but not the install specs.
Would it be recommended not to pin exact dependencies in `pyproject.toml`, and instead use the lock file for tracking that?
pyproject.toml should track compatibility and uv.lock should track exact versions for deployments/reproducibility.
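That split can be spelled out with a toy check. This is only an illustration with hypothetical versions and a crude dotted-integer comparison, not full PEP 440 ordering:

```python
# pyproject.toml carries a compatibility range that you edit by hand;
# uv.lock records the exact version resolved within that range.
# Hypothetical versions; crude tuple comparison, not full PEP 440.
def parse(version):
    return tuple(int(part) for part in version.split("."))

compat_minimum = "0.27"    # pyproject.toml: httpx>=0.27
locked_version = "0.28.1"  # uv.lock: httpx==0.28.1

assert parse(locked_version) >= parse(compat_minimum)
print(f"httpx=={locked_version} satisfies httpx>={compat_minimum}")
# → httpx==0.28.1 satisfies httpx>=0.27
```

Note that a plain string comparison would get this wrong for versions like `0.9` vs `0.10`, which is why the components are parsed as integers.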
I suspect a lot of teams have a similar workflow to my team, which most project tooling has poor support for.
In the pre-Dependabot days, once a month or so we'd open up the project file on one screen and PyPI on the other, then manually bump all the versions to the latest we could find. Then we'd install everything, read release notes, check what broke, and perhaps go back a version or two for some of the dependencies (which is why unpinning all the deps wouldn't work). After that we'd generate a new lock file.
Having an upgrade command would remove the manual labor of having to look up all the versions and editing the file for each individual dependency.
Poetry has a nice workflow by using `poetry show --outdated` rather than opening PyPI.
`uv upgrade-interactive` would also be great, like yarn (v1)'s.
Use `uv sync -U` to update packages.
Although I will note that it would be nice to be able to see which packages are outdated, similar to `pip list --outdated` or `poetry show --outdated`.
> Although I will note that it would be nice to be able to see which packages are outdated,
That's tracked in #2150.
Like @KotlinIsland said, especially for applications it's really nice to be able to browse narrow ranges of your top level dependencies within pyproject.toml and have a tool automatically bump them for you so you don't have to go to PyPI and look at each top level dependency to see what the latest version is.
It's also important to be able to skip dependencies that the user has pinned to a specific version in pyproject.toml so that if you know you can't upgrade past a version, you can pin it in pyproject.toml until the issue is resolved upstream.
The old poetryup plugin had a `--skip-exact` flag for this:
- https://github.com/MousaZeidBaker/poetryup?tab=readme-ov-file#usage
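A minimal sketch of that `--skip-exact` behaviour: raise `>=` lower bounds to the latest known version while leaving `==` pins untouched. Only simple `name>=x` / `name==x` specifiers are handled here, and the `latest` mapping is a stand-in for data a real tool would fetch from an index:

```python
import re

def bump_minimums(deps, latest):
    """Bump '>=' lower bounds to the latest known version, skipping '==' pins."""
    out = []
    for dep in deps:
        match = re.match(r"^([A-Za-z0-9_.-]+)\s*(==|>=)\s*([\w.]+)$", dep)
        if match and match.group(2) == ">=" and match.group(1) in latest:
            out.append(f"{match.group(1)}>={latest[match.group(1)]}")
        else:
            out.append(dep)  # exact pins (and anything unparsed) stay as-is
    return out

print(bump_minimums(["httpx>=0.27", "numpy==1.26.4"],
                    {"httpx": "0.28.1", "numpy": "2.0.0"}))
# → ['httpx>=0.28.1', 'numpy==1.26.4']
```

A production tool would parse full PEP 508 requirements (extras, markers, compound specifiers) instead of this regex, but the skip-exact rule itself is just this branch.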
For more prior art, see npm-check-updates
- https://www.npmjs.com/package/npm-check-updates
It has a nice output UI.
(Long) discussion of the same feature requested in Poetry:
- https://github.com/python-poetry/poetry/issues/461
I agree this would be a really helpful feature, especially useful in PRs for tracking version increments. It's a lot easier to see these in pyproject.toml than in a lock file.
I'm also a big fan of yarn's `upgrade-interactive` for interactively updating dependency versions in package.json and would love to see similar uv functionality for updating dependency versions in pyproject.toml.
I use this tiny script. Works like a charm.
Is this intended to handle the following use case?
```toml
dependencies = [
    "httpx==0.27",
]
```

After running `uv upgrade`:

```toml
dependencies = [
    "httpx==0.28",
]
```

(where 0.28.1 is the latest version of httpx as listed in `uv tree --outdated`)
Would it require something like `uv upgrade --outdated [<package>]` or `uv upgrade --override [<package>]` or similar?
Currently, even an ugly hack like

```shell
uv tree --outdated -d1 | grep "latest" | sed -E 's/.*── ([^ ]*) .*latest: v(.*)\)/\1==\2/' | xargs -I {} uv lock --upgrade-package {}
```

isn't working, because `uv lock --upgrade-package` still attempts to satisfy the existing constraints. And the `uv remove`/`uv add` workaround doesn't address handling different dependency groups.
@wgordon17

> [The] `uv remove`/`uv add` workaround doesn't address handling different dependency groups.
You might be interested in @KotlinIsland’s improved version, as this should handle dependency groups.
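For reference, the remove/re-add trick with group support can be sketched as below. This is a hypothetical helper, not the actual script linked above; it assumes `uv` is on PATH and uses the `--group` flag of `uv add`/`uv remove`:

```python
import subprocess

def readd_commands(packages, group=None):
    """Build the `uv remove` / `uv add` command pairs for each package."""
    flags = ["--group", group] if group else []
    commands = []
    for pkg in packages:
        commands.append(["uv", "remove", *flags, pkg])
        # `uv add` writes the latest version bound back into pyproject.toml
        commands.append(["uv", "add", *flags, pkg])
    return commands

def readd(packages, group=None):
    for cmd in readd_commands(packages, group):
        subprocess.run(cmd, check=True)

# readd(["httpx", "numpy"])        # default dependencies
# readd(["pytest"], group="dev")   # a dependency group
```

The obvious downside, as noted in the thread, is that remove/re-add discards any constraints you had deliberately written into pyproject.toml.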
Would this count as using uv?

```shell
uvx pdm update --unconstrained
```
🤔 Does --unconstrained then write the difference back to pyproject.toml, i.e., the updated packages? Or is pyproject.toml now lagging behind?
> Would this count as using uv?
> `uvx pdm update --unconstrained`
PDM is excruciatingly slow...
The current workaround with a script that uses uv to remove and re-add deps is many times faster.
+1
Any update on this feature request? We also used to rely on the poetry up plugin which worked well.
This is really useful for top-level dependencies, we typically care about specific versions of these packages and it's very helpful to be able to bump this without having to manually go check if there's an update available for a given package.
Please don't ask for updates (see #9452) — we definitely post updates if we have them.
We haven't started working on this. It's obviously high impact / priority, but so are lots of other things.
The script posted above has been useful to me, but it ignores pins and bounds on purpose, whereas I was looking for something where I could selectively opt out of upgrades using the existing pyproject.toml syntax (like the poetryup `--skip-exact` argument). That is: specify dependency versions with `>=` by default, and add pins or upper bounds where needed, while the tool handles bumping the minimum versions on `>=`. So I coded up a small tool for the time being to do this, in case anyone finds it useful: https://github.com/zundertj/uv-bump
@wgordon17

> 🤔 Does `--unconstrained` then write the difference back to `pyproject.toml`, i.e., the updated packages? Or is `pyproject.toml` now lagging behind?
pyproject.toml is updated (overwritten) with the latest resolved versions of the pinned packages.
From `pdm update -h`, for `--unconstrained`:

```
-u, --unconstrained   Ignore the version constraints in pyproject.toml and overwrite with new ones from the resolution result
```
@zanieb

> Please don't ask for updates (see #9452) — we definitely post updates if we have them.
>
> We haven't started working on this. It's obviously high impact / priority, but so are lots of other things.
One could also say that this feature would be better handled by an external tool, like Renovate. Such a tool can update dependencies more broadly, and in more advanced ways. Work on uv's end towards a universal lockfile or other such changes is more fruitful than crafting a CLI for this.
As a stopgap, do people just use the lock file for PR reviews of dependency changes?
If you specify all your (direct) dependencies in pyproject.toml using pins (i.e. `==`), then the lock file can only change if someone has run `uv sync --upgrade`, since by default `uv sync` will not upgrade if the current lock file satisfies the requirements in pyproject.toml; and even then only dependencies of dependencies would move. If a PR comes in and updates a pinned version, only that package's version will change. So you would not need to review the lock file for your direct dependencies; pyproject.toml is sufficient.
On the other hand, if not all versions are pinned in pyproject.toml, all bets are basically off. You may get upgrades and/or downgrades, and the lock file is the place where you can see what is happening in the PR. Unfortunately, because of its contents, it is not an easy-to-read summary of "package A moved from version Y to Z, package B was removed, package C was added with version X".
I don't think it is too bad right now when it comes to reviewing PRs, though; keeping pyproject.toml up to date is the more painful part.
> For more prior art, see npm-check-updates
> - https://www.npmjs.com/package/npm-check-updates
I'm a fan of npm-check-updates personally, especially because you can filter via `-t` for major, minor, and patch changes at direct and transitive layers and bump at the semver tiers, which for application development is quite nice. I originally opened https://github.com/astral-sh/uv/issues/2745 with the desire of one day having semver filtering in uv for this kind of purpose.
For poetry, I've used a combination of `poetry show --outdated` with `-T` to filter for direct dependencies. I've also used pip-check-updates in the past with `poetry export` and uv, but nothing quite like the ergonomics of ncu exists to my knowledge.
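The tier filtering that ncu's `-t` offers boils down to classifying each available upgrade. A rough sketch, assuming plain `X.Y.Z` versions (no pre-releases or PEP 440 edge cases):

```python
def upgrade_tier(current, latest):
    """Classify an upgrade as 'major', 'minor', 'patch', or 'none'.
    Assumes plain X.Y.Z versions; hypothetical helper for illustration."""
    cur = [int(part) for part in current.split(".")]
    new = [int(part) for part in latest.split(".")]
    if new[0] != cur[0]:
        return "major"
    if new[1] != cur[1]:
        return "minor"
    if new[2] != cur[2]:
        return "patch"
    return "none"

print(upgrade_tier("1.4.2", "2.0.0"))  # → major
print(upgrade_tier("1.4.2", "1.5.0"))  # → minor
```

A tool could then apply only upgrades at or below the tier the user asked for, which is the behaviour being requested for uv.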
Hi all, as a temporary solution I've written a small program to sync the minimum versions of top-level dependencies. Please note that since it's intended as a temporary solution, it's not very robust or feature-rich, and there are a few known issues I haven't had time to solve.
Anyway, hope it might be helpful! Feel free to check it out: https://github.com/kedvall/pysync.
> I suspect a lot of teams have a similar workflow to my team, which most project tooling has poor support for.
>
> In the pre-Dependabot days, once a month or so we'd open up the project file on one screen and pypi on the other, then manually bump all the versions to the latest we could find. Then we'd install everything, read release notes, check what broke, and perhaps go back a version or two for some of the dependencies (which is why unpinning all the deps wouldn't work). After that we'd generate a new lock file.
>
> Having an upgrade command would remove the manual labor of having to look up all the versions and editing the file for each individual dependency.
For anyone using nvim, there is a completer that might speed up this process called cmp-pypi, which will autocomplete available versions of a package read from the PyPI index.
Basically it just curls the index for the package and pulls out the available versions.
I recently improved it by adding the ability to search the optional dependencies.
Note: it doesn't currently sort into version order.
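Since the sorting is the missing piece: a plain string sort puts "0.10.0" before "0.9.2", so the components need to be compared as integers. A quick sketch, handling dotted-integer versions only (pre-release segments like `1.0rc1` would need real PEP 440 parsing):

```python
def version_key(version):
    """Sort key for dotted-integer version strings; non-numeric parts dropped."""
    return tuple(int(part) for part in version.split(".") if part.isdigit())

versions = ["0.9.2", "0.10.0", "0.2.1"]
print(sorted(versions, key=version_key))
# → ['0.2.1', '0.9.2', '0.10.0']
```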
So, I used a variation of the script discussed above for this. But now I am working inside a monorepo with multiple uv packages, and writing a script for this has become increasingly complex.
I think this has to be part of the built-in uv command set, and I frankly don't see why it should be complex; it's rather small.
The question is: is there interest and willingness from the uv maintainers? If there is, I or someone else can contribute this.