llm loses track of plugins when upgraded (with uv and others)
This may not be an llm issue, but could have to do with the interaction of the plugin mechanism with Python packaging tooling.
I have llm installed (on macOS) via `uv tool install llm`. When I update llm with `uv tool upgrade llm`, it forgets about all of its installed plugins:
```
~ > llm plugins
[
  {
    "name": "llm-claude-3",
    "hooks": [
      "register_models"
    ],
    "version": "0.4.1"
  },
  {
    "name": "llm-llamafile",
    "hooks": [
      "register_models"
    ],
    "version": "0.1"
  },
  {
    "name": "llm-perplexity",
    "hooks": [
      "register_models"
    ],
    "version": "0.9"
  },
  {
    "name": "llm-ollama",
    "hooks": [
      "register_commands",
      "register_models"
    ],
    "version": "0.5.0"
  },
  {
    "name": "llm-cmd",
    "hooks": [
      "register_commands"
    ],
    "version": "0.2a0"
  },
  {
    "name": "llm-claude",
    "hooks": [
      "register_models"
    ],
    "version": "0.4.0"
  },
  {
    "name": "llm-mistral",
    "hooks": [
      "register_commands",
      "register_embedding_models",
      "register_models"
    ],
    "version": "0.5"
  }
]
```
Now I update:
```
~ > uv tool upgrade llm
Modified llm environment
 - anthropic==0.34.2
 - charset-normalizer==3.3.2
 - filelock==3.16.0
 - fsspec==2024.9.0
 - httpx-sse==0.4.0
 - huggingface-hub==0.24.7
 - llm-claude==0.4.0
 - llm-claude-3==0.4.1
 - llm-cmd==0.2a0
 - llm-llamafile==0.1
 - llm-mistral==0.5
 - llm-ollama==0.5.0
 - llm-perplexity==0.9
 - ollama==0.3.3
 - packaging==24.1
 - prompt-toolkit==3.0.47
 - pygments==2.18.0
 - requests==2.32.3
 - tokenizers==0.20.0
 - urllib3==2.2.3
 - wcwidth==0.2.13
```
Now, no more plugins:
```
~ > llm plugins
[]
```
Having saved the JSON of plugins, I can re-install them, and it appears that everything is installed from the cache. Is this because `llm install` is using pip or something similar, and uv handles things differently?
You are right, llm installs plugins with pip into the same environment it runs in. uv does not know about these dependencies and will remove them when upgrading the llm package.
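To make the failure mode concrete: installing "into the same environment it runs in" means running pip against the interpreter that llm itself runs under. The sketch below is illustrative only (not llm's actual code); it just shows why uv's tool receipt never learns about these packages.

```python
import sys


def plugin_install_command(package: str) -> list[str]:
    """Build a pip invocation targeting the running interpreter's environment.

    Packages installed this way land in the tool's own virtualenv, but uv's
    receipt for the tool only records what uv installed itself - so a later
    `uv tool upgrade llm` rebuilds the environment without them.
    """
    return [sys.executable, "-m", "pip", "install", package]


print(plugin_install_command("llm-ollama"))
```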
A workaround is to install llm plugins as additional Python dependencies with `uv tool install --extra <llm-plugin> llm`, as documented here.
One caveat, though: you cannot add further dependencies incrementally this way. You always have to provide the full list of extra dependencies with multiple --extra options.
EDIT: fix uv tool install command
> A workaround is to install llm plugins as additional Python dependencies with `uv tool install --extra <llm-plugin> llm`, as documented here.
Use `--with` rather than `--extra`. For example, to install llm with the Anthropic, Deepseek and Gemini plugins:
```
uv tool install --with llm-anthropic --with llm-deepseek --with llm-gemini llm
```
Just released a new plugin that (hopefully) solves this: llm-uv-tool.
I want to fix this in LLM itself. I think the answer is to write to a `plugins.txt` file in the `llm.user_dir()` folder every time a plugin is installed or uninstalled with `llm install` / `llm uninstall`. Then I can have a mechanism somewhere (maybe on startup?) which notices if plugins that were listed there have gone missing.
I'm not sure what to do next though. Automatically re-installing them feels a little surprising. Maybe prompt the user about it, perhaps in a warning?
There could be an `llm reinstall` command that actually does the work to reinstall them.
Maybe the trigger for the message is if suddenly NONE (or ~90%) of the plugins in that file are available?
See also:
- #1113
Some miscellaneous thoughts:
- Recording which plugins have been installed is obviously a good idea no matter what
- Two options for that: `logs.db` or a file in `user_dir()`
  - `logs.db` feels wrong - what happens if a user uses `llm -d other.db`? So `user_dir() / "plugins.txt"` it is
- Record the plugin package name alone or the version number too? Now we are getting into package freezing territory, at which point I'd rather leave users to figure it out themselves with `uv` or similar
- Worth recording a log of when each package was installed? Feels potentially useful and not very space consuming
- What do we do with this information? How do we use it to solve the usability problem of plugins vanishing on upgrade?
Assuming we have a record of the plugins the user has installed previously - with or without timestamps...
- We could check it on every call to `llm prompt` and show a warning (a one time only warning?) if some are missing
- We could have an `llm reinstall` or `llm plugins sync` or similar command for fixing the problem
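Putting the warning trigger and the repair command together, the logic might look something like this hypothetical sketch (`should_warn` and `reinstall_command` are illustrative names, not proposed API):

```python
import sys


def should_warn(recorded: set[str], installed: set[str], threshold: float = 0.9) -> bool:
    """Warn only when (nearly) all recorded plugins are gone at once.

    A single missing plugin probably means a deliberate uninstall; everything
    vanishing at once is the signature of a tool-level upgrade wiping the
    environment.
    """
    if not recorded:
        return False
    missing = recorded - installed
    return len(missing) / len(recorded) >= threshold


def reinstall_command(missing: set[str]) -> list[str]:
    """What an `llm reinstall` / `llm plugins sync` style command might run."""
    return [sys.executable, "-m", "pip", "install", *sorted(missing)]
```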