Installation borked after installing llm-gpt4all
Background / context
- I was originally following the install instructions for Mac at https://simonwillison.net/2023/Aug/1/llama-2-mac/ - yeah, I should have spotted that this was an older post... but I didn't
- MacBook Pro (Intel), macOS Sonoma 14.6.1
- Python 3.12.5
- Relative noob to Python, and I haven't been a dev for 25 years: it has been a few weeks of hacking with Claude-Dev, so this could be "user error"
What I did
- Installed llm via Homebrew -
brew install llm
- Installed the llm-llama-cpp plugin -
llm install llm-llama-cpp
- Installed the Python bindings -
llm install llama-cpp-python
- Ran a check of the installed models -
llm models
(success: got 11 listed, all OpenAI)
- Instead of downloading a specific model, I opted to install the gpt4all plugin
llm install llm-gpt4all
(I didn't do this in a virtual env btw, just at the command line in the terminal)
Among other things I got this from the terminal:
Successfully installed charset-normalizer-3.3.2 gpt4all-2.8.2 llm-gpt4all-0.4 requests-2.32.3 urllib3-2.2.2
- Ran the check of installed models AGAIN -
llm models
(but this time got the error below)
Traceback (most recent call last):
  File "/usr/local/bin/llm", line 5, in <module>
    from llm.cli import cli
  File "/usr/local/Cellar/llm/0.15/libexec/lib/python3.12/site-packages/llm/__init__.py", line 18, in <module>
    from .plugins import pm
  File "/usr/local/Cellar/llm/0.15/libexec/lib/python3.12/site-packages/llm/plugins.py", line 17, in <module>
    pm.load_setuptools_entrypoints("llm")
  File "/usr/local/Cellar/llm/0.15/libexec/lib/python3.12/site-packages/pluggy/_manager.py", line 421, in load_setuptools_entrypoints
    plugin = ep.load()
             ^^^^^^^^^
  File "/usr/local/Cellar/python@3.12/3.12.5/Frameworks/Python.framework/Versions/3.12/lib/python3.12/importlib/metadata/__init__.py", line 205, in load
    module = import_module(match.group('module'))
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/Cellar/python@3.12/3.12.5/Frameworks/Python.framework/Versions/3.12/lib/python3.12/importlib/__init__.py", line 90, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/Cellar/llm/0.15/libexec/lib/python3.12/site-packages/llm_gpt4all.py", line 1, in <module>
    from gpt4all import GPT4All as _GPT4All
  File "/usr/local/Cellar/llm/0.15/libexec/lib/python3.12/site-packages/gpt4all/__init__.py", line 1, in <module>
    from .gpt4all import CancellationError as CancellationError, Embed4All as Embed4All, GPT4All as GPT4All
  File "/usr/local/Cellar/llm/0.15/libexec/lib/python3.12/site-packages/gpt4all/gpt4all.py", line 23, in <module>
    from ._pyllmodel import (CancellationError as CancellationError, EmbCancelCallbackType, EmbedResult as EmbedResult,
  File "/usr/local/Cellar/llm/0.15/libexec/lib/python3.12/site-packages/gpt4all/_pyllmodel.py", line 34, in <module>
    if subprocess.run(
       ^^^^^^^^^^^^^^^
  File "/usr/local/Cellar/python@3.12/3.12.5/Frameworks/Python.framework/Versions/3.12/lib/python3.12/subprocess.py", line 571, in run
    raise CalledProcessError(retcode, process.args,
subprocess.CalledProcessError: Command '['sysctl', '-n', 'sysctl.proc_translated']' returned non-zero exit status 1.
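If I'm reading that traceback right, the crash isn't in llm itself: gpt4all's _pyllmodel.py runs sysctl -n sysctl.proc_translated at import time (I believe that's a Rosetta 2 check), and as far as I know that sysctl key doesn't exist on Intel Macs, so the command exits non-zero. You should be able to see the same failure by running it on its own (the exact error text below is my assumption):
sysctl -n sysctl.proc_translated
(on an Intel Mac this seems to print "sysctl: unknown oid 'sysctl.proc_translated'" and exit with status 1)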
No llm-related commands seem to work now, e.g. llm --help (I always get the same traceback error)
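One thing I haven't tried yet: I believe llm has an LLM_LOAD_PLUGINS environment variable that controls which plugins get loaded, so (assuming I've remembered the variable name correctly) something like this might skip the broken plugin long enough for commands to run:
LLM_LOAD_PLUGINS='' llm --help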
Next step will be to try uninstalling llm via Homebrew... but I've not gone there yet, and I have no idea if that will work anyway. I wanted to see if the community here could help first. :)
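One other idea, based purely on the paths in the traceback above: Homebrew seems to install llm into its own virtualenv under /usr/local/Cellar/llm/0.15/libexec, so maybe that environment's own pip could remove just the plugin without uninstalling llm entirely. Something like this (the path is my inference from the traceback, not verified):
/usr/local/Cellar/llm/0.15/libexec/bin/python -m pip uninstall llm-gpt4all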