gpt4all
Support running on a Linux server with no GUI installed
Feature request
Support installation as a service on Ubuntu server with no GUI
Motivation
ubuntu@ip-172-31-9-24:~$ ./gpt4all-installer-linux.run
qt.qpa.xcb: could not connect to display
qt.qpa.plugin: Could not load the Qt platform plugin "xcb" in "" even though it was found.
This application failed to start because no Qt platform plugin could be initialized. Reinstalling the application may fix this problem.
Available platform plugins are: minimal, xcb.
Aborted (core dumped)
ubuntu@ip-172-31-9-24:~$
Your contribution
Not sure where to submit.
There already seems to be an implementation of a CLI, plus several bindings in different programming languages. I don't know all that much about the Linux installation, but these might already be included, even if you can't start the GUI itself. In the repository, see:
- CLI: https://github.com/nomic-ai/gpt4all/tree/main/gpt4all-bindings/cli
- bindings for C#, Go, Python, and TypeScript: https://github.com/nomic-ai/gpt4all/tree/main/gpt4all-bindings
If they aren't already included in the installer, you can always clone and compile yourself.
You will have to download the models separately, however; there should be links for that on the homepage (look for 'Model Explorer').
I also have a problem with installing this on a headless Ubuntu server. "error while loading shared libraries: libxcb-icccm.so.4:"
So I can't use the CLI bindings against something that can't even be installed. I don't see the point of a standalone chat client that only works for the one person who is logged in. I was expecting to be able to install it and run it as a server, then query that server with REST calls and get answers as responses.
I haven't tested this myself, but libxcb-icccm.so.4 should be part of the libxcb-icccm4 package on Ubuntu. Maybe see what this says:
apt show libxcb-icccm4
and if it isn't installed:
sudo apt-get install libxcb-icccm4
Edit: The REST API isn't finished yet, btw. See https://github.com/nomic-ai/gpt4all/tree/main/gpt4all-api
Thank you!!! This led me down a path of dependency hell: install libxcb-icccm4, then another dependency, install that, another dependency, install that... Finally I got to a point where I didn't know what else to install. By the sounds of it, the project just isn't quite there yet. :) Hopefully with continued development that will change.
I guess several things are in various states of completion at the moment.
But the one time I tried to run the provided Linux installer on a Linux Mint VM (based on Ubuntu 22.04), it worked right away. I don't know how they package their installer, but my assumption now is that it's on an Ubuntu 22.04 desktop system (or some container with a similar base).
Linux can be tricky because there's a lot of fragmentation.
@rmasci I've tried it myself now on an Ubuntu 22.04 Server virtual machine, completely fresh.
I see what you mean: it crashes even after installing all the dependencies, of which there are many:
libxcb-glx0 libx11-xcb1 libxcb-icccm4 libxcb-image0 libxcb-keysyms1 libxcb-randr0 libxcb-render-util0 libxcb-shape0 libxcb-sync1 libxcb-xfixes0 libxcb-xinerama0 libxcb-xkb1 libxkbcommon-x11-0 libgl1
So the installer is not suitable for an Ubuntu Server system. The solution is to compile it yourself:
sudo apt-get install build-essential cmake
git clone --recurse-submodules https://github.com/nomic-ai/gpt4all
# following the instructions here: https://github.com/nomic-ai/gpt4all/tree/main/gpt4all-bindings/python#local-build-instructions
cd gpt4all/gpt4all-backend
mkdir build && cd build
cmake ..
cmake --build . --parallel
ls *.so # should print 'libllmodel.so'
# python bindings:
cd ../../gpt4all-bindings/python
sudo apt-get install python3-pip
# pip install:
# '-e' means 'editable', so you can edit binding code here if desired; '.' means the package to install is in this folder
# I'd strongly recommend a '--user' installation; otherwise it needs sudo and clobbers your system packages
pip3 install --user -e .
# running the CLI
cd ../cli
pip3 install --user typer
python3 app.py repl
This should hopefully work for you. I stopped the CLI once it started downloading a model; that should be all that's left for it to run.
Thank you -- I'll give that a try
Sorry, there was a typo in those instructions; it's corrected now. It was:
mkdir build && cd build
cmake --build . --parallel
but it has to be:
mkdir build && cd build
cmake ..
cmake --build . --parallel
I hope the rest is correct. While I did recheck before posting here -- I copied things over by hand -- there's always a chance something goes missing.
Worked ok until the last line:
python3 app.py repl
Traceback (most recent call last):
File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 662, in urlopen
self._prepare_proxy(conn)
File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 950, in _prepare_proxy
conn.connect()
File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 366, in connect
self.sock = ssl_wrap_socket(
File "/usr/lib/python3/dist-packages/urllib3/util/ssl_.py", line 370, in ssl_wrap_socket
return context.wrap_socket(sock, server_hostname=server_hostname)
File "/usr/lib/python3.8/ssl.py", line 500, in wrap_socket
return self.sslsocket_class._create(
File "/usr/lib/python3.8/ssl.py", line 1040, in _create
self.do_handshake()
File "/usr/lib/python3.8/ssl.py", line 1309, in do_handshake
self._sslobj.do_handshake()
ssl.SSLCertVerificationError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1131)
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/lib/python3/dist-packages/requests/adapters.py", line 439, in send
resp = conn.urlopen(
File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 719, in urlopen
retries = retries.increment(
File "/usr/lib/python3/dist-packages/urllib3/util/retry.py", line 436, in increment
raise MaxRetryError(_pool, url, error or ResponseError(cause))
... more errors...
An SSL error? Which OpenSSL package does your system have? What version are the certificates?
I did test the things I've written on a pristine virtual machine (of course I did an update to the packages before that). Are you even on Ubuntu 22.04 or higher?
I need to use proxies to get out to the internet. Maybe that's why?
Yeah, I'm thinking you're not on 22.04 or higher. I just noticed from your error messages that your Python version is 3.8. In 22.04 it should be v3.10.
But if you don't want to tell me what version you're on / what versions your packages are, I can't really help.
Maybe try a 22.04 server instead. Or maybe have a look at this: #634
Unfortunately my system can't be upgraded to 22.04 -- THANK YOU very much for your assistance, I appreciate it.
Hi, thank you for the instructions. I followed them on an RPi4B (4 GB RAM, 12 GB swap) with Ubuntu 22.04.1 LTS, GCC 11.3, Python 3.10.6.
The build completed successfully, with a few warnings.
I've started the gpt4all Python CLI, but it failed when I submitted the first prompt, with the following error: TypeError: LLModel.prompt_model() got an unexpected keyword argument 'std_passthrough'
ubuntu@scipi:~/gpt4all/gpt4all-bindings/cli$ python3 app.py repl
100%|███████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 3.79G/3.79G [37:09<00:00, 1.70MiB/s]
Model downloaded at: /home/ubuntu/.cache/gpt4all/ggml-gpt4all-j-v1.3-groovy.bin
gptj_model_load: loading model from '/home/ubuntu/.cache/gpt4all/ggml-gpt4all-j-v1.3-groovy.bin' - please wait ...
gptj_model_load: n_vocab = 50400
gptj_model_load: n_ctx = 2048
gptj_model_load: n_embd = 4096
gptj_model_load: n_head = 16
gptj_model_load: n_layer = 28
gptj_model_load: n_rot = 64
gptj_model_load: f16 = 2
gptj_model_load: ggml ctx size = 5401.45 MB
gptj_model_load: kv self size = 896.00 MB
gptj_model_load: ................................... done
gptj_model_load: model size = 3609.38 MB / num tensors = 285
Using 4 threads
██████ ██████ ████████ ██ ██ █████ ██ ██
██ ██ ██ ██ ██ ██ ██ ██ ██ ██
██ ███ ██████ ██ ███████ ███████ ██ ██
██ ██ ██ ██ ██ ██ ██ ██ ██
██████ ██ ██ ██ ██ ██ ███████ ███████
Welcome to the GPT4All CLI! Version 0.1.0
Type /help for special commands.
⇢ /help
Special commands: /reset, /exit, /help and /clear
⇢ tell me about Elon Musk
Traceback (most recent call last):
File "/home/ubuntu/gpt4all/gpt4all-bindings/cli/app.py", line 119, in <module>
app()
File "/home/ubuntu/gpt4all/gpt4all-bindings/cli/app.py", line 89, in repl
full_response = gpt4all_instance.chat_completion(
File "/home/ubuntu/gpt4all/gpt4all-bindings/python/gpt4all/gpt4all.py", line 233, in chat_completion
response = self.model.prompt_model(full_prompt, streaming=streaming, **generate_kwargs)
TypeError: LLModel.prompt_model() got an unexpected keyword argument 'std_passthrough'
Edit 2023-06-20: This has been fixed with b66d0b4fffdbab6b67820fd7f1e1db8b779da8f5. The following is no longer necessary.
But it failed when I submitted the first prompt with the following error: TypeError: LLModel.prompt_model() got an unexpected keyword argument 'std_passthrough'
See: https://github.com/nomic-ai/gpt4all/issues/820#issuecomment-1582955946 and following comments, plus #910.
Thank you @cosmic-snow ! It is working now with your patch.