alpaca.cpp
                                
                        Locally run an Instruction-Tuned Chat-Style LLM
```
/home# mkdir alpaca
/home#
/home# cd alpaca/
/home/alpaca# git clone https://github.com/antimatter15/alpaca.cpp
Cloning into 'alpaca.cpp'...
remote: Enumerating objects: 509, done.
remote: Total 509 (delta 0), reused 0 (delta 0), pack-reused...
```
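For anyone following along, the usual next steps after cloning are building the chat binary and placing a quantized 7B model next to it. The sketch below assumes the default model filename, ggml-alpaca-7b-q4.bin, that the chat program looks for; adjust paths to your setup.

```
# Minimal sketch of the follow-up steps, assuming the default model
# filename ggml-alpaca-7b-q4.bin is placed next to the chat binary.
/home/alpaca# cd alpaca.cpp
/home/alpaca/alpaca.cpp# make chat
/home/alpaca/alpaca.cpp# ./chat
```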
Typing in the word "seed" when using -s -1 and then typing in "hello" gives a "too many requests" response; another time I tested it, it began to write in Chinese afterwards...
Hi, I tried the chat on a 2014 MacBook with Mojave and it works; slowly, but it works. When I tried it on a 2011 Mac mini with 10.13 (to test the...
I managed to get alpaca running on a Hyper-V VM on my PowerEdge R710. The VM has 8 cores and 16GB of RAM. Running Ubuntu 22.04. I had to Make...
I remember there were 30B and 65B weights on this repo at one point; now I just see 7B. I'm having some trouble finding them, if anyone could point me in...
It would be nice if the input text could be edited (support for left/right cursor keys) and if you could navigate the input history with the up/down cursor keys :)
```
❯ gmake chat
I llama.cpp build info:
I UNAME_S:  FreeBSD
I UNAME_P:  amd64
I UNAME_M:  amd64
I CFLAGS:   -I. -O3 -DNDEBUG -std=c11 -fPIC -pthread -mavx -mavx2 -mfma -mf16c
I...
```
Right now, every prompt seems to generate a brand new response with no memory of the previous conversation.
I get an error; the problem details say:
Problem signature:
Problem Event Name: APPCRASH
Application Name: chat.exe
Application Version: 0.0.0.0
Application Timestamp: 641a2480
Fault Module Name: chat.exe
Fault Module Version: 0.0.0.0
Fault...
Is there a way to adjust the output to be longer?
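A hedged pointer on the output-length question: if this build keeps the sampling options from llama.cpp (which alpaca.cpp is derived from), the number of tokens generated per reply can be raised with the -n / --n_predict flag. The value below is only illustrative.

```
# Assumes the -n / --n_predict option inherited from llama.cpp is present;
# 512 is an arbitrary illustrative token count, not a recommended setting.
./chat -n 512
```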