alpaca.cpp
Not working on Windows 10
PS C:\Users\Администратор\Downloads\opera\alpaca.cpp> .\Release\chat.exe
main: seed = 1680110322
llama_model_load: loading model from 'ggml-alpaca-7b-q4.bin' - please wait ...
llama_model_load: ggml ctx size = 6065.34 MB
PS C:\Users\Администратор\Downloads\opera\alpaca.cpp>
Also not working on Windows 10, same as the previous comment. Here is the message from the command prompt:
C:\Users\batkoko\Desktop\CHAT_GPT_WINDOWS>chat.exe
main: seed = 1680155118
llama_model_load: loading model from 'ggml-alpaca-7b-q4.bin' - please wait ...
llama_model_load: failed to open 'ggml-alpaca-7b-q4.bin'
main: failed to load model from 'ggml-alpaca-7b-q4.bin'
The file ggml-alpaca-7b-q4.bin is missing. I don't know where to find it among all the links in the repo so I can place it in the folder and make this work. Please make a full Windows 10 release that works directly after downloading and unpacking into a folder.
7B: ggml-alpaca-7b-q4.bin
magnet:?xt=urn:btih:5aaceaec63b03e51a98f04fd5c42320b2a033010&dn=ggml-alpaca-7b-q4.bin&tr=udp%3A%2F%2Ftracker.opentrackr.org%3A1337%2Fannounce&tr=udp%3A%2F%2Fopentracker.i2p.rocks%3A6969%2Fannounce
13B: ggml-alpaca-13b-q4.bin
magnet:?xt=urn:btih:053b3d54d2e77ff020ebddf51dad681f2a651071&dn=ggml-alpaca-13b-q4.bin&tr=udp%3A%2F%2Ftracker.opentrackr.org%3A1337%2Fannounce&tr=udp%3A%2F%2Fopentracker.i2p.rocks%3A6969%2Fannounce&tr=udp%3A%2F%2Ftracker.openbittorrent.com%3A6969%2Fannounce&tr=udp%3A%2F%2F9.rarbg.com%3A2810%2Fannounce
.\Release\chat.exe -m ggml-alpaca-7b-q4.bin
.\Release\chat.exe -m ggml-alpaca-13b-q4.bin
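When chat.exe reports `failed to open 'ggml-alpaca-7b-q4.bin'`, the model file is usually either not in the working directory or the download was truncated. A quick sanity check you can run before blaming the program (a sketch; `check_model` is a hypothetical helper, and the ~4 GB expected size for the 7B q4 file comes from the log later in this thread):

```python
import os

def check_model(path, min_bytes=3_000_000_000):
    """Rough diagnosis of a ggml model file: is it present and plausibly complete?"""
    if not os.path.exists(path):
        return "missing - put the .bin next to chat.exe or pass its full path with -m"
    if os.path.getsize(path) < min_bytes:
        return "too small - the download is probably truncated, re-download it"
    return "ok"

print(check_model("ggml-alpaca-7b-q4.bin"))
```

If it prints "missing", either move the .bin into the folder you launch chat.exe from, or pass an absolute path with `-m`.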
Does this work online or offline, like ChatGPT? Or are these files only the model for the AI, with tasks/answers handled online? Nowhere does it say. I see that it is offline, but what exactly works offline is not clear...
karadevnet these are just the weights themselves (ggml-alpaca-7b-q4.bin and ggml-alpaca-13b-q4.bin) for running on the local machine. Put them in the project folder.
lagait >
PS C:\Users\Администратор\Downloads\opera\alpaca.cpp> .\Release\chat.exe
main: seed = 1680110322
llama_model_load: loading model from 'ggml-alpaca-7b-q4.bin' - please wait ...
llama_model_load: ggml ctx size = 6065.34 MB
PS C:\Users\Администратор\Downloads\opera\alpaca.cpp>

Perhaps you have the same problem that I was looking for a solution to a couple of days ago (see the link): https://github.com/antimatter15/alpaca.cpp/issues/160
OK, this is good, but can you make software that works ONLINE, with the answers shown in the console/command prompt, using a registered ChatGPT user account? Because if you can make it work offline, I am sure you can make it work online, with the answers on the local computer using an API key from the user account. Can you make that?
...can you make software...
My level is "user", not programmer.
This project may not be the tool you're looking for. It provides a chat system you can run locally on your own computer. It is not OpenAI's ChatGPT, and you do not need Internet access to use it once it is installed.
Many people have developed various interfaces for OpenAI's ChatGPT (for example, https://github.com/marcolardera/chatgpt-cli; search engines are your friend).
https://github.com/antimatter15/alpaca.cpp/issues/160 - doesn't work. .\Release\chat.exe -m ggml-alpaca-7b-q4.bin - doesn't work either.
######################################
PS C:\Users\Администратор\Downloads\opera\alpaca.cpp> .\Release\chat.exe -m ggml-alpaca-7b-q4.bin
main: seed = 1680259754
llama_model_load: loading model from 'ggml-alpaca-7b-q4.bin' - please wait ...
llama_model_load: ggml ctx size = 6065.34 MB
llama_model_load: memory_size = 2048.00 MB, n_mem = 65536
llama_model_load: loading model part 1/1 from 'ggml-alpaca-7b-q4.bin'
llama_model_load: .................................... done
llama_model_load: model size = 4017.27 MB / num tensors = 291
system_info: n_threads = 4 / 4 | AVX = 1 | AVX2 = 1 | AVX512 = 0 | FMA = 0 | NEON = 0 | ARM_FMA = 0 | F16C = 0 | FP16_VA = 0 | WASM_SIMD = 0 | BLAS = 0 | SSE3 = 0 | VSX = 0 |
main: interactive mode on.
sampling parameters: temp = 0.100000, top_k = 40, top_p = 0.950000, repeat_last_n = 64, repeat_penalty = 1.300000
PS C:\Users\Администратор\Downloads\opera\alpaca.cpp>
######################################
This works: https://github.com/SiemensSchuckert/alpaca.cpp
> #160 - doesn't work. .\Release\chat.exe -m ggml-alpaca-7b-q4.bin - doesn't work either.
It WORKS - look at your screenshot and compare it with mine in the thread you are referring to (https://github.com/antimatter15/alpaca.cpp/issues/160#issuecomment-1488201502). You have AVX = 1 and AVX2 = 1, meaning those instructions are enabled; I have them disabled (AVX = 0 and AVX2 = 0) by editing CMakeLists.txt. Until I did that, it didn't work for me either; the crashes stopped after the edit. To do this, you need to change the relevant lines (I attached the modified CMakeLists.txt and listed which parameters changed in my post in that same thread).
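As a rough sketch of what that CMakeLists.txt edit could look like (the exact option names vary between alpaca.cpp/llama.cpp versions, so search your copy of the file for "AVX" rather than copying this verbatim):

```cmake
# Hypothetical sketch: turn the AVX/AVX2 instruction sets off so the
# binary runs on CPUs where those instructions crash. Find the matching
# option() lines in your CMakeLists.txt and change ON to OFF.
option(LLAMA_AVX  "llama: enable AVX"  OFF)
option(LLAMA_AVX2 "llama: enable AVX2" OFF)
```

After changing the options, rebuild from a clean build directory so the compiler flags are regenerated (e.g. delete the old build folder, then run cmake and build the Release configuration again); CMake caches option values, so editing the file alone is not enough.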