dalai llama 7B crashed on first request
$ npx dalai serve
mkdir /home/user/dalai
Created custom directory: /home/user/dalai/config/prompts
Server running on http://localhost:3000/
> query: { method: 'installed', models: [] }
modelsPath /home/user/dalai/alpaca/models
{ modelFolders: [] }
modelsPath /home/user/dalai/llama/models
{ modelFolders: [ '7B' ] }
exists 7B
> query: {
seed: -1,
threads: 4,
n_predict: 200,
top_k: 40,
top_p: 0.9,
temp: 0.8,
repeat_last_n: 64,
repeat_penalty: 1.3,
debug: false,
models: [ 'llama.7B' ],
prompt: 'Who are you?',
id: 'TS-1683201361270-27393'
}
/home/user/.npm/_npx/3c737cbb02d79cc9/node_modules/dalai/index.js:219
let [Core, Model] = req.model.split(".")
^
TypeError: Cannot read properties of undefined (reading 'split')
at Dalai.query (/home/user/.npm/_npx/3c737cbb02d79cc9/node_modules/dalai/index.js:219:35)
at Socket.<anonymous> (/home/user/.npm/_npx/3c737cbb02d79cc9/node_modules/dalai/index.js:534:20)
at Socket.emit (node:events:511:28)
at Socket.emitUntyped (/home/user/.npm/_npx/3c737cbb02d79cc9/node_modules/socket.io/dist/typed-events.js:69:22)
at /home/user/.npm/_npx/3c737cbb02d79cc9/node_modules/socket.io/dist/socket.js:703:39
at process.processTicksAndRejections (node:internal/process/task_queues:77:11)
Node.js v20.0.0
More than a few people are seeing this. I tried both the Docker and npm versions, no joy. FYI, there are multiple tickets open for this issue; also see https://github.com/cocktailpeanut/dalai/issues/334. It looks like a fix was identified but not merged yet: https://github.com/cocktailpeanut/dalai/pull/348
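For anyone digging into the trace: the crash is at index.js:219, where the code calls req.model.split("."), but the query logged above carries a models array ([ 'llama.7B' ]) and no singular model field, so split runs on undefined. Below is a minimal sketch of the kind of guard that avoids the crash, reconstructed from the log and assuming the request shape shown there; it may or may not match what PR #348 actually does:

// index.js:219 area; hedged sketch, not the merged fix.
// Fall back to the first entry of the `models` array when `model` is absent.
const model = req.model ?? (Array.isArray(req.models) ? req.models[0] : undefined);
if (typeof model !== "string") {
  throw new Error(`query has no usable model field: ${JSON.stringify(req)}`);
}
let [Core, Model] = model.split(".");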
Is it related to this error somehow?

root@22d62ba0ce90:~/dalai/alpaca# /root/dalai/alpaca/main --seed -1 --threads 4 --n_predict 200 --model models/7B/ggml-model-q4_0.bin --top_k 40 --top_p 0.9 --temp 0.8 --repeat_last_n 64 --repeat_penalty 1.3 -p "The expected response for a highly intelligent chatbot to ">PROMPT" is ""
main: seed = 1683224155
llama_model_load: loading model from 'models/7B/ggml-model-q4_0.bin' - please wait ...
llama_model_load: invalid model file 'models/7B/ggml-model-q4_0.bin' (bad magic)
main: failed to load model from 'models/7B/ggml-model-q4_0.bin'
root@22d62ba0ce90:~/dalai/alpaca# exit
exit
The model file is corrupted or incompatible. Download a new file from here: https://huggingface.co/Pi3141/alpaca-native-7B-ggml/blob/397e872bf4c83f4c642317a5bf65ce84a105786e/ggml-model-q4_0.bin
Then replace the file at this path: alpaca/models/7B/ggml-model-q4_0.bin
Now you can try launching the server again: npx dalai serve
If that doesn't work, clean-install everything:

npx clear-npx-cache
npm cache clean --force
sudo apt-get update -y
sudo apt-get upgrade -y
node -v   # should be 18.16.0
source ~/.bashrc
npx dalai alpaca install 7B

(then replace the model file again)
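Separately, "bad magic" in the session above means the first four bytes of the .bin file don't match the header the bundled main binary expects. In llama.cpp's history the original magic was 0x67676d6c ("ggml"), and later file-format revisions switched to "ggmf"/"ggjt" headers, so a model converted for one revision fails to load on a binary built for another. A quick, hedged way to inspect the header before relaunching (plain Node, not part of dalai; adjust the path to your install):

// Print the model file's 4-byte magic, read as a little-endian uint32.
const fs = require("fs");
const fd = fs.openSync("models/7B/ggml-model-q4_0.bin", "r");
const buf = Buffer.alloc(4);
fs.readSync(fd, buf, 0, 4, 0);
fs.closeSync(fd);
console.log("magic: 0x" + buf.readUInt32LE(0).toString(16)); // 0x67676d6c is the original "ggml"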
What about the other models? Are they all "damaged" as well? Where can I download working LLaMA models?
I honestly don't know; I only have a ten-year-old laptop with 6 GB of RAM, and the only model I managed to get up and running was 7B, using the file from the Hugging Face site. You can try other model files from there: https://huggingface.co/Pi3141
Can confirm the latest Hugging Face Alpaca 7B model linked in @SpaceTimeEvent's comment above works perfectly fine 👍🏾
I'm having this issue too. Why are the models included with this package corrupt?
How do I delete my entire local install? I want to start over from scratch with a completely clean install.
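There's no official uninstall command that I know of, but based only on the paths visible in the logs in this thread (~/dalai for models and config, ~/.npm/_npx for the cached package), a hedged wipe looks like this; double-check the paths before running, and note that clearing the npx cache also drops other cached npx packages:

rm -rf ~/dalai          # models and config (the "mkdir /home/user/dalai" from the log above)
npx clear-npx-cache     # drops the cached dalai package under ~/.npm/_npx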
Clearing my browser's local storage worked for me.
I tried the solution in that PR (#348) and it still doesn't work for me.