
dalai llama 7B crashed on first request

Open Aleksei-Badyaev opened this issue 1 year ago • 10 comments

$ npx dalai serve
mkdir /home/user/dalai
Created custom directory: /home/user/dalai/config/prompts
Server running on http://localhost:3000/
> query: { method: 'installed', models: [] }
modelsPath /home/user/dalai/alpaca/models
{ modelFolders: [] }
modelsPath /home/user/dalai/llama/models
{ modelFolders: [ '7B' ] }
exists 7B
> query: {
  seed: -1,
  threads: 4,
  n_predict: 200,
  top_k: 40,
  top_p: 0.9,
  temp: 0.8,
  repeat_last_n: 64,
  repeat_penalty: 1.3,
  debug: false,
  models: [ 'llama.7B' ],
  prompt: 'Who are you?',
  id: 'TS-1683201361270-27393'
}
/home/user/.npm/_npx/3c737cbb02d79cc9/node_modules/dalai/index.js:219
    let [Core, Model] = req.model.split(".")
                                  ^

TypeError: Cannot read properties of undefined (reading 'split')
    at Dalai.query (/home/user/.npm/_npx/3c737cbb02d79cc9/node_modules/dalai/index.js:219:35)
    at Socket.<anonymous> (/home/user/.npm/_npx/3c737cbb02d79cc9/node_modules/dalai/index.js:534:20)
    at Socket.emit (node:events:511:28)
    at Socket.emitUntyped (/home/user/.npm/_npx/3c737cbb02d79cc9/node_modules/socket.io/dist/typed-events.js:69:22)
    at /home/user/.npm/_npx/3c737cbb02d79cc9/node_modules/socket.io/dist/socket.js:703:39
    at process.processTicksAndRejections (node:internal/process/task_queues:77:11)

Node.js v20.0.0
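The crash point is `Dalai.query`: the request the web UI sends carries a `models` array (visible in the query above), but line 219 reads `req.model`, which is undefined, so `.split(".")` throws. A minimal sketch of the idea behind the guard (`resolveModel` is a hypothetical helper for illustration, not the actual dalai code):

```javascript
// Sketch of a defensive guard: fall back from req.model to the first
// entry of req.models before splitting the "Core.Model" descriptor.
function resolveModel(req) {
  const model = req.model ?? (Array.isArray(req.models) ? req.models[0] : undefined);
  if (typeof model !== "string" || !model.includes(".")) {
    // Fail with a readable error instead of a TypeError deep in query()
    throw new Error(`invalid model descriptor: ${model}`);
  }
  const [core, name] = model.split(".");
  return { core, name }; // e.g. { core: "llama", name: "7B" }
}
```

With a guard like this, a request carrying only `models: [ 'llama.7B' ]` would still resolve instead of crashing the server.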

Aleksei-Badyaev avatar May 04 '23 12:05 Aleksei-Badyaev

More than a few people are seeing this. I tried both the Docker and npm versions, no joy. FYI, there are multiple tickets open for this issue; see also https://github.com/cocktailpeanut/dalai/issues/334. It looks like a fix was identified but not merged yet: https://github.com/cocktailpeanut/dalai/pull/348

joelduerksen avatar May 04 '23 15:05 joelduerksen

Is it related to this error somehow?

root@22d62ba0ce90:~/dalai/alpaca# /root/dalai/alpaca/main --seed -1 --threads 4 --n_predict 200 --model models/7B/ggml-model-q4_0.bin --top_k 40 --top_p 0.9 --temp 0.8 --repeat_last_n 64 --repeat_penalty 1.3 -p "The expected response for a highly intelligent chatbot to ">PROMPT" is ""
main: seed = 1683224155
llama_model_load: loading model from 'models/7B/ggml-model-q4_0.bin' - please wait ...
llama_model_load: invalid model file 'models/7B/ggml-model-q4_0.bin' (bad magic)
main: failed to load model from 'models/7B/ggml-model-q4_0.bin'
root@22d62ba0ce90:~/dalai/alpaca# exit
exit


kartik-madhak avatar May 04 '23 18:05 kartik-madhak

> Is it related to this error somehow? ... llama_model_load: invalid model file 'models/7B/ggml-model-q4_0.bin' (bad magic) ...

The model file is corrupted or incompatible. Download a fresh copy from here: https://huggingface.co/Pi3141/alpaca-native-7B-ggml/blob/397e872bf4c83f4c642317a5bf65ce84a105786e/ggml-model-q4_0.bin

Then replace the file in this path: alpaca/models/7B/ggml-model-q4_0.bin

Now you can try to launch the server again: npx dalai serve
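The download-and-replace steps above can be sketched in a shell like this (the `/resolve/` URL form, which serves the raw file instead of the HTML page, is an assumption about Hugging Face's URL scheme, and `~/dalai` is the install path from the logs above):

```shell
# Convert the Hugging Face "blob" page URL into a direct-download
# "resolve" URL, then fetch it over the broken model file.
blob_to_resolve() {
  printf '%s\n' "$1" | sed 's#/blob/#/resolve/#'
}

URL="https://huggingface.co/Pi3141/alpaca-native-7B-ggml/blob/397e872bf4c83f4c642317a5bf65ce84a105786e/ggml-model-q4_0.bin"

# Uncomment to actually download (several GB):
# curl -L -o ~/dalai/alpaca/models/7B/ggml-model-q4_0.bin "$(blob_to_resolve "$URL")"
```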

If that doesn't work, clean-install everything (then replace the model file again):

npx clear-npx-cache
npm cache clean --force
sudo apt-get update -y
sudo apt-get upgrade -y
node -v   # should be 18.16.0
source ~/.bashrc
npx dalai alpaca install 7B

SpaceTimeEvent avatar May 04 '23 22:05 SpaceTimeEvent

What about the other models? Are they all "damaged" as well? Where can I download working LLaMA models?

Aleksei-Badyaev avatar May 05 '23 07:05 Aleksei-Badyaev

What about the other models? Are they all "damaged" as well? Where can I download working LLaMA models?

I honestly don't know; I only have a 10-year-old laptop with 6 GB of RAM, and the only model I managed to get running was 7B, using the file from the Hugging Face website. You can try other model files from there: https://huggingface.co/Pi3141

SpaceTimeEvent avatar May 05 '23 09:05 SpaceTimeEvent

Can confirm the latest Hugging Face Alpaca 7B model linked in @SpaceTimeEvent's comment above works perfectly fine 👍🏾

ujj avatar May 08 '23 05:05 ujj

I'm having this issue too. Why are the models included with this package corrupt?

zuluana avatar May 09 '23 17:05 zuluana

How do I delete my local install entirely? I want to start over from scratch with a clean install.
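For a full reset, the pieces to remove are the dalai home directory and the cached npx copy of the package. A sketch, not official tooling; the paths are assumptions based on the logs earlier in this thread:

```shell
# clean_dalai: remove a local dalai install rooted at the given home
# directory, so the next `npx dalai ... install` starts from scratch.
clean_dalai() {
  home="$1"
  rm -rf "$home/dalai"       # config, prompts, downloaded models (can be tens of GB)
  rm -rf "$home/.npm/_npx"   # cached npx copy of the dalai package
}

# Usage: clean_dalai "$HOME"   then e.g.: npx dalai llama install 7B
```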

Grislard avatar May 09 '23 22:05 Grislard

Clearing my browser's local storage works for me.

koosty avatar May 14 '23 08:05 koosty

more than a few people are seeing this. I tried both the docker and npm versions, no joy. fyi, there are multiple tickets open for this issue. also see #334 looks like fix was identified, but not merged, yet #348

I tried the solution in that PR, but it still doesn't work for me.

xero-q avatar Jul 07 '23 13:07 xero-q