
No output is appearing from typing a prompt.

Open Njasa2k opened this issue 1 year ago • 5 comments

I've typed in a prompt, which runs exec: C:\Users\*****\dalai\alpaca\build\Release\main --seed -1 --threads 16 --n_predict 6942069 --model models/30B/ggml-model-q4_0.bin --top_k 420 --top_p 0.9 --temp 0.9 --repeat_last_n 64 --repeat_penalty 1.3 -p "hey" in C:\Users\ninja\dalai\alpaca, but nothing is coming out.

Njasa2k avatar Mar 22 '23 18:03 Njasa2k

Check Task Manager to see if it's using around 4 GB of memory. If it isn't, then you might not have a model in dalai\alpaca\models\7B.

If it is loading, then you have the same problem I have. Not sure how to fix it yet.

Tenetri avatar Mar 22 '23 22:03 Tenetri

It is using at least that much with the 30B parameter model.

Njasa2k avatar Mar 23 '23 01:03 Njasa2k

I had no output because my Python version was too low; I installed 3.10 and it seems to be fine now.

lydianb79 avatar Mar 23 '23 08:03 lydianb79

I'm using python 3.10.

Njasa2k avatar Mar 23 '23 17:03 Njasa2k

I get this when entering a prompt:

C:\Windows\system32>npx dalai serve
mkdir C:\Users\Adam\dalai
Server running on http://localhost:3000/

query: { method: 'installed', models: [] }
modelsPath C:\Users\Adam\dalai\alpaca\models { modelFolders: [ '7B' ] }
exists 7B
modelsPath C:\Users\Adam\dalai\llama\models { modelFolders: [] }
query: { method: 'installed', models: [] }
modelsPath C:\Users\Adam\dalai\alpaca\models { modelFolders: [ '7B' ] }
exists 7B
modelsPath C:\Users\Adam\dalai\llama\models { modelFolders: [] }
query: {
  seed: -1, threads: '8', n_predict: '2000', top_k: 40, top_p: 0.9,
  temp: '0.1', repeat_last_n: 64, repeat_penalty: 1.3, debug: false,
  models: [ 'alpaca.7B' ], prompt: 'why sun is yellow ?\n',
  id: 'TS-1679847334536-84896'
}
C:\Users\Adam\AppData\Roaming\npm\node_modules\dalai\index.js:219
let [Core, Model] = req.model.split(".")
                              ^

TypeError: Cannot read properties of undefined (reading 'split')
    at Dalai.query (C:\Users\Adam\AppData\Roaming\npm\node_modules\dalai\index.js:219:35)
    at Socket.<anonymous> (C:\Users\Adam\AppData\Roaming\npm\node_modules\dalai\index.js:534:20)
    at Socket.emit (node:events:513:28)
    at Socket.emitUntyped (C:\Users\Adam\AppData\Roaming\npm\node_modules\dalai\node_modules\socket.io\dist\typed-events.js:69:22)
    at C:\Users\Adam\AppData\Roaming\npm\node_modules\dalai\node_modules\socket.io\dist\socket.js:703:39
    at process.processTicksAndRejections (node:internal/process/task_queues:77:11)

Node.js v18.15.0

Can someone cast some light on this, please?

ramzeez88 avatar Mar 26 '23 16:03 ramzeez88
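The stack trace above shows index.js:219 reading `req.model` (singular), while the logged query only carries a `models` array (`[ 'alpaca.7B' ]`), so `req.model` is `undefined` when `.split(".")` is called. A minimal sketch of a defensive guard is below; this is illustrative only, not the actual dalai fix, and `resolveModel` is a hypothetical helper name.

```javascript
// Sketch of the failing line and a defensive workaround.
// The request shape mirrors the query object in the log above.
function resolveModel(req) {
  // index.js:219 does: let [Core, Model] = req.model.split(".")
  // but the logged query only carries models: [ 'alpaca.7B' ].
  const raw = req.model ?? (Array.isArray(req.models) ? req.models[0] : undefined);
  if (typeof raw !== "string" || !raw.includes(".")) {
    throw new Error(`no usable model in request: ${JSON.stringify(raw)}`);
  }
  const [core, model] = raw.split(".");
  return { core, model };
}

console.log(resolveModel({ models: ["alpaca.7B"], prompt: "why sun is yellow ?" }));
// → { core: 'alpaca', model: '7B' }
```

In other words the crash is a field-name mismatch between what the web UI sends and what the installed npm package expects, which is consistent with a stale browser page or a version mismatch; clearing site data (as suggested later in this thread) would force the UI back in sync.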

Do you guys have folders at

/home/username/dalai/llama/models

or

/home/username/dalai/alpaca/models

Apparently when it's working, people have a full-blown install in those directories. Mine don't even HAVE those directories unless I manually create them.

mapleroyal avatar Mar 26 '23 18:03 mapleroyal

> Do you guys have folders at
>
> /home/username/dalai/llama/models
>
> or
>
> /home/username/dalai/alpaca/models
>
> Apparently when it's working, people have a full blown install in those directories. Mine don't even HAVE those directories unless I manually create them.

Yes, I have a models folder in the alpaca directory, but it still doesn't produce output for me.

ramzeez88 avatar Mar 26 '23 18:03 ramzeez88

I don't know. I didn't have those folders, so I ran the stupid install command even though I already have the models, and it started building that directory out. I cancelled it once it started downloading the model and I just put my own stuff in there. Works perfectly now. 🤡🤡🤡

mapleroyal avatar Mar 26 '23 18:03 mapleroyal

> I don't know. I didn't have those folders, so I ran the stupid install command even though I already have the models, and it started building that directory out. I cancelled it once it started downloading the model and I just put my own stuff in there. Works perfectly now. 🤡🤡🤡

just did the same but still no output and errors in cmd.

ramzeez88 avatar Mar 26 '23 18:03 ramzeez88

I would highly recommend getting the official models, however you can, then verifying their b3sums (BLAKE3 checksums), and doing the conversion and quantization yourself.

Unlike this project's, the llama.cpp instructions actually work (assuming you have already installed all the prerequisites, which developers, sadly even the llama.cpp author, are really bad about spelling out). You run the commands and end up with quantized models that work. Then do what worked for me above: start the download in dalai, cancel it once it actually begins downloading the model, put your own models in those folders, clear the site data and cookies for dalai, then re-run it.

mapleroyal avatar Mar 26 '23 18:03 mapleroyal