dalai
No output is appearing from typing a prompt.
I've typed in a prompt
exec: C:\Users\*****\dalai\alpaca\build\Release\main --seed -1 --threads 16 --n_predict 6942069 --model models/30B/ggml-model-q4_0.bin --top_k 420 --top_p 0.9 --temp 0.9 --repeat_last_n 64 --repeat_penalty 1.3 -p "hey" in C:\Users\ninja\dalai\alpaca
but nothing is coming out.
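One way to debug a silent run like this is to paste the logged exec line straight into cmd and run main by hand, where any loader error prints to the console. As a small sketch (flag names are taken from the log above, values are illustrative, and this helper is not part of dalai), the command line can be rebuilt like so:

```javascript
// Rebuild the exec line dalai logs so it can be run by hand in cmd.
// Flag names come from the log above; the values here are illustrative.
function buildArgs(opts) {
  return [
    "--seed", String(opts.seed),
    "--threads", String(opts.threads),
    "--n_predict", String(opts.n_predict),
    "--model", opts.model,
    "-p", opts.prompt,
  ];
}

const args = buildArgs({
  seed: -1,
  threads: 16,
  n_predict: 512, // far saner than the 6942069 in the log above
  model: "models/30B/ggml-model-q4_0.bin",
  prompt: "hey",
});
console.log("main " + args.join(" "));
```

Running that reconstructed line directly in the alpaca folder should at least surface an error message instead of silence.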
Check Task Manager to see if it's using up 4 GB of memory. If it's not, then you might not have a model in dalai\alpaca\models\7B.
If it is loading, then you have the same problem I have. Not sure how to fix it yet.
It is using at least that much with the 30B parameter model.
I had no output because my Python version was too old; I installed 3.10 and it seems to be fine now.
I'm using python 3.10.
I get this when entering a prompt:
C:\Windows\system32>npx dalai serve
mkdir C:\Users\Adam\dalai
Server running on http://localhost:3000/
query: { method: 'installed', models: [] }
modelsPath C:\Users\Adam\dalai\alpaca\models
{ modelFolders: [ '7B' ] }
exists 7B
modelsPath C:\Users\Adam\dalai\llama\models
{ modelFolders: [] }
query: { method: 'installed', models: [] }
modelsPath C:\Users\Adam\dalai\alpaca\models
{ modelFolders: [ '7B' ] }
exists 7B
modelsPath C:\Users\Adam\dalai\llama\models
{ modelFolders: [] }
query: { seed: -1, threads: '8', n_predict: '2000', top_k: 40, top_p: 0.9, temp: '0.1', repeat_last_n: 64, repeat_penalty: 1.3, debug: false, models: [ 'alpaca.7B' ], prompt: 'why sun is yellow ?\n', id: 'TS-1679847334536-84896' }
C:\Users\Adam\AppData\Roaming\npm\node_modules\dalai\index.js:219
let [Core, Model] = req.model.split(".")
                              ^
TypeError: Cannot read properties of undefined (reading 'split')
at Dalai.query (C:\Users\Adam\AppData\Roaming\npm\node_modules\dalai\index.js:219:35)
at Socket.
Node.js v18.15.0
Can someone shed some light on this please?
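For the record, the TypeError above happens because the request carries a `models` array (`models: [ 'alpaca.7B' ]`) while index.js line 219 reads a singular `req.model`, which is `undefined`. A hedged sketch of a more defensive version (my suggestion, not the project's actual fix; the "core.size" naming like "alpaca.7B" is taken from the log above):

```javascript
// Fall back to the first entry of req.models when req.model is absent,
// and fail with a readable message instead of a TypeError.
function splitModel(req) {
  const model = req.model ?? (Array.isArray(req.models) ? req.models[0] : undefined);
  if (typeof model !== "string" || !model.includes(".")) {
    throw new Error("no model selected: " + JSON.stringify(model));
  }
  const [Core, Model] = model.split(".");
  return { Core, Model };
}

console.log(splitModel({ models: ["alpaca.7B"] })); // { Core: 'alpaca', Model: '7B' }
```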
Do you guys have folders at
/home/username/dalai/llama/models
or
/home/username/dalai/alpaca/models
Apparently when it's working, people have a full blown install in those directories. Mine don't even HAVE those directories unless I manually create them.
Yes, I have the models folder in the alpaca directory, but it still doesn't produce output for me.
I don't know. I didn't have those folders, so I ran the stupid install command even though I already have the models, and it started building that directory out. I cancelled it once it started downloading the model and I just put my own stuff in there. Works perfectly now. 🤡🤡🤡
Just did the same, but still no output, and errors in cmd.
I would highly recommend getting the official models, however you can, then checking their b3sums, and doing the converting and quantizing yourself.
Unlike this project, llama.cpp's instructions actually work (assuming you have already installed all the prerequisites, which sadly developers, even its author, are really bad about clarifying). You run the commands and end up with quantized models that work. Then do what I said worked for me above: start the download in dalai and cancel it when it starts downloading the model. Put your models in those folders, clear the website data and cookies for dalai, then re-run it.